This built-in 'subjectivity' in the search mechanism fragments the commons the searchable Internet is supposed to be: sometimes I want to know what other people know, what they are looking at, what is popular or interesting for them.
Secondly, grouping searches around an assumption of my interests assumes that my interests are 1/ statistically quantifiable (solving a loathsome and boring problem may generate many queries), 2/ particular to me (I may be searching for someone else, or my computer may be shared with another person), and 3/ recoverable from clear text (sometimes I'm searching in a context that takes me to a binary, like a video, arbitrarily linked from a page (in the comments, for instance)).
Finally, isn't there a problem with diminishing returns here? The set that represents my interests will get 'smaller' in subject matter as I continue to search within that set.
I'll certainly be switching if Google's approximation of my interests starts operating under the radar, digging into cookies even when I'm 'signed out'.
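The diminishing-returns worry above can be sketched with a toy model (an illustration only, not Google's actual ranking): if each search is re-ranked toward topics you clicked before, the range of topics you ever see collapses quickly.

```python
# Toy model (an illustration, not Google's actual algorithm): each round,
# results are re-ranked toward topics the user clicked before, so the
# range of topics the user ever sees shrinks.
from collections import Counter

TOPICS = ["news", "code", "music", "games", "science"]

def personalized_rank(candidates, click_history):
    # Score each candidate by past clicks on its topic; Python's sort is
    # stable, so ties keep the original ("relevance") order.
    counts = Counter(click_history)
    return sorted(candidates, key=lambda t: -counts[t])

click_history = []
seen_topics = []
for _ in range(10):
    results = personalized_rank(TOPICS, click_history)
    top = results[0]            # the user clicks the top result
    click_history.append(top)
    seen_topics.append(top)

# After the very first click, the same topic wins every round:
# the 'set' that represents my interests has shrunk to one element.
print(seen_topics)
```

Real systems temper personalization with exploration precisely to avoid this collapse, but the direction of the feedback loop is the same.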
This is an example of a computer trying to be smart. The way it behaves changes over time with your interactions with it, but this modified behavior is particular to your connection. If you search on another machine elsewhere, you'll get different behavior, and you might not know why you aren't finding something you could find easily on your home machine. Smart computers frustrate users. Give me a dumb, predictable computer any day; then I can accurately predict how it'll respond to my input, and thus tailor my input for the exact response I want, every time.
Well said.
So often it's the geeks who are the real humanists - those who know enough about 'intelligent software' to be suspicious of it.
It's not just (suicidal) self-reflection but a potent mix of ignorance and laziness that steers us toward Vinge's Singularity.
I completely agree. The privacy aspects aside, this is Google making assumptions which are likely to be wrong. It's just like with their new fade-in homepage: they assume you're there to search, but when they're wrong, the end result is just frustrating.
Or perhaps Google's search mechanisms would determine that you want to know what other people know, and filter in more of those results instead of results similar to what you've already found.
A large number of installed systems work by fiat. That is, they work
by being declared to work.
-- Anatol Holt
Re: (Score:3, Insightful)
sometimes I want to know what other people know, what they are looking at, what is popular or interesting for them.
so do I, but they complained and now I'm not allowed within 100 yards of them :(
You know what they say about assumptions...