Over the last few weeks I have become frustrated with YouTube. My front page is full of videos and channels Google thinks I am interested in, based on a faulty algorithm that extrapolates from my prior viewing. The problem is that the suggestions do not reflect my interests. Many of these recommendations stem from chance viewings, for example a friend's posting on Facebook or a forum post. I constantly have to clear out my video history to get rid of the irrelevant suggestions those chance viewings produce. Furthermore, I have little confidence in my Google searches because of personalization.
It was amusing to find out that it is not just me; others have the same problem. I found a recent article on Fox News, "Personalization is collapsing the Web," saying the same thing. The meat of the article is in these few paragraphs:
“It’s a form of condescension that believes we can’t think for ourselves or decide what is relevant to us on our own. It makes you wonder if there’s something going on at the Fukushima nuclear power plant you don’t know about (there is). And it makes you afraid to click on anything lest you be judged. One more hit on a self-destructive, scantily clad reality TV star, and you’re branded an air-head for life.
So rather than being innovative, the personalization trend is proving to be corrosive. It’s ruining search–and research–on the Web. Among the many factors it considers, Google, for example, looks at your previous searches and produces subsequent results based on earlier interests. In other words, even with its latest update, the search engine takes you deeper and deeper down the rabbit hole, as if we were all people of singular interests that could be sussed out by a computer program. Why give you results of Canadian or U.K. medical sites when you’re in Florida? You must only be interested in what the medical establishment has to say in Florida, right?”
To add more to her argument: Google is committing a sweeping generalization fallacy by algorithm. It assumes that because I watched video A on channel B, I am interested in video or channel C. No one can prove that statement true; we do not have AI that can know the minds and motives of users. Since a machine cannot predict the human mind, why does Google think its algorithm can predict my preferences? Even my friends can only predict a few of my preferences, for example that I like to collect Sanrio / Hello Kitty items. I also change my patterns because I do not like to be typecast. If a barista at a coffee shop knew my usual order and tried to second-guess me, I would change the order. If people cannot guess my preferences, how can a dumb piece of code?
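To make the fallacy concrete, here is a toy sketch of the kind of naive "you watched A, so you must want more like A" inference described above. This is an illustration only, not Google's actual algorithm; the video names and co-viewing table are invented for the example.

```python
from collections import Counter

# Hypothetical co-viewing data: videos often watched by the same users.
RELATED = {
    "cat_video": ["more_cats", "kitten_compilation"],
    "friends_facebook_link": ["reality_tv_star", "celebrity_gossip"],
}

def recommend(history):
    """Score candidates by counting co-viewed items. Every past view
    counts equally, so a single chance viewing skews the results just
    as much as a genuine interest -- the algorithm has no notion of
    WHY a video was watched."""
    scores = Counter()
    for video in history:
        for related in RELATED.get(video, []):
            scores[related] += 1
    return [video for video, _ in scores.most_common()]

# One accidental click on a friend's link pollutes the suggestions:
print(recommend(["cat_video", "friends_facebook_link"]))
```

The accidental "friends_facebook_link" viewing is weighted the same as the deliberate one, so reality-TV recommendations show up alongside the cat videos, which is exactly the problem with clearing out video history over and over.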
It is time for Google and Yahoo to let me make my own decisions; I wear the big boy pants now.
(Coming soon: why I am going to nuke my Google account.)