Monday, January 17, 2011
So, what CAN'T a computer do better than a human brain?
Much is being made of a recent project from IBM called Watson*, a computer trivia program set to compete against top players on Jeopardy! in matches to be aired in the middle of next month. The machine beat top champions Ken Jennings and Brad Rutter in a short practice round that can already be viewed.
As a computer programmer and a former Jeopardy! champ, I am very impressed with this accomplishment.
It's nothing at all for a computer to have a massive amount of data in storage and the ability to access it at literal lightning speed. The great accomplishment here is the ability to understand a sentence in spoken English that has been made intentionally tricky by employing homonyms, puns and other word play. English has by far the largest vocabulary of any language in common use, by some counts roughly twice the size of any other. If the data set is correctly formatted, that doubling of size means searching for a word might take ten superfast decisions instead of nine.
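That last point follows from how a binary search scales: the number of comparisons grows with the logarithm of the dictionary size, so doubling the word list costs only one extra decision per lookup. Here is a minimal sketch of that arithmetic in Python, using purely illustrative vocabulary sizes rather than anything from Watson itself:

```python
import math

def worst_case_lookups(n):
    """Worst-case comparisons for a binary search over n sorted entries."""
    return max(1, math.ceil(math.log2(n)))

# Doubling the dictionary adds only one comparison per lookup.
for size in (512, 1_024, 500_000, 1_000_000):
    print(f"{size:>9} entries -> {worst_case_lookups(size)} comparisons")
# 512 -> 9, 1024 -> 10, 500000 -> 19, 1000000 -> 20
```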
But consider the word bat. Are we talking about the flying animal or the piece of wood used in baseball? Don't forget that it's also the name for the piece of wood used in cricket, and in British usage, a ping pong paddle becomes a table tennis bat. When combined with the adjective "old", it's probably not a flying animal or a piece of wood used to hit a ball, but instead a somewhat outdated piece of slang for a disagreeable older woman.
"He was disgusted when the old bat batted her eyes." There are two ways this can be disgusting, one much more likely than the other. How do you fill a computer with all the possible permutations of word play? What about an audio Daily Double or a video Daily Double? We are used to pulling the human voice out the many sounds in a song. How well can a computer do that?
*As you might have already guessed, the program Watson is named for Thomas J. Watson, who renamed the Computing-Tabulating-Recording Company to International Business Machines in 1924 and led the newly renamed firm as it dominated the tabulating machine industry for many decades to come.
IBM is building on a track record of making software that excels at brain games. Their greatest previous victory was Deep Blue, the chess program that defeated legendary world champion Garry Kasparov in a six-game match in 1997 with a record of two wins, one loss and three draws. When it comes to speed chess, computers have dominated humans for decades, but this match was played under the regulation time controls of top professional tournaments: two hours apiece for each player to complete 40 moves, and if the game is still undecided, the position is sealed and play resumes later with less time allotted. Kasparov wanted a rematch, IBM refused, and accusations of cheating were thrown around. Still, the final score was Deep Blue 3.5, Kasparov 2.5, and the match is still considered a watershed moment in the history of software development.
On the other hand, there are some brain games that do not have particularly good software opponents available. The most notable is the great Asian game known as Go in Japanese (Weiqi in Chinese and Baduk in Korean), whose rules are somewhat simpler than chess since pieces never move once placed on the board unless they are captured. Many fans of Go say it is so much more subtle than chess that there is no good algorithm for a computer to follow, but it seems more likely that the problem of making a strong computer opponent is that no one has yet put in the resources necessary to develop such a program. Go is a great game and very subtle, but it is nowhere near as subtle as understanding spoken English, and Watson has shown it can do that remarkably well.
As for great games that also involve the element of chance, some have been the focus of very intensive software research and others have not. A computer backgammon program beat World Champion Luigi Villa back in 1979, and thanks to software that analyzes positions exhaustively, some players now consider the game close to "solved": the best plays are known, and the result largely comes down to who rolls the luckier dice.
Bridge software, on the other hand, is still in its infancy. Again, I am willing to concede that bridge is more complex than backgammon, but I don't think bidding and playing bridge properly is as tough as understanding natural language, especially natural language where word play is the norm rather than the exception.