Various techniques can be applied to measure a machine's success in imitating human behavior. This is commonly known as the Turing test, developed by Alan Turing (1950). The effectiveness of bots can also be evaluated using a classical Turing test, in which the quantity and quality of a bot's output is measured and tested on a human test subject.
Social network services such as Facebook have introduced certain security techniques to distinguish bots from humans, such as CAPTCHAs (machine-unreadable pictures that can only be parsed by a human user) or the automatic detection of massive friending, fake profile names and other irregular behavior. The effectiveness of these measures is limited, as Huber, et al. (2009) conclude.
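One of the detection signals mentioned above, massive friending, amounts to a rate check over a sliding time window. Here is a minimal sketch; the one-hour window and the threshold of 30 requests are invented for illustration, and real platforms tune such parameters and combine many signals:

```python
from datetime import datetime, timedelta

def looks_like_massive_friending(request_times, window=timedelta(hours=1), threshold=30):
    """Flag an account if more than `threshold` friend requests fall
    into any sliding `window`. Parameters are illustrative assumptions."""
    times = sorted(request_times)
    start = 0
    for end, t in enumerate(times):
        # Shrink the window from the left until it spans at most `window`.
        while t - times[start] > window:
            start += 1
        if end - start + 1 > threshold:
            return True
    return False
```

A burst of 40 requests within a minute would be flagged, while 40 requests spread over 40 hours would not.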
Bots, also known as Web Robots or Internet Bots, are software programs used to carry out simple and repetitive tasks in place of human labor. The most widespread use of bots is in web site spidering or web site crawling, where these programs crawl and index web sites to create a map of the internet.
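The crawl-and-index loop described above can be sketched as a breadth-first traversal of the link graph. This is a hypothetical minimal sketch: the page limit and the regex-based link extraction are simplifying assumptions, and a production crawler would use a proper HTML parser and respect robots.txt:

```python
import re
from collections import deque
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

def extract_links(html, base_url):
    """Pull href targets out of a page and resolve them against the page's URL."""
    # A crude regex stands in for a real HTML parser here.
    return [urljoin(base_url, href) for href in re.findall(r'href="([^"]+)"', html)]

def crawl(seed, max_pages=10):
    """Breadth-first crawl from a seed URL, building a small map of the link graph."""
    seen, queue, graph = {seed}, deque([seed]), {}
    while queue and len(graph) < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue  # unreachable pages are simply skipped
        graph[url] = extract_links(html, url)
        for link in graph[url]:
            if urlparse(link).scheme in ("http", "https") and link not in seen:
                seen.add(link)
                queue.append(link)
    return graph
```

The returned dictionary maps each visited page to its outgoing links, which is exactly the "map of the internet" an indexer builds on.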
Bots have been part of the internet since its very beginning, and there is a growing number of bot types to be encountered nowadays. Apart from web crawlers, bots have been widely used as spambots to distribute spam emails, or as chatterbots (also called talk bots, chatterboxes or artificial conversational entities): bots that simulate human conversation, mainly in chat room or instant messaging environments, and intend to fool human users into thinking that the program is a human being.
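A chatterbot of the kind described above can be approximated with a small set of ELIZA-style pattern rules. The rules and response templates below are invented for illustration; real chatterbots carry far larger rule sets and conversational state:

```python
import random
import re

# A few ELIZA-style rules: (pattern, possible responses). Purely illustrative.
RULES = [
    (r"\bi am (.+)", ["Why do you say you are {0}?", "How long have you been {0}?"]),
    (r"\bi feel (.+)", ["What makes you feel {0}?"]),
    (r"\byou\b", ["We were talking about you, not me."]),
]
FALLBACKS = ["Tell me more.", "I see. Please go on."]

def reply(message):
    """Match the message against the rules and fill captures into a template."""
    text = message.lower()
    for pattern, answers in RULES:
        match = re.search(pattern, text)
        if match:
            return random.choice(answers).format(*match.groups())
    return random.choice(FALLBACKS)
```

Reflecting the user's own words back ("I am tired" → "Why do you say you are tired?") is the classic trick that makes such bots appear to understand the conversation.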
These days I have spent some time investigating social bots. These are computer bots, software programs designed to participate in human communication via social media.
The idea of bots as helpers or administrators is nothing new to information technology. Many of them are in use as chat bots observing the behavior of chat room users, for instance. But modern social bots go further.
Social bots differ from classic bots in that they try to trick other human users into believing that they are human. Another specialty is that they can not only participate in communication, but also actively shape the topology of their own social network by creating connections to other users (friending/following).
The quality of such bots is still very inconsistent, and many can easily be identified as bots by engaging them in a profound conversation.
If you have some programming knowledge you can easily create your own sophisticated bot using the Realboy project (http://ca.olin.edu/2008/realboy/), but there is also a simpler way to run your own Twitter bot through the website botize.com (http://www.botize.com/index.php?ln=en).
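At its core, a simple posting bot of the kind such services automate is just a loop that hands prepared messages to a posting function on a schedule. The `post_status` placeholder below is hypothetical; a real bot would call a Twitter client library there:

```python
import time

def post_status(text):
    # Hypothetical placeholder: a real bot would call a Twitter API client here.
    print(f"[bot] {text}")

def run_bot(messages, post=post_status, wait=time.sleep, interval=3600):
    """Hand each prepared message to `post`, pausing `interval` seconds in between."""
    posted = []
    for text in messages:
        post(text)
        posted.append(text)
        wait(interval)
    return posted
```

Injecting `post` and `wait` as parameters keeps the scheduling logic testable without a network connection or a real clock.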
Yesterday I listened to a great podcast about software development. After an interesting hour of talk radio, the host of the podcast Elementarfragen (in German only) asked his two guests about the future of computers and the Internet. This question made me think about what I would answer to such a difficult question.
The two guests of the show were wise enough to give a broad answer, focusing on the human-computer interface and the developments we are about to encounter in the field of speech recognition. They also said that many of the most important inventions have already happened, and that the next generation of advanced software will not be HAL (from 2001: A Space Odyssey), but only iOS 6.