To measure a machine's success in imitating human behavior, various techniques can be applied. The best known is the Turing test, developed by Alan Turing (1950). The effectiveness of bots can likewise be evaluated with a classical Turing test, in which the quantity and quality of a bot's output is measured and tested on a human subject.
Social network services such as Facebook have introduced security techniques to distinguish bots from humans, such as CAPTCHAs (machine-unreadable pictures that can only be parsed by a human user) or automatic detection of mass friending, fake profile names, and other irregular behavior. The effectiveness of these measures is limited, as Huber et al. (2009) conclude.
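To make the idea of "automatic detection of mass friending" concrete, here is a minimal sketch of one plausible heuristic: flag an account if it sends too many friend requests within a sliding time window. The function name, window size, and threshold are my own illustrative assumptions, not Facebook's actual detection logic.

```python
from datetime import timedelta

def flags_mass_friending(request_times, window=timedelta(hours=1), threshold=50):
    """Flag an account that sends more than `threshold` friend requests
    within any sliding time `window`. Values are illustrative guesses,
    not a real platform's thresholds."""
    times = sorted(request_times)
    start = 0
    for end, t in enumerate(times):
        # shrink the window from the left until it spans at most `window`
        while t - times[start] > window:
            start += 1
        if end - start + 1 > threshold:
            return True
    return False
```

A human clicking "add friend" a handful of times a day stays far below such a threshold, while a friending bot firing off dozens of requests per minute trips it immediately.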
At the time of writing, social bots have barely been examined. Little academic research can be found on the topic, and only a few design approaches to sophisticated social media bots can be identified. The few documented approaches have been published by researchers; it is reasonable to assume that other social bots exist that have not been identified yet.
We can identify three types of social bots.
The outcome of social media communication depends greatly on the design of the social media product one uses. Comparing just the two most popular services, Facebook and Twitter, immediately shows how different communication can become. Not only does the underlying social network have a different topology and build on different kinds of relationships, but the way we post comments also differs greatly.
Comments on Twitter are limited to a length of 140 characters. This limitation has historical reasons and derives from the character limit of SMS: in the beginning, Twitter was strongly focused on providing its service via SMS. Even today, the limitation remains very much a part of the service.
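Anyone posting programmatically has to deal with this constraint. As a sketch, a long message can be split into chunks that each fit within the limit, breaking at word boundaries; the helper below is my own illustration, not part of any Twitter client library.

```python
TWEET_LIMIT = 140  # Twitter's historical limit, inherited from the SMS era

def split_message(text, limit=TWEET_LIMIT):
    """Split a long message into chunks that each fit within `limit`
    characters, breaking at word boundaries where possible."""
    chunks, current = [], ""
    for word in text.split():
        candidate = (current + " " + word).strip()
        if len(candidate) <= limit:
            current = candidate
        else:
            if current:
                chunks.append(current)
            # hard-cut words longer than the limit (remainder dropped in this sketch)
            current = word[:limit]
    if current:
        chunks.append(current)
    return chunks
```

Bots that relay news feeds or long-form content into Twitter typically apply exactly this kind of chunking before posting.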
Bots, also known as web robots or internet bots, are software programs used to perform simple and repetitive tasks in place of human labor. The most widespread use of bots is web crawling (also called spidering), in which these programs crawl and index websites to create a map of the internet.
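The crawling idea boils down to a graph traversal: start from one page, collect the pages it links to, and repeat. The sketch below shows that core loop; `get_links` stands in for the part of a real crawler that would fetch a page over HTTP and parse its HTML for links.

```python
from collections import deque

def crawl(start, get_links, max_pages=100):
    """Breadth-first crawl over a link graph.

    `get_links(url)` is assumed to return the URLs a page links to;
    in a real crawler it would fetch and parse the page. Returns the
    set of pages discovered, i.e. the crawler's "map" of the sites.
    """
    seen = {start}
    queue = deque([start])
    while queue:
        url = queue.popleft()
        for link in get_links(url):
            if link not in seen and len(seen) < max_pages:
                seen.add(link)
                queue.append(link)
    return seen
```

Search engines run essentially this loop at enormous scale, with politeness rules (robots.txt, rate limits) layered on top.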
Bots have been part of the internet since its very beginning, and a growing number of bot types can be encountered nowadays. Apart from web crawlers, bots are widely used as spambots to distribute spam emails, or as chatterbots (also called talk bots, chatterboxes, or artificial conversational entities): bots that simulate human conversation, mainly in chat rooms or instant messaging environments, and intend to fool human users into thinking that the program is a human being.
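Classic chatterbots in the tradition of ELIZA work with surprisingly simple machinery: a list of patterns matched against the user's input, each paired with a response template that reflects part of the input back. The rules below are purely illustrative, not taken from any real system.

```python
import re

# ELIZA-style rules: (pattern, response template). The template reuses
# the captured fragment of the user's message, which creates the
# illusion of understanding.
RULES = [
    (re.compile(r"\bi am (.+)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"\bi feel (.+)", re.I), "What makes you feel {0}?"),
    (re.compile(r"\byes\b", re.I), "I see. Can you elaborate?"),
]

def reply(message):
    """Return the first matching rule's response, or a generic fallback."""
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            return template.format(*match.groups())
    return "Tell me more."
```

Even this handful of rules can keep an unsuspecting chat partner talking for a while, which is exactly what makes chatterbots effective at deception.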
These days I spent some time investigating social bots: computer bots, i.e. software programs, that are designed to participate in human communication via social media.
The idea of bots as helpers or administrators is nothing new in information technology. Many are in use as chat bots observing the behavior of chat room users, for instance. But these modern social bots are more than that.
Social bots differ from classic bots in that they try to trick other human users into believing that they are human. Another specialty is that they not only participate in communication but also actively shape the topology of their own social network by creating connections to other users (friending/following).
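A sketch of what "actively shaping the topology" might look like in code: the bot samples accounts that follow its existing contacts and follows some of them, hoping for follow-backs. Everything here is hypothetical, including the `get_followers` and `follow` callables, which stand in for a real social media API.

```python
import random

def grow_network(bot, get_followers, follow, batch_size=5):
    """Illustrative sketch of a social bot growing its network.

    `bot` is a dict with a "name" and a "following" list;
    `get_followers(user)` and `follow(bot, user)` are hypothetical
    stand-ins for real platform API calls.
    """
    # gather second-degree contacts: followers of the bot's contacts
    candidates = set()
    for contact in bot["following"]:
        candidates.update(get_followers(contact))
    # never re-follow existing contacts or the bot itself
    candidates -= set(bot["following"]) | {bot["name"]}
    # follow a small random batch to avoid obvious mass-friending patterns
    for user in random.sample(sorted(candidates), min(batch_size, len(candidates))):
        follow(bot, user)
        bot["following"].append(user)
    return bot
```

The small `batch_size` is the interesting design choice: a bot that friends slowly and selectively is far harder to catch with the rate-based detection heuristics platforms use.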
The quality of such bots is still very inconsistent, and many can easily be identified as bots by engaging them in a profound conversation.
If you have some programming knowledge, you can create your own sophisticated bot using Realboy (http://ca.olin.edu/2008/realboy/), but there is also a simpler way to maintain your own Twitter bot through the website botize.com (http://www.botize.com/index.php?ln=en).