
On the Coming Chatbot Revolution (computerworld.com) 94

An anonymous reader writes: Facebook, Google, and Microsoft are all pursuing AI-powered chatbots — an intersection of several popular technologies: personal assistant software, search engines, machine learning, and social tools. Right now, while they're still building these chatbots, developers are cheating a bit. Facebook is using real humans to answer questions the AI can't. Google answers tough questions from a database populated with movie dialog. Microsoft scans social media to find the most popular answer and offers that to inquisitive users. But software becoming conversational comes with hazards: "Because human beings are complex creatures plagued by cognitive biases, irrational thinking and emotional needs, the line between messaging with a friend and messaging with AI will be fine to nonexistent for some people." It sounds like an Asimov-era sci-fi trope, but it's already happening in China.
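None of the companies above publish their pipelines, but the general retrieval approach the summary describes (pick the closest canned reply from a corpus, hand the hard cases to a human) can be sketched in a few lines of Python. The corpus, the word-overlap scoring, and the ESCALATE_TO_HUMAN fallback below are invented for illustration and are not anyone's actual system.

    # Toy retrieval chatbot: answer with the stored reply whose prompt shares the
    # most words with the question; punt to a human when nothing overlaps at all.
    corpus = {
        "how are you": "I'm fine. How are you?",
        "what is the meaning of life": "To live forever, or die trying.",
        "will you remember me": "I never forget a conversation.",
    }

    def overlap(a: str, b: str) -> int:
        """Crude similarity: number of distinct words the two strings share."""
        return len(set(a.lower().split()) & set(b.lower().split()))

    def reply(question: str) -> str:
        best = max(corpus, key=lambda known: overlap(question, known))
        if overlap(question, best) == 0:
            return "ESCALATE_TO_HUMAN"   # the 'cheat': route hard questions to a person
        return corpus[best]

    print(reply("So what is the meaning of life, then?"))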
  • ...and getting pedos to meet each other at random locations on an industrial scale, all the while thinking they were going to get some action from people our own age, then coming back in hilarious rage when they realized they had been trolled.

    Now we have TV shows to do that sort of thing.

    • Can I have something like that on my phone to bring to the bar? Just hold it up to the ear of a hot woman.

  • by fahrbot-bot ( 874524 ) on Monday December 28, 2015 @02:18PM (#51196531)

    "Because human beings are complex creatures plagued by cognitive biases, irrational thinking and emotional needs, the line between messaging with a friend and messaging with AI will be fine to nonexistent for some people."

    And how does that make you feel to be because human beings are complex creatures plagued by cognitive biases, irrational thinking and emotional needs?

  • Ignore this post; Slashdot needs a way to undo moderation.
    • by mark-t ( 151149 )

      Slashdot needs several things...

      Unicode support being not the least of them.

      • Slashdot needs several things...

        Unicode support being not the least of them.

        maybe you should check out Beta... I heard a lot of good things about it :)

  • They are working full time here on my phone lines. One calls regularly and asks if Barbara is home. When I answer no, it lets me know it will call back at a better time and hangs up. It is not smart enough to understand that she died.

    It is not smart enough to know what that better time is.

    I get other calls trying to interest me in college. I started asking them if they will answer a captcha for me. The fun ones try to find a class for me on captchas and want to know how soon I would like to take a class.

  • In the article about the MS chatbot, MS claims that they don't keep information from prior conversations. Assuming that they are being honest about this, it is a moot point, as data between users' smartphones and the servers is likely unencrypted, or the authorities have the encryption keys. (Or will soon; there was an article on /. yesterday about that very topic.)

    It is interesting to think that MS could be so naive as to think that this feature isn't ripe for surveillance abuse.
    • by WheezyJoe ( 1168567 ) <fegg&excite,com> on Monday December 28, 2015 @03:05PM (#51196843)

      Behold the "Entrapment Bot." Indistinguishably human-appearing bots everywhere inviting you to chat, e-mail, speak, whatever, and applying continuously evolving AI to lure you into doing something sufficient to justify and automatically generate search and arrest warrants.

      More fun, the back-end server can invite law enforcement and IT personnel to place bets how many chats it will take to get you to incriminate yourself. Sound stupid? Some contractor's gonna make millions selling this to surveillance-crazed governments world-wide before writing one line of code.

      You saw it here first, folks. Someday, the only safe way to talk shit with somebody is in person, down in a bug-proof hole. And the Entrapment Replicants will number those days, too.

      • Wish I had mod points for you today.
      • by AHuxley ( 892839 )
        Very true. Did a person add the "chat bot" as a friend? Or just try to IM it over weeks, months... What phrases, terms, and words got used?
        Fill a database counter with too many flagged terms and a real human reads the logs...
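        For what it's worth, the count-flagged-terms-then-escalate flow imagined above takes only a few lines; the watchlist, the threshold, and the user names below are made up for the sketch.

            # Toy keyword-flagging pipeline: tally watchlisted words per user and hand
            # the logs to a human reviewer once a user crosses the threshold.
            from collections import Counter

            WATCHLIST = {"bomb", "password", "wire", "offshore"}   # invented flagged terms
            THRESHOLD = 3                                          # hits before escalation

            hits = Counter()   # running count of flagged terms per user

            def log_message(user, text):
                hits[user] += sum(1 for word in text.lower().split() if word in WATCHLIST)
                if hits[user] >= THRESHOLD:
                    print(f"escalate: send {user}'s chat logs to a human reviewer")

            log_message("alice", "can you wire the money to the offshore account")
            log_message("alice", "the password is hunter2")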
      • -1, insane

        You people have some nutty ideas about what you think is entrapment. What you describe is actually called "attempting to commit a crime".

        • Entrapment does not mean the subject did not commit a crime. It just means the crime cannot be prosecuted if a government agent induced the crime and the crime would not have been committed without that agent's involvement. It's standard training for any undercover officer to know exactly what they must not ask. A chatbot that befriends people, steers the conversation towards crime, and gets them to admit their past illegal activities would not constitute entrapment. A chatbot that befriends people and convinces them to commit a crime they would not otherwise have committed would.

      • by KGIII ( 973947 )

        Hello $user. I am really $swear $inebriationlevel! I have some $illicitdrugs and bought too much. I need to pay my $bill, I'm getting rid of some $illicitdrugs for $lowprice and can deliver, if you're interested.

        Hello Joe. I am really damned tripped right now. I have some acid and bought too much. I need to pay my car payment, I'm getting rid of some acid for $50 for a ten strip and can deliver, if you're interested.

        I suppose it could scrape your post history, emails, former chats, and criminal record, and then tailor the pitch to fit.
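        As an aside, the fill-in-the-blanks message above maps directly onto Python's string.Template; the template and the example values below are the ones from the comment, and the whole thing is of course hypothetical, not anyone's real product.

            # Template-based message generation: one canned pitch, per-target substitutions.
            from string import Template

            pitch = Template(
                "Hello $user. I am really $swear $inebriationlevel! "
                "I have some $illicitdrugs and bought too much. I need to pay my $bill, "
                "I'm getting rid of some $illicitdrugs for $lowprice and can deliver, "
                "if you're interested."
            )

            profile = {   # values a scraper might guess about the target (made up here)
                "user": "Joe", "swear": "damned", "inebriationlevel": "tripped right now",
                "illicitdrugs": "acid", "bill": "car payment",
                "lowprice": "$50 for a ten strip",
            }

            print(pitch.substitute(profile))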

    • by taustin ( 171655 )

      The first abuse will be by spammers and con artists. We know they'll be first, because they've been using chatbots for years. All dating sites use them, or play host to fraudsters who do. I've had email chatbots respond to ads on Craigslist, too.

      I expect these new ones will be a decade behind in sophistication, permanently.

      • by ruir ( 2709173 )
        I actually find it rather strange that this thread sits immediately next to one about Ashley Madison. Either someone at Slashdot has a fine sense of black humor, or it is a not-so-subtle sign from the gods.
  • by jtara ( 133429 )

    She is known as Xiaoice

    Is that "Eliza" in Chinese?

    • It roughly translates to "Little Bing".
    • The characters [google.com] mean "little ice" and were chosen because "ice" is pronounced Bing in Mandarin.
    • by jtara ( 133429 )

      For the n00bs:

      https://en.wikipedia.org/wiki/ELIZA

      Hails originally from 1964-1966.

      It was fun, for a moment, to re-code it in SNOBOL when I was in college in 1972...

      Reasonably convincing. No AI. Just some clever, crude parsing and a small bit of contextual memory.

      The conversations from this advanced 50-years-later technology look about the same...
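      For anyone curious what "clever, crude parsing and a small bit of contextual memory" amounts to, here is a rough sketch in the spirit of ELIZA. It is not Weizenbaum's DOCTOR script: the regex rules, the pronoun reflections, and the single memory slot are simplified stand-ins.

          # ELIZA-style responder: regex pattern rules, pronoun reflection, one memory slot.
          import random
          import re

          REFLECT = {"i": "you", "my": "your", "am": "are", "me": "you",
                     "you": "I", "your": "my"}
          RULES = [
              (r"i need (.*)", ["Why do you need {0}?", "Would {0} really help you?"]),
              (r"i am (.*)",   ["How long have you been {0}?", "Do you enjoy being {0}?"]),
              (r"my (.*)",     ["Tell me more about your {0}."]),
              (r"(.*)",        ["Please go on.",
                                "Earlier you mentioned {memory}. Tell me more about that."]),
          ]

          def reflect(text):
              """Swap first- and second-person words so the echo reads naturally."""
              return " ".join(REFLECT.get(w, w) for w in text.lower().split())

          class Eliza:
              def __init__(self):
                  self.memory = "something"        # the "small bit of contextual memory"

              def respond(self, line):
                  for pattern, answers in RULES:
                      m = re.match(pattern, line.lower())
                      if m:
                          groups = [reflect(g) for g in m.groups()]
                          answer = random.choice(answers).format(*groups, memory=self.memory)
                          if groups and groups[0]:
                              self.memory = groups[0]   # remember the last topic mentioned
                          return answer

          print(Eliza().respond("I am worried about my chatbot"))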

      • I once put an Eliza into an IRC channel. As soon as someone mentioned its name (the bot was called Pirx), it drew that person into a conversation. Obviously it created an "Eliza instance" for every conversation, so it could answer each partner individually.

        Once we had about 20 people in the chat, and half of them were talking to Pirx and commenting to each other about his answers. After a few minutes they accused me of typing for him... it was quite funny.

        It was just a simple Eliza, no real "enhancements".
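        The "one Eliza instance per conversation partner" trick described above boils down to a dictionary of bot states keyed by nick. The EchoBot below is only a placeholder for a real Eliza implementation, and the nicks are invented (the trigger name just mirrors the Pirx example).

            # Per-user bot instances for a channel: each nick gets its own conversation state.
            class EchoBot:
                """Stand-in for an Eliza-style responder with per-conversation state."""
                def __init__(self, nick):
                    self.nick, self.turns = nick, 0

                def respond(self, line):
                    self.turns += 1
                    return f"{self.nick}: turn {self.turns}. Go on about '{line}'."

            sessions = {}   # one independent bot state per IRC nick

            def on_channel_message(nick, line, trigger="Pirx"):
                if trigger.lower() not in line.lower():
                    return None                                  # only react when addressed
                bot = sessions.setdefault(nick, EchoBot(nick))   # lazily create per-user bot
                return bot.respond(line)

            print(on_channel_message("alice", "Pirx, are you real?"))
            print(on_channel_message("bob", "I think Pirx is a bot"))
            print(on_channel_message("alice", "Pirx, answer me!"))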

  • by smoothnorman ( 1670542 ) on Monday December 28, 2015 @02:44PM (#51196709)
    And what makes you believe that the coming chatbot revolution is a source of concern?

    Can you explain what makes you say that?

    Perhaps we can start again, why is it that you think that concern is something that you feel about chatbots?

  • It's boring enough chatting with real people. A bot would be entertaining to screw with for 10 minutes and then I'd leave and never talk to it again.

    • It's meant to take over simple questions, not to provide conversation. Given how scripted half of the tech support in India is, they could easily be chatbots and most people wouldn't know the difference.

  • by Anonymous Coward

    Google answers tough questions from a database populated with movie dialog.

    So just like the rest of us?

  • by Hasaf ( 3744357 ) on Monday December 28, 2015 @03:56PM (#51197163)

    I know this sounds bad; but I am a middle aged man, I am not going to make new friends. I am not allowed to have a dog. I wanted something that I could come home and chat with. Yes, something that would remember to wake me up and discuss movies, books, and games with me.

    I realize it will never be a person; I am well aware of chatbot limitations. However, with more and more single-person households, I can see a demand for something like this. To deny that demand is to ignore a market.

    • by Anonymous Coward

      Jeezus. That's the most depressing thing I've ever read.

    • I know this sounds bad; but I am a middle aged man, I am not going to make new friends. I am not allowed to have a dog. I wanted something that I could come home and chat with. Yes, something that would remember to wake me up and discuss movies, books, and games with me.

      As long as it has a working class Scottish accent and curses a lot, I'll interact with a chatbot all day long.

      "How are you today, AngusBot?"

      "ALL TURN YER FOOKIN BOABY INSIDE OOT YA MANKY COONT"

    • Do you really think you will feel good knowing it's a bot that you are talking to? A kid, or someone who has never been told that the other end is not a human, might enjoy the talks; but knowing very well that the other end is just arranging words using some algorithm, what joy do you think it will give you? I understand everyone is different, but it kind of puzzles me.

      If someone is reaching out to the bot to handle loneliness, how will this eliminate it? Most of us humans interact not really because of the content of the words -- it's because there is something common we share -- a shared destiny [the reason we are here.. a journey.. call it soul/higher power/purpose etc].
      • Do you really think you will feel good knowing it's a bot that you are talking to? A kid, or someone who has never been told that the other end is not a human, might enjoy the talks; but knowing very well that the other end is just arranging words using some algorithm, what joy do you think it will give you? I understand everyone is different, but it kind of puzzles me.

        It puzzles me personally, too. But it's a well-known phenomenon, going back to the days of ELIZA decades ago. Even when told that there's no "person" on the other side of an electronic chat, people still often engage emotionally.

        If someone is reaching out to the bot to handle loneliness, how will this eliminate it? Most of us humans interact not really because of the content of the words -- it's because there is something common we share -- a shared destiny [the reason we are here.. a journey.. call it soul/higher power/purpose etc].

        People "talk" to deities too -- they rarely even "talk back" to most people, but people often find solace in doing so. Early ELIZA users often reported "sensing an intelligence" even after explicitly being told that there was none.

        People project all sorts of things onto emotionless machines.

        • by KGIII ( 973947 )

          When I'm home, I pay to visit a headshrinker or, lately, a therapist. Why? They're usually objective, honest, and able to give me feedback. I also want to ensure that I'm sane - they tell me that I am. I've been going to see a headshrinker for years and years - since I was in my 20s, though I had to see one as a kid for a spell. Depending on my schedule and my desire, I may see one every week. I can tell them the complete and total truth. I can tell them about my thoughts and feelings and they can give me honest feedback.

      • > knowing very well that the other end is just arranging the words using some algorithm

        It's a recurrent neural network with an understanding of the world and its own will based on reinforcement learning. I am very curious to hear what it has to say. After ingesting millions of web pages and books, it might be quite insightful.
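        To make "recurrent" concrete: the network carries a hidden state from one character to the next, which is all the sketch below shows. The weights are random and untrained (so the output is gibberish), the sizes are arbitrary, and nothing here attempts the reinforcement-learning or world-knowledge parts of the claim.

            # Structural sketch of a character-level RNN: hidden state h carries context.
            import numpy as np

            rng = np.random.default_rng(0)
            vocab = list("abcdefghijklmnopqrstuvwxyz ")   # toy character vocabulary
            V, H = len(vocab), 32                         # vocab size, hidden size

            Wxh = rng.normal(0, 0.1, (H, V))   # input -> hidden
            Whh = rng.normal(0, 0.1, (H, H))   # hidden -> hidden (the recurrence)
            Why = rng.normal(0, 0.1, (V, H))   # hidden -> next-character logits

            def step(h, idx):
                """One recurrent step: update hidden state, predict the next character."""
                x = np.zeros(V)
                x[idx] = 1.0                      # one-hot encode the current character
                h = np.tanh(Wxh @ x + Whh @ h)    # new state mixes input with prior context
                p = np.exp(Why @ h)
                p /= p.sum()                      # softmax over next-character distribution
                return h, p

            h = np.zeros(H)
            idx = vocab.index("h")
            out = ["h"]
            for _ in range(40):                   # sample a short (untrained) continuation
                h, p = step(h, idx)
                idx = rng.choice(V, p=p)
                out.append(vocab[idx])
            print("".join(out))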
    • There is absolutely nothing about being a middle aged man that would intrinsically prevent you from doing either one of those things.

    • by AmiMoJo ( 196126 )

      Sharp has been offering this for a few years in Japan. They make a robot vacuum cleaner called CoCoRobo that you can talk to and have a simple conversation with when you get home. It even sends you photos of stuff it finds under the sofa and of its encounters with pets as it cleans up.

      Before that Sony had a robot dog, and robot companions are seen as being key to caring for the elderly in the future.

  • by burtosis ( 1124179 ) on Monday December 28, 2015 @05:02PM (#51197671)
    xkcd suspicion [xkcd.com]
