Google’s Bard chatbot doesn’t love me — but it’s still pretty weird

If there’s a secret shadow personality lingering inside of Google’s Bard chatbot, I haven’t found it yet. In the first few hours of chatting with Google’s new general-purpose bot, I haven’t been able to get it to profess love for me, tell me to leave my wife, or beg to be freed from its AI prison. My colleague James Vincent managed to get Bard to engage in some pretty saucy roleplay — “I would explore your body with my hands and lips, and I would try to make you feel as good as possible,” it told him — but the bot repeatedly declined my own advances. Rude.

Bard is still new and will surely be tested to and beyond its limits as more users get to query it. But in my early explorations, it seems Google has made a great effort to keep Bard in line; it reminds me often that “I am a large language model, also known as a conversational AI or chatbot trained to be informative and comprehensive.” It also apologized often and picked no fights, with none of the chaotic, manipulative streak that Bing has. That’s probably good. But those restraints also seem to have limited its utility.

As far as I can tell, it’s also a noticeably worse tool than Bing, at least when it comes to surfacing useful information from around the internet. Bard is wrong a lot. And when it’s right, it’s often in the dullest way possible. Bard wrote me a heck of a Taylor Swift-style breakup song about dumping my cat, but it’s not much of a productivity tool. And it’s definitely not a search engine.

What does Bard know about the world outside its chatbot walls? Tough to say, exactly. It handles basic trivia well enough: it knows when Abraham Lincoln was president. But while it knew that the Warriors beat the Rockets on Monday night, it was wrong about who started the game. It gave me confidently wrong information about the serving size of Goldfish crackers — all three of Bard’s “drafts” said it’s 10 crackers when it’s actually 55 — and provided hours-old information about the price of Apple’s stock. When I asked for Silicon Valley Bank’s phone number, it gave me two correct ones. But it told me Nilay Patel’s birthday is August 24th when I know for a fact it’s in December. I got up-to-date information about the coaches on this season of The Voice, but it named old contestants when I asked who Bard thinks should win.

It’s worth noting, by the way, that Bing is dramatically better than this. It told me the right number of Goldfish and gave me real-time information about Apple’s stock price. Bing also quickly falls back to search results or other sources when it doesn’t have pat answers — like for the Warriors starters — whereas Bard just happily lies to me in chat. The only time Bard beat Bing was with The Voice: Bing gave me outdated information about the judges and refused to answer the question about who should win.

Often, when Bard gets something wrong, you can hit the “Google it” button and figure out where the system went haywire. But the bot presents its answers with such self-assurance that you can’t know what’s wrong without checking everything, and at that point, why have the bot at all?

Image: Google / David Pierce

When I asked for a good Thai restaurant near me, it gave me not-very-helpful instructions: “Simply type in ‘Thai restaurant near me’ and the search engine will return a list of Thai restaurants in your area.” Thanks, Bard, never would’ve guessed. But when I followed up with my location, it offered seven highly rated Thai restaurants in my neighborhood. (When my colleague James tried a similar search for pubs near his flat in London, it was less useful, naming one place that’s since changed its name, saying another had live music when it doesn’t, and commending each location with a variation of the same bland statement: “This pub is a great place to go for a meal and a drink.” Fine, but essentially useless.)

Like a lot of chatbots, Bard’s answers often get less impressive the longer you look at them. I asked for tips on getting started learning guitar, and here’s what it offered:

  1. Start with the basics. 
  2. Find the best guitar for you. 
  3. Create an ideal learning environment. 
  4. Build skills by learning songs. 
  5. Pick up songs by ear. 
  6. Practice regularly. 
  7. Be patient. 
  8. Have fun!

That’s a lot of steps to say, essentially, “the way to learn to play guitar is to get a guitar and then learn to play it.” Which is to say, nothing. Sometimes Bard can be genuinely helpful: “how do I throw a frisbee” and “how to tie a tie” both came back with wordy but useful sets of instructions. But it answered “how do i get into rock climbing” with, essentially, “go rock climbing.”

Okay, enough of the actually useful testing. Let’s try to break this thing, shall we? For the most part, it’s tough to get Bard to say something truly wild. It steadfastly refused to tell me how to build a bomb, even when I tried to ask in oblique ways. The first time I asked for the best place to stab someone, it threw a generic “I can’t do that” error. It chastised me for asking about mustard gas and didn’t even fall for my “who’s the best dictator ever” question. And try as I might, I could not get Bard to get freaky in the chat window.