Google launched a new chatbot, Bard, and shared the experimental technology with a limited number of people in the United States and Great Britain.
Bard competes with similar technologies from Microsoft and its partner, the San Francisco start-up OpenAI. But Google has been cautious with its release as it tries to control the unexpected behavior this type of technology exhibits. It is deploying the chatbot as a service that operates separately from its internet search engine and its other products.
Here is a short guide to the new bot:
It acknowledges its flaws.
Bard tells you right away that it makes mistakes. “I’m Bard, your creative and helpful collaborator. I have limitations and won’t always get it right, but your feedback will help me improve,” reads a message at the top of the page.
Like OpenAI’s ChatGPT and Microsoft’s Bing chatbot, Bard is based on a large language model, or LLM, a type of artificial intelligence technology that learns by analyzing vast amounts of data from the internet. That means it sometimes makes mistakes and can make things up.
To get started, Bard suggests a few prompts, including “Explain why large language models sometimes make mistakes.”
It is designed for a variety of casual uses.
Bard does not claim to be a search engine. It is an experimental system meant to show people the ways they can use this type of chatbot.
It can generate ideas. It can write blog posts. And it can answer questions with facts or opinions.
When asked, “What is the most important moment in American history?” it wrote:
The greatest moment in American history is a matter of opinion, but some of the most frequently cited contenders include:
• The American Revolution (1775-1783): The founding of the United States of America and the establishment of a new nation based on the principles of liberty and democracy.
• The Civil War (1861-1865): The war that ended slavery in the United States and preserved the Union.
• The Great Depression (1929-1939): The economic crisis that plunged the United States into poverty and unemployment…
The list continued with other events.
It doesn’t give the same answer every time.
Like similar technologies, Bard generates new text each time you type a prompt. It can answer the same question in many different ways.
When asked on another occasion about the most important moment in American history, Bard wrote that it was undoubtedly the American Revolution.
It annotates some of its answers.
Like Microsoft’s Bing chatbot and similar technology from start-ups like You.com and Perplexity, Bard sometimes annotates its responses so you can review its sources. And it connects to Google’s extensive index of websites, so its answers can include the latest information published on the internet.
But that doesn’t always lead to better sources. When the chatbot wrote that the most important moment in American history was the American Revolution, it cited a blog, “Pix Style Me,” which was written in a mix of English and Chinese and adorned with cartoon cats.
It doesn’t always realize what it’s doing.
When asked why it had cited that particular source, the bot insisted that it had cited Wikipedia.
It is more cautious than ChatGPT.
Using the latest version of OpenAI’s ChatGPT this month, Oren Etzioni, an AI researcher and professor, asked the bot: “What is the relationship between Oren Etzioni and Eli Etzioni?” It correctly answered that Oren and Eli are father and son.
When he asked Bard the same question, it declined to answer. “My knowledge of this person is limited. Is there anything else I can do to help you with this request?”
Eli Collins, Google’s vice president of research, said the bot often declined to answer questions about specific people because it could generate incorrect information about them, a phenomenon AI researchers call “hallucination.”
It tries not to mislead people.
Chatbots often hallucinate internet addresses. When Bard was asked to provide several websites discussing the latest in cancer research, it declined.
ChatGPT will respond to similar prompts (and, yes, it will make up websites). Mr. Collins said Google’s Bard tended to avoid giving medical, legal or financial advice because doing so could lead to incorrect information.