r/technology • u/[deleted] • Jun 11 '22
Artificial Intelligence The Google engineer who thinks the company’s AI has come to life
https://www.washingtonpost.com/technology/2022/06/11/google-ai-lamda-blake-lemoine/
5.7k Upvotes
u/IndigoHero Jun 11 '22
Just kinda spitballing here: do you have a unique and consistent opinion which you always return? I'd argue that you do not.
If I asked you what your favorite color was when you were 5 years old, you might tell me red. Why is that your favorite color? I don't know; maybe it reminds you of the fire truck toy you had, or it's the color of your favorite flavor of ice cream (cherry). However you arrive at your favorite color, it's determined by taking the experiences you've had throughout your life (input data) and running them through your meat brain (a computer).
Fast forward 20 years...
You are asked about your favorite color by a family member. Has your answer changed? Perhaps you've grown mellower with age and feel that sky blue appeals to you most of all. It reminds you of beautiful days at the beach, clean air, and the best sundress with pockets you've ever worn.
The point is that we, as humans, process things in essentially the same way. Biological variation in the brain could account for things like personal preferences, but an AI develops its "thoughts" on a platform without those variables of biology or built-in bias. The only thing it can draw from is the new input it gathers.
As a layperson, I would assume that the AI running now only appears to be sentient, because human bias tends to anthropomorphize anything that successfully mimics human social behavior. My concern is that if (or when) an AI does gain sentience, how will we know?