Dave
Member
"It's tough to make predictions, especially about the future." Yogi Berra
Posts: 4,335
Post by Dave on Mar 16, 2023 1:58:22 GMT -8
chinacat
Moderator
AAPL Long since 2006
Posts: 4,438
Post by chinacat on Mar 16, 2023 5:48:26 GMT -8
4aapl
Moderator
Posts: 3,867
Post by 4aapl on Mar 16, 2023 13:25:42 GMT -8
It's all about how it is programmed. With Siri using a database, it should roll over to a Google search (or another engine) if it can't help you; I thought that was how it was set up at one point. At one point I was enamored with a four-poster bed, and so was trying to search for one on craigslist. Well, "4" it said was too common, so it dropped that from the search. But "poster" also appears in the standard footer, I think where it now says "do NOT contact me with unsolicited services or offers". So it found every post containing "bed", which didn't really narrow down the bed search very much.

The problem is always what to do when you're passed input you don't expect. You ask "A or B?" and they say "yes". Or "elephant". Or "01010". Some of you know this a lot better than I do.

OTOH, if you're just trying to make a chatbot that does things when it can but sidesteps when it can't, that ability has been around for years if not decades. I think some of the online-psychologist examples date back that far. You say something like "I've always been scared of elephants", and it says "so tell me more about this" or "how does that make you feel" or "what do you think that means". It's easier, at least stereotypically, in that situation because the program is really just trying to get as much info from the person as possible. In that case and setup, it wanted the person to keep talking and to answer their own problems by coming to a conclusion from what they already knew or had thought about.

But the AI problem, especially in a higher-risk situation like higher-speed driving, is what to do when something doesn't fit what you expect to see. Traffic cones leading you over the double yellow is one current example. But so is falling snow.
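The "sidestep when it can't" trick can be sketched in a few lines; this is roughly how those old ELIZA-style psychologist bots worked (the patterns and canned replies below are just illustrative, not any particular program's actual rules):

```python
import re

# Keyword rules: if the input matches, reflect it back as a question.
RULES = [
    (re.compile(r"\bI(?:'ve| have) always been (\w+) of (\w+)", re.I),
     "So tell me more about being {0} of {1}."),
    (re.compile(r"\bI feel (\w+)", re.I),
     "Why do you feel {0}?"),
]

# Fallbacks: when nothing matches (e.g. "elephant", "01010"),
# sidestep with an open-ended prompt that keeps the person talking.
FALLBACKS = [
    "Tell me more about that.",
    "How does that make you feel?",
    "What do you think that means?",
]

def respond(text, turn=0):
    for pattern, template in RULES:
        m = pattern.search(text)
        if m:
            return template.format(*m.groups())
    return FALLBACKS[turn % len(FALLBACKS)]

print(respond("I've always been scared of elephants"))
# → So tell me more about being scared of elephants.
print(respond("elephant"))
# → Tell me more about that.
```

The design only has to never be caught without an answer; any input it can't parse gets deflected back to the user, which is exactly why this works for a therapist bot but not for a car deciding what snowflakes are.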