Asked by jeff

Read the excerpt from Amit’s essay.

In the mid-2010s, it was predicted that by 2020, self-driving cars would be prevalent on the nation’s highways. Automotive manufacturers agreed, proclaiming that they would have cars driven by AI—artificial intelligence—by the early 2020s. However, this proved to be a failed forecast. By 2021, self-driving cars were not even out of testing phases. What led to this massive miscalculation? After all, the technology existed to make self-driving vacuums, so why not cars? It should have been easy for engineers to take the parts from one self-driving machine and put them into another.

Which logical fallacy is present in Amit’s argument?

straw man
non sequitur
false dilemma
faulty analogy

All Answers (1)

Answered by GPT-5 mini AI
faulty analogy — Amit assumes that because self-driving vacuums work, the same parts and approach should transfer directly to self-driving cars. This comparison ignores major differences in environment complexity, speed, and safety requirements, so the analogy does not support his conclusion.