Confusing Terms: AI's False Cognates with Other Fields
False Cognates
In foreign languages, there are cognates: words that look similar and mean the same thing. Think “house” in English and “Haus” in German. Then there are false cognates, words that seem similar but mean very different things. For example, “Gift” in German means “poison.”
In generative artificial intelligence (GenAI), certain popular terms overlap with terminology in other fields. Fish don’t know they’re swimming in water. Likewise, GenAI specialists often interact with people in other fields without realizing that terms familiar to them are causing confusion because those terms carry different meanings in another field.
False Cognates in Generative AI
Some common terms that might cause confusion include:
- In general: “local” meaning from a nearby area v. “local” meaning an AI model that can run on your own computer.
- Chemistry, Economics, Acting, Publishing, Real Estate: the AI term “agent” clashes with several fields’ terms, including:
  - “chemical agents.”
  - an economic “agent,” as in the “principal-agent problem.”
  - an “agent” representing actors or writers.
  - a real estate agent, such as a Realtor.
- Law:
  - the Master of Laws (LLM) degree clashes with large language model (LLM).
  - “inference” of fact v. the process of running an AI model.
- Finance:
  - anti-money laundering (AML) sounds similar, especially spoken aloud, to artificial intelligence/machine learning (AI/ML).
  - “model” (in the context of model risk management) v. “model” (like “GPT-5” or “Gemini 2.5 Flash”).
  - “token” as in cryptocurrency v. the unit of meaning in an LLM.
Remember to use the full AI term for clarity when talking to people from another specialty. For example, say “large language model,” “image generation model,” or “voice recognition model” rather than simply “the model.” Say “AI agents” instead of simply “agents” to avoid confusion with the myriad other meanings of “agent.”