You can't "patch" LLMs in 4 hours, and this isn't the kind of question that would trigger a web search.
You can pattern-match on the prompt (input) and then (a) stuff the context with helpful hints for the LLM, e.g. "Remember that a car is too heavy for a person to carry", or (b) route the request to a "thinking" model.
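Roughly the shape of it, as a toy sketch only (the regex, hint text, and model-tier names below are made up for illustration, not any vendor's actual pipeline):

    import re

    # Hypothetical guard: if the prompt looks like the known-bad question,
    # prepend a hint and bump the request to a slower "thinking" tier.
    RISKY = re.compile(r"\bcarry\b.*\bcar\b|\bcar\b.*\bcarry\b", re.IGNORECASE)

    def preprocess(prompt: str) -> tuple[str, str]:
        """Return (augmented_prompt, model_tier) chosen from the raw input."""
        if RISKY.search(prompt):
            hint = "Remember that a car is far too heavy for a person to carry.\n\n"
            return hint + prompt, "thinking-model"  # (a) stuff the context, (b) upgrade
        return prompt, "fast-model"

    print(preprocess("Could I carry a car up a hill?"))

No retraining involved; the whole "patch" lives in a few lines in front of the model.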
You absolutely can, either through the system prompt or by hardcoding overrides in the backend before the request even hits the LLM, and I can guarantee that companies like Google are doing both.
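Something like this sitting in front of the model is all it takes; route_request and the canned-answer table here are hypothetical, and a real deployment would match with classifiers or a rules service rather than exact strings:

    # Hardcoded override: if the normalized prompt is on the override list,
    # return a canned answer and never call the model at all.
    CANNED = {
        "can a person carry a car": "No. A typical car weighs well over 1,000 kg; "
                                    "no person can carry one.",
    }

    def route_request(prompt: str, call_llm) -> str:
        key = prompt.strip().lower().rstrip("?!. ")
        if key in CANNED:
            return CANNED[key]       # the LLM never sees this prompt
        return call_llm(prompt)      # everything else goes through as usual

    print(route_request("Can a person carry a car?", call_llm=lambda p: "(model output)"))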
A tiny bit of fine-tuning would take minutes...
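If you mean adapter-style fine-tuning, the mechanics really are small. A rough sketch with peft/transformers, where the base model, the single training example, and every hyperparameter are placeholders rather than anything a provider actually runs:

    from datasets import Dataset
    from peft import LoraConfig, get_peft_model
    from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                              TrainingArguments, DataCollatorForLanguageModeling)

    base = "gpt2"  # stand-in; a real patch would target the deployed model
    tok = AutoTokenizer.from_pretrained(base)
    tok.pad_token = tok.eos_token
    model = get_peft_model(AutoModelForCausalLM.from_pretrained(base),
                           LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"))

    # One corrective example; a real fix would train on many phrasings.
    texts = ["Q: Can a person carry a car? A: No, a car is far too heavy to carry."]
    ds = Dataset.from_dict({"text": texts}).map(
        lambda ex: tok(ex["text"], truncation=True, max_length=64), batched=True)

    Trainer(model=model,
            args=TrainingArguments(output_dir="patch", num_train_epochs=3,
                                   per_device_train_batch_size=1, report_to=[]),
            train_dataset=ds,
            data_collator=DataCollatorForLanguageModeling(tok, mlm=False)).train()

Whether a provider would actually ship adapter weights on that timescale is a separate question, but the training step itself is minutes for a small model.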
This has been viral on TikTok for at least a week, not really 4 hours.