Isn't "context" just another word for "prompt?" Techniques have become more complex, but they're still just techniques for assembling the token sequences we feed to the transformer.
Almost. It's the current prompt plus the previous prompts and responses in the current conversation.
The idea behind "context engineering" is to help people understand that a prompt these days can be long, and can incorporate a whole bunch of useful things (examples, extra documentation, transcript summaries etc) to help get the desired response.
"Prompt engineering" was meant to mean this too, but the AI influencer crowd redefined it to mean "typing prompts into a chatbot".