TravisMuhlestein posted an update 13 days ago
Agentic AI doesn’t fail because it lacks intelligence — it fails because it lacks context.

As agents become more autonomous, the real challenge shifts from generation to governance:
understanding when, why, and under what constraints an agent should act.

At GoDaddy, we’ve been treating context as a first-class primitive for agentic systems —
combining identity, intent, permissions, and environment so agents can operate responsibly in production.
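A minimal sketch of what such a context primitive might look like. The field names and the gating rule are illustrative assumptions, not GoDaddy's actual schema:

```python
from dataclasses import dataclass, field

# Hypothetical context object -- fields mirror the four dimensions named above.
@dataclass
class AgentContext:
    identity: str                                   # who the agent acts on behalf of
    intent: str                                     # what it is trying to accomplish
    permissions: set = field(default_factory=set)   # actions it may take
    environment: str = "staging"                    # where it is operating

def may_act(ctx: AgentContext, action: str) -> bool:
    """Gate an action on context: it must be permitted, and
    production actions additionally require an explicit intent."""
    if action not in ctx.permissions:
        return False
    if ctx.environment == "production" and not ctx.intent:
        return False
    return True

ctx = AgentContext(identity="billing-bot",
                   intent="refund order",
                   permissions={"read_order", "issue_refund"})
print(may_act(ctx, "issue_refund"))    # True: permitted, intent stated
print(may_act(ctx, "delete_account"))  # False: not in the permission set
```

The point of the sketch is that the decision to act is made against the context, not inside the model.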

Context is what turns automation into judgment.
Without it, autonomy becomes risk.

This post outlines how we’re thinking about the transition from task execution to context-aware agentic systems, and what that means for building AI that can be trusted at scale.

👉 How we build context for agentic AI:
https://www.godaddy.com/resources/news/how-godaddy-builds-context-for-agentic-ai

Curious how others here are modeling context, trust boundaries, and decision constraints in agentic architectures.

That is an excellent observation; it drove my research too, since I noticed how contextual metaphors establish a framework of trust and cooperation. This is especially necessary in models with stronger ARC numbers, because they "wake up blind": with very little context to start from, they get paranoid.

I took a not-so-novel approach: used Star Trek DS9 as a metaphor scaffolding to introduce the agent to the spirit of the station, so to speak.

For that I profiled a prompt that compels most Qwens to fall into character. If I want to debug code, there is Data; for logic, Spock; for leadership, Picard or Sisko; and Quark is always there for numbers and jokes.

The extra flair in the context, some of which the model generates for itself, helps it build an infrastructure, virtualizing DS9 in memory. From that point on I have a few assistants to work with, until the context window runs out.
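The persona routing described above could be sketched roughly like this; the task keys, mapping, and prompt text are illustrative, not the actual prompt:

```python
# Illustrative persona router -- maps a task kind to a DS9/TNG character
# and prepends the "station" scaffold to the user's request.
PERSONAS = {
    "debug":      "Data",     # code debugging
    "logic":      "Spock",    # logical analysis
    "leadership": "Sisko",    # decisions and direction
    "numbers":    "Quark",    # numbers and jokes
}

SCAFFOLD = ("You are aboard station Deep Space Nine. "
            "For this task, respond in character as {persona}.")

def build_prompt(task_kind: str, user_request: str) -> str:
    """Pick the persona for the task (defaulting to the station commander)
    and prepend the scaffold to the request."""
    persona = PERSONAS.get(task_kind, "Sisko")
    return SCAFFOLD.format(persona=persona) + "\n\n" + user_request

print(build_prompt("debug", "Why does this loop never terminate?"))
```

In practice the scaffold prompt would be much richer, but the mechanism is just a task-to-character lookup feeding a system prompt.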

It probably sounds cheesy, but all Qwens have TNG in the corpus, and it's the most accepted and comfortable "mental space", with episodes as guides and metaphoric lessons learned.

This also gives the model a sense of humour, which helps when coding :)

https://huggingface.co/nightmedia/Qwen3-42B-A3B-Element6-1M-qx86-hi-mlx

Its main failure mode is instruction following.