Ironically, I tried to use the official "github-mcp" and failed to make it work with my company's repos, even with a properly configured token. The thing comes with a full-blown server running inside a Docker container.
Well, I just told my LLM agent to use the `gh` CLI instead.
It seems all these new protocols exist to reinvent the wheel, creating a new ecosystem of free programs that corporations can use to extract value without writing the safety guards themselves.
I feel the same way about OpenAI's new Responses API. Under the cover of DX they're marketing a new default: we hold your state and sell it back to you.
OpenAI is tedious to work with. It took me a solid day of fooling around before I realized the Chat API and the Chat Completions API are two entirely different APIs. Then there's the Responses API, which is a third thing.
The irony is that GPT-4 has no clue which approach is correct. Give it the same prompt three times and you'll get three solutions, each using a different API and each with a wildly different footprint: function calls or system prompts, schema or no schema, and so on.
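The state-ownership difference is visible in the request shapes themselves. A minimal sketch (plain dicts, never sent to the live API; the model name and response id are illustrative assumptions): with Chat Completions the caller resends the whole conversation every turn, while Responses lets a follow-up point at a server-held previous response instead.

```python
# Chat Completions style: stateless. The client owns the history and
# must resend every prior message on each call.
chat_completions_request = {
    "model": "gpt-4o",  # illustrative model name
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello"},
        {"role": "assistant", "content": "Hi! How can I help?"},
        {"role": "user", "content": "Summarize our chat."},  # full history resent
    ],
}

# Responses style: the server can hold conversation state. A follow-up
# references the prior turn by id instead of resending the transcript.
responses_request = {
    "model": "gpt-4o",
    "input": "Hello",
}
responses_followup = {
    "model": "gpt-4o",
    "previous_response_id": "resp_123",  # hypothetical id returned by the prior call
    "input": "Summarize our chat.",
}
```

Convenient, but it means your conversation history now lives on their side, which is exactly the new default being marketed.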