feat(zai): add Z.AI/GLM support via happy zai command #138
Open
Zorglub4242 wants to merge 2 commits into slopus:main from
Conversation
Add support for GLM (z.ai / BigModel.cn) API through a new `happy zai` subcommand, similar to the existing `happy gemini` and `happy codex` commands.

Features:
- `happy zai` - Start Claude with the GLM API endpoint
- `happy zai token set <key>` - Save the GLM API key to ~/.zai/config.json
- `happy zai token get` - Show the current API token
- `happy zai model set <model>` - Set the GLM model (glm-4.7, glm-4-plus, etc.)
- `happy zai model get` - Show the current model
- `happy zai base-url set <url>` - Set a custom API endpoint

Configuration is stored in ~/.zai/config.json with support for environment variable overrides (ZAI_AUTH_TOKEN, ZAI_BASE_URL, ZAI_MODEL).

Co-Authored-By: Claude <noreply@anthropic.com>
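The environment redirection this commit describes could be sketched roughly as below. The variable names (ANTHROPIC_BASE_URL, ANTHROPIC_AUTH_TOKEN, ANTHROPIC_MODEL) and the ZAI_* overrides come from the PR description; the helper name `buildZaiEnv` and its defaults wiring are illustrative assumptions, not this PR's actual code:

```typescript
// Illustrative sketch (not the PR's code): build the environment that
// redirects Claude to the GLM endpoint, letting ZAI_* variables override
// the stored configuration, as the commit message describes.
function buildZaiEnv(
  token: string,
  model: string = 'glm-4.7',
  baseUrl: string = 'https://open.bigmodel.cn/api/anthropic',
): Record<string, string | undefined> {
  return {
    ...process.env,
    // ZAI_* environment variables take precedence over stored config
    ANTHROPIC_BASE_URL: process.env.ZAI_BASE_URL ?? baseUrl,
    ANTHROPIC_AUTH_TOKEN: process.env.ZAI_AUTH_TOKEN ?? token,
    ANTHROPIC_MODEL: process.env.ZAI_MODEL ?? model,
  };
}
```

The resulting object would be passed as the `env` of the spawned claude process, since GLM's API is Anthropic-compatible.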
Fix argument passing for `happy zai` so that flags like `--yolo`, `--resume`, and `--dangerously-skip-permissions` are properly passed through to the underlying claude command.

- Add a claudeArgs parameter to the runZai function
- Parse and collect unknown arguments in the zai subcommand handler
- Pass claudeArgs through to runClaude

Co-Authored-By: Claude <noreply@anthropic.com>
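The pass-through fix in this commit could look roughly like the following; the function name `splitClaudeArgs` and the known-subcommand set are hypothetical stand-ins for the actual parsing in the zai subcommand handler:

```typescript
// Hypothetical sketch of the fix: separate arguments the zai subcommand
// understands from unknown flags, which are collected and forwarded to
// the underlying claude command untouched.
const ZAI_SUBCOMMANDS = new Set(['token', 'model', 'base-url']); // assumed list

function splitClaudeArgs(argv: string[]): { zaiArgs: string[]; claudeArgs: string[] } {
  const zaiArgs: string[] = [];
  const claudeArgs: string[] = [];
  for (const arg of argv) {
    if (ZAI_SUBCOMMANDS.has(arg)) {
      zaiArgs.push(arg);
    } else {
      // --yolo, --resume, --dangerously-skip-permissions, etc. pass through
      claudeArgs.push(arg);
    }
  }
  return { zaiArgs, claudeArgs };
}
```

runZai would then receive `claudeArgs` and hand it on to runClaude unchanged.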
Summary
Add support for GLM (z.ai / BigModel.cn) API through a new `happy zai` subcommand, similar to the existing `happy gemini` and `happy codex` commands.

What It Does

- `happy zai` - Start Claude with the GLM API endpoint (redirects to https://open.bigmodel.cn/api/anthropic)
- `happy zai token set <key>` - Save the GLM API key to ~/.zai/config.json
- `happy zai token get` - Show the current API token
- `happy zai model set <model>` - Set the GLM model (glm-4.7, glm-4-plus, glm-4-flash, glm-4-air, glm-4-flashx)
- `happy zai model get` - Show the current model
- `happy zai base-url set <url>` - Set a custom API endpoint

How It Works
GLM is Anthropic-compatible, so this implementation sets environment variables before launching Claude:

- `ANTHROPIC_BASE_URL` → https://open.bigmodel.cn/api/anthropic
- `ANTHROPIC_AUTH_TOKEN` → GLM API key
- `ANTHROPIC_MODEL` → glm-4.7 (or the specified model)

Configuration is stored in ~/.zai/config.json with support for environment variable overrides (ZAI_AUTH_TOKEN, ZAI_BASE_URL, ZAI_MODEL).

Files Changed
- src/zai/runZai.ts - New module for Z.AI/GLM support
- src/index.ts - Added zai subcommand handler with config subcommands

Usage Example
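A plausible session, assembled from the subcommands listed above (the API key and chosen model are placeholders):

```shell
# Save a GLM API key (written to ~/.zai/config.json)
happy zai token set sk-your-glm-key

# Optionally pick a model
happy zai model set glm-4.7

# Launch Claude against the GLM endpoint; unknown flags
# such as --yolo are forwarded to the underlying claude command
happy zai --yolo
```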