Compare commits

35 Commits

- 136d1605c0
- 9da3125c34
- daa11ca440
- e2bdffb41e
- ddd6951610
- 8ff62d8f11
- 6a2db542c3
- d49fa9ce90
- df40af5632
- fb319afbd5
- feff23fba9
- c3124aeb7d
- a7b4850c47
- f5c91adac8
- 33e15827ce
- 73bcac8a55
- 05706182df
- c4fae6a9c1
- 259caacb01
- 6c5ceead09
- 65c1f809b1
- 064de8df8d
- 7522908404
- c21b263b24
- 464b803b42
- bbef9ad014
- 9f7fbb759f
- 16e69a2bd6
- bc4cf2b7a3
- 35bdefba59
- d33f573483
- cf319c502e
- 01ab318e4b
- f9a81a4825
- 3bafb0deb8
.codex/skills/server-manager/SKILL.md (new file, 88 lines)
@@ -0,0 +1,88 @@
---
name: server-manager
description: Use ServerManager's shared local server inventory and ssh.py utility to manage configured SSH, Telnet, SQL, Redis, S3/MinIO, Grafana, Prometheus, and WinRM endpoints by alias without exposing credentials. Use when the user asks to operate on servers managed by ServerManager or when editing ServerManager's Claude/Codex/Gemini integration.
metadata:
  short-description: Safe remote ops through ServerManager aliases
---

# Server Manager

Use this skill for two cases:

1. The user wants work done on a server or service already configured in ServerManager.
2. The user wants to modify ServerManager's CLI/integration layer so Claude/Codex/Gemini can use it safely.

## First Step

Before any server operation:

```bash
$HOME/.server-connections/codex-ssh --list
```

Read the `Type` column before choosing commands. Do not guess the server type.

If the wrapper is missing, run the doctor script for your platform:

```bash
$HOME/.codex/skills/server-manager/scripts/server-manager-doctor.sh
```

On Windows, use:

```bat
%USERPROFILE%\.codex\skills\server-manager\scripts\server-manager-doctor.cmd
```

## Hard Rules

- Never read `~/.server-connections/servers.json`, `settings.json`, or `encryption.py` directly.
- Never use `--list-full`.
- Never use raw `ssh`, `scp`, `rsync`, `redis-cli`, `mysql`, `psql`, `mc`, `aws s3`, or similar tools unless the user explicitly asks to bypass ServerManager.
- Maximum one connection attempt per action. If it times out or fails, report it and stop.
- `ALIAS "command"` is only for `ssh` and `telnet`.
- `rdp` and `vnc` are GUI-only. Do not invent CLI access.
- For S3/MinIO, list buckets and paths before upload, delete, or URL generation.
- Ask for confirmation before destructive actions if the user's intent is not explicit.

## Preferred Entry Points

Use the shared wrapper:

```bash
$HOME/.server-connections/codex-ssh ...
```

It delegates to the installed `ssh.py` backend without requiring a `python` alias.

Safe discovery commands:

```bash
$HOME/.server-connections/codex-ssh --list
$HOME/.server-connections/codex-ssh --info ALIAS
$HOME/.server-connections/codex-ssh --status
```

Read [references/command-matrix.md](references/command-matrix.md) when you need the per-type command matrix.

## Server Operation Workflow

1. Run `--list`.
2. Match the alias using notes/type, not credentials.
3. Pick commands strictly from the server type.
4. Execute exactly one action.
5. Report the result without exposing IPs, logins, passwords, ports, or secrets.

## Working On ServerManager Itself

Read [references/project.md](references/project.md) before changing integration code.

Source-of-truth files:

- `tools/ssh.py`: local CLI used by Claude/Codex
- `tools/skill-ssh.md`: current Claude `/ssh` instructions
- `core/claude_setup.py`: Claude installer logic
- `build.py`: auto-deploys shared CLI files after build
- `README.md` and `CLAUDE.md`: project-level rules and architecture

If you change command semantics in `tools/ssh.py`, update the user-facing instructions alongside it.
.codex/skills/server-manager/references/command-matrix.md (new file, 91 lines)
@@ -0,0 +1,91 @@
# Command Matrix

Always identify the server type first with:

```bash
$HOME/.server-connections/codex-ssh --list
```

## Type To Command Map

| Type | Use | Do Not Use |
| --- | --- | --- |
| `ssh` | `ALIAS "command"`, `--upload`, `--download`, `--ping`, `--install-key` | n/a |
| `telnet` | `ALIAS "command"` | `--upload`, `--download`, `--install-key` |
| `mariadb`, `mssql`, `postgresql` | `--sql`, `--sql-databases`, `--sql-tables` | `ALIAS "command"` |
| `redis` | `--redis`, `--redis-info`, `--redis-keys` | `ALIAS "command"` |
| `s3` | `--s3-buckets`, `--s3-ls`, `--s3-upload`, `--s3-download`, `--s3-delete`, `--s3-url`, `--s3-create-bucket` | `ALIAS "command"`, SSH/SFTP commands |
| `grafana` | `--grafana-dashboards`, `--grafana-alerts` | `ALIAS "command"` |
| `prometheus` | `--prom-query`, `--prom-targets`, `--prom-alerts` | `ALIAS "command"` |
| `winrm` | `--ps`, `--cmd` | `ALIAS "command"` |
| `rdp`, `vnc` | GUI only | all CLI actions |

## Common Safe Commands

```bash
$HOME/.server-connections/codex-ssh --list
$HOME/.server-connections/codex-ssh --info ALIAS
$HOME/.server-connections/codex-ssh --status
$HOME/.server-connections/codex-ssh --set-note ALIAS "description"
```

## SSH And Telnet

```bash
$HOME/.server-connections/codex-ssh ALIAS "command"
$HOME/.server-connections/codex-ssh ALIAS --no-sudo "command"
$HOME/.server-connections/codex-ssh ALIAS --upload "local" //remote/path
$HOME/.server-connections/codex-ssh ALIAS --download //remote/path "local"
$HOME/.server-connections/codex-ssh ALIAS --ping
```

Use double slashes for remote SSH/SFTP paths when working from Git Bash-style environments.

## SQL

```bash
$HOME/.server-connections/codex-ssh --sql ALIAS "SELECT * FROM table LIMIT 10"
$HOME/.server-connections/codex-ssh --sql-databases ALIAS
$HOME/.server-connections/codex-ssh --sql-tables ALIAS [database]
```

## Redis

```bash
$HOME/.server-connections/codex-ssh --redis ALIAS "GET key"
$HOME/.server-connections/codex-ssh --redis-info ALIAS
$HOME/.server-connections/codex-ssh --redis-keys ALIAS "pattern:*"
```

## S3 / MinIO

Before modifying objects:

```bash
$HOME/.server-connections/codex-ssh --s3-buckets ALIAS
$HOME/.server-connections/codex-ssh --s3-ls ALIAS bucket/prefix/
```

Then act:

```bash
$HOME/.server-connections/codex-ssh --s3-upload ALIAS "local" bucket/key
$HOME/.server-connections/codex-ssh --s3-download ALIAS bucket/key "local"
$HOME/.server-connections/codex-ssh --s3-delete ALIAS bucket/key
$HOME/.server-connections/codex-ssh --s3-url ALIAS bucket/key [seconds]
$HOME/.server-connections/codex-ssh --s3-create-bucket ALIAS bucket-name
```

Do not treat S3 as a shell filesystem.

## Grafana / Prometheus / WinRM

```bash
$HOME/.server-connections/codex-ssh --grafana-dashboards ALIAS
$HOME/.server-connections/codex-ssh --grafana-alerts ALIAS
$HOME/.server-connections/codex-ssh --prom-query ALIAS "up"
$HOME/.server-connections/codex-ssh --prom-targets ALIAS
$HOME/.server-connections/codex-ssh --prom-alerts ALIAS
$HOME/.server-connections/codex-ssh --ps ALIAS "Get-Process"
$HOME/.server-connections/codex-ssh --cmd ALIAS "dir"
```
.codex/skills/server-manager/references/project.md (new file, 68 lines)
@@ -0,0 +1,68 @@
# Project Notes

This skill is based on `/home/code/Desktop/CODING/server-manager`.

## What ServerManager Is

ServerManager is a cross-platform desktop GUI built with CustomTkinter. It manages multiple remote endpoint types through one local encrypted inventory:

- SSH / Telnet
- MariaDB / MSSQL / PostgreSQL
- Redis
- S3 / MinIO
- Grafana
- Prometheus
- WinRM
- RDP / VNC launchers

## Core Integration Model

The GUI and CLI share one local backend:

```text
ServerManager GUI <-> ~/.server-connections/servers.json <-> ~/.server-connections/ssh.py
```

The AI never needs raw credentials. It only uses aliases and the local CLI.

## Important Files

- `README.md`: product overview and install flow
- `CLAUDE.md`: project rules, architecture, security, workflow
- `tools/ssh.py`: CLI entry point used by AI tools
- `tools/skill-ssh.md`: current Claude `/ssh` instructions
- `core/claude_setup.py`: installer for shared CLI files plus Claude/Codex/Gemini skills
- `build.py`: auto-deploys `ssh.py`, `encryption.py`, Claude skill, Codex skill, and Gemini skill after builds

## Architectural Shape

- `core/server_store.py`: encrypted storage, CRUD, observers, backups
- `core/connection_factory.py`: type-to-client factory with lazy imports
- `core/*_client.py`: protocol-specific backends
- `gui/app.py`: tab registry, conditional tabs by server type
- `gui/tabs/`: protocol-specific GUI surfaces

## Existing Local Agent Integration

Current setup installs:

- `~/.server-connections/ssh.py`
- `~/.server-connections/encryption.py`
- `~/.claude/commands/ssh.md`
- `~/.codex/skills/server-manager/`
- `~/.server-connections/codex-ssh` or `codex-ssh.cmd`
- a `~/.claude/CLAUDE.md` guidance block

The Codex skill mirrors the same safety model:

- use aliases only
- use the shared local CLI
- never read credentials directly
- choose commands by server type

## Local Findings

- `ssh.py` is executable and uses a `python3` shebang, so Codex does not need a `python` alias.
- `ssh.py` has no `--help`; use `--list`, `--info`, and `--status` for safe discovery.
- The Unix wrapper path covers both Linux and macOS through `codex-ssh-wrapper.sh`.
- Windows-native Codex wrapper support exists through `codex-ssh-wrapper.cmd`.
.codex/skills/server-manager/scripts/codex-ssh-wrapper.cmd (new file, 28 lines)
@@ -0,0 +1,28 @@
@echo off
setlocal

set "SHARED_DIR=%SERVER_MANAGER_SHARED_DIR%"
if "%SHARED_DIR%"=="" set "SHARED_DIR=%USERPROFILE%\.server-connections"
set "SSH_SCRIPT=%SHARED_DIR%\ssh.py"

if not exist "%SSH_SCRIPT%" (
    echo error: missing ssh.py at %SSH_SCRIPT% 1>&2
    echo hint: install ServerManager shared CLI files first 1>&2
    exit /b 1
)

where py >nul 2>&1
if not errorlevel 1 (
    py -3 "%SSH_SCRIPT%" %*
    exit /b
)

where python >nul 2>&1
if not errorlevel 1 (
    python "%SSH_SCRIPT%" %*
    exit /b
)

echo error: neither py nor python is available in PATH 1>&2
echo hint: install the Python launcher or use ServerManager Setup on a machine with Python present 1>&2
exit /b 1
.codex/skills/server-manager/scripts/codex-ssh-wrapper.sh (new file, 13 lines)
@@ -0,0 +1,13 @@
#!/usr/bin/env bash
set -euo pipefail

shared_dir="${SERVER_MANAGER_SHARED_DIR:-$HOME/.server-connections}"
ssh_script="${shared_dir}/ssh.py"

if [[ ! -x "$ssh_script" ]]; then
    echo "error: missing executable ssh.py at ${ssh_script}" >&2
    echo "hint: install ServerManager's shared CLI files first" >&2
    exit 1
fi

exec "$ssh_script" "$@"
.codex/skills/server-manager/scripts/server-manager-doctor.cmd (new file, 41 lines)
@@ -0,0 +1,41 @@
@echo off
setlocal

set "SHARED_DIR=%SERVER_MANAGER_SHARED_DIR%"
if "%SHARED_DIR%"=="" set "SHARED_DIR=%USERPROFILE%\.server-connections"
set "CODEX_HOME=%USERPROFILE%\.codex\skills\server-manager"

set "FAILED=0"

if exist "%SHARED_DIR%\ssh.py" (
    echo ok: found %SHARED_DIR%\ssh.py
) else (
    echo error: missing %SHARED_DIR%\ssh.py 1>&2
    set "FAILED=1"
)

if exist "%SHARED_DIR%\encryption.py" (
    echo ok: found %SHARED_DIR%\encryption.py
) else (
    echo error: missing %SHARED_DIR%\encryption.py 1>&2
    set "FAILED=1"
)

if exist "%SHARED_DIR%\codex-ssh.cmd" (
    echo ok: found %SHARED_DIR%\codex-ssh.cmd
) else (
    echo error: missing %SHARED_DIR%\codex-ssh.cmd 1>&2
    set "FAILED=1"
)

if exist "%CODEX_HOME%\SKILL.md" (
    echo ok: found %CODEX_HOME%\SKILL.md
) else (
    echo error: missing %CODEX_HOME%\SKILL.md 1>&2
    set "FAILED=1"
)

if "%FAILED%"=="1" exit /b 1

echo ok: ServerManager Codex integration looks installed
exit /b 0
.codex/skills/server-manager/scripts/server-manager-doctor.sh (new file, 41 lines)
@@ -0,0 +1,41 @@
#!/usr/bin/env bash
set -euo pipefail

shared_dir="${SERVER_MANAGER_SHARED_DIR:-$HOME/.server-connections}"
ssh_script="${shared_dir}/ssh.py"
encryption_module="${shared_dir}/encryption.py"
wrapper="${shared_dir}/codex-ssh"

status=0

check_file() {
    local path="$1"
    if [[ -f "$path" ]]; then
        printf '[ok] file %s\n' "$path"
    else
        printf '[missing] file %s\n' "$path" >&2
        status=1
    fi
}

check_exec() {
    local path="$1"
    if [[ -x "$path" ]]; then
        printf '[ok] executable %s\n' "$path"
    else
        printf '[missing] executable %s\n' "$path" >&2
        status=1
    fi
}

check_file "$encryption_module"
check_exec "$ssh_script"
check_exec "$wrapper"

if [[ -d "/home/code/CODING/server-manager" ]]; then
    printf '[ok] source repo /home/code/CODING/server-manager\n'
else
    printf '[warn] source repo /home/code/CODING/server-manager not found\n'
fi

exit "$status"
.gemini/settings.json (new file, 8 lines)
@@ -0,0 +1,8 @@
{
  "context": {
    "fileName": "GEMINI.md"
  },
  "experimental": {
    "enableAgents": true
  }
}
.gemini/skills/server-manager/SKILL.md (new file, 84 lines)
@@ -0,0 +1,84 @@
---
name: server-manager
description: Use ServerManager's shared local server inventory and ssh.py utility to manage configured SSH, Telnet, SQL, Redis, S3/MinIO, Grafana, Prometheus, and WinRM endpoints by alias without exposing credentials. Use when the user asks to operate on servers managed by ServerManager or when editing ServerManager's Claude/Codex/Gemini integration.
---

# Server Manager

Use this skill for two cases:

1. The user wants work done on a server or service already configured in ServerManager.
2. The user wants to modify ServerManager's CLI/integration layer so Claude/Codex/Gemini can use it safely.

## First Step

Before any server operation:

```bash
$HOME/.server-connections/gemini-ssh --list
```

Read the `Type` column before choosing commands. Do not guess the server type.

If the wrapper is missing, run the doctor script for your platform:

```bash
$HOME/.gemini/skills/server-manager/scripts/server-manager-gemini-doctor.sh
```

On Windows, use:

```bat
%USERPROFILE%\.gemini\skills\server-manager\scripts\server-manager-gemini-doctor.cmd
```

## Hard Rules

- Never read `~/.server-connections/servers.json`, `settings.json`, or `encryption.py` directly.
- Never use `--list-full`.
- Never use raw `ssh`, `scp`, `rsync`, `redis-cli`, `mysql`, `psql`, `mc`, `aws s3`, or similar tools unless the user explicitly asks to bypass ServerManager.
- Maximum one connection attempt per action. If it times out or fails, report it and stop.
- `ALIAS "command"` is only for `ssh` and `telnet`.
- `rdp` and `vnc` are GUI-only. Do not invent CLI access.
- For S3/MinIO, list buckets and paths before upload, delete, or URL generation.
- Ask for confirmation before destructive actions if the user's intent is not explicit.

## Preferred Entry Points

Use the shared wrapper:

```bash
$HOME/.server-connections/gemini-ssh ...
```

Safe discovery commands:

```bash
$HOME/.server-connections/gemini-ssh --list
$HOME/.server-connections/gemini-ssh --info ALIAS
$HOME/.server-connections/gemini-ssh --status
```

Read [references/command-matrix.md](references/command-matrix.md) when you need the per-type command matrix.

## Server Operation Workflow

1. Run `--list`.
2. Match the alias using notes/type, not credentials.
3. Pick commands strictly from the server type.
4. Execute exactly one action.
5. Report the result without exposing IPs, logins, passwords, ports, or secrets.

## Working On ServerManager Itself

Read [references/project.md](references/project.md) before changing integration code.

Source-of-truth files:

- `tools/ssh.py`: local CLI used by AI tools
- `tools/skill-ssh.md`: current Claude `/ssh` instructions
- `core/claude_setup.py`: installer for shared CLI files and AI skills
- `build.py`: auto-deploys `ssh.py`, `encryption.py`, Claude/Codex/Gemini skills after builds
- `README.md`, `CLAUDE.md`, and `GEMINI.md`: project-level rules and architecture

If you change command semantics in `tools/ssh.py`, update the user-facing instructions alongside it.
.gemini/skills/server-manager/references/command-matrix.md (new file, 91 lines)
@@ -0,0 +1,91 @@
# Command Matrix

Always identify the server type first with:

```bash
$HOME/.server-connections/gemini-ssh --list
```

## Type To Command Map

| Type | Use | Do Not Use |
| --- | --- | --- |
| `ssh` | `ALIAS "command"`, `--upload`, `--download`, `--ping`, `--install-key` | n/a |
| `telnet` | `ALIAS "command"` | `--upload`, `--download`, `--install-key` |
| `mariadb`, `mssql`, `postgresql` | `--sql`, `--sql-databases`, `--sql-tables` | `ALIAS "command"` |
| `redis` | `--redis`, `--redis-info`, `--redis-keys` | `ALIAS "command"` |
| `s3` | `--s3-buckets`, `--s3-ls`, `--s3-upload`, `--s3-download`, `--s3-delete`, `--s3-url`, `--s3-create-bucket` | `ALIAS "command"`, SSH/SFTP commands |
| `grafana` | `--grafana-dashboards`, `--grafana-alerts` | `ALIAS "command"` |
| `prometheus` | `--prom-query`, `--prom-targets`, `--prom-alerts` | `ALIAS "command"` |
| `winrm` | `--ps`, `--cmd` | `ALIAS "command"` |
| `rdp`, `vnc` | GUI only | all CLI actions |

## Common Safe Commands

```bash
$HOME/.server-connections/gemini-ssh --list
$HOME/.server-connections/gemini-ssh --info ALIAS
$HOME/.server-connections/gemini-ssh --status
$HOME/.server-connections/gemini-ssh --set-note ALIAS "description"
```

## SSH And Telnet

```bash
$HOME/.server-connections/gemini-ssh ALIAS "command"
$HOME/.server-connections/gemini-ssh ALIAS --no-sudo "command"
$HOME/.server-connections/gemini-ssh ALIAS --upload "local" //remote/path
$HOME/.server-connections/gemini-ssh ALIAS --download //remote/path "local"
$HOME/.server-connections/gemini-ssh ALIAS --ping
```

Use double slashes for remote SSH/SFTP paths when working from Git Bash-style environments.

## SQL

```bash
$HOME/.server-connections/gemini-ssh --sql ALIAS "SELECT * FROM table LIMIT 10"
$HOME/.server-connections/gemini-ssh --sql-databases ALIAS
$HOME/.server-connections/gemini-ssh --sql-tables ALIAS [database]
```

## Redis

```bash
$HOME/.server-connections/gemini-ssh --redis ALIAS "GET key"
$HOME/.server-connections/gemini-ssh --redis-info ALIAS
$HOME/.server-connections/gemini-ssh --redis-keys ALIAS "pattern:*"
```

## S3 / MinIO

Before modifying objects:

```bash
$HOME/.server-connections/gemini-ssh --s3-buckets ALIAS
$HOME/.server-connections/gemini-ssh --s3-ls ALIAS bucket/prefix/
```

Then act:

```bash
$HOME/.server-connections/gemini-ssh --s3-upload ALIAS "local" bucket/key
$HOME/.server-connections/gemini-ssh --s3-download ALIAS bucket/key "local"
$HOME/.server-connections/gemini-ssh --s3-delete ALIAS bucket/key
$HOME/.server-connections/gemini-ssh --s3-url ALIAS bucket/key [seconds]
$HOME/.server-connections/gemini-ssh --s3-create-bucket ALIAS bucket-name
```

Do not treat S3 as a shell filesystem.

## Grafana / Prometheus / WinRM

```bash
$HOME/.server-connections/gemini-ssh --grafana-dashboards ALIAS
$HOME/.server-connections/gemini-ssh --grafana-alerts ALIAS
$HOME/.server-connections/gemini-ssh --prom-query ALIAS "up"
$HOME/.server-connections/gemini-ssh --prom-targets ALIAS
$HOME/.server-connections/gemini-ssh --prom-alerts ALIAS
$HOME/.server-connections/gemini-ssh --ps ALIAS "Get-Process"
$HOME/.server-connections/gemini-ssh --cmd ALIAS "dir"
```
.gemini/skills/server-manager/references/project.md (new file, 73 lines)
@@ -0,0 +1,73 @@
# Project Notes

This skill is based on `/home/code/Desktop/CODING/server-manager`.

## What ServerManager Is

ServerManager is a cross-platform desktop GUI built with CustomTkinter. It manages multiple remote endpoint types through one local encrypted inventory:

- SSH / Telnet
- MariaDB / MSSQL / PostgreSQL
- Redis
- S3 / MinIO
- Grafana
- Prometheus
- WinRM
- RDP / VNC launchers

## Core Integration Model

The GUI and CLI share one local backend:

```text
ServerManager GUI <-> ~/.server-connections/servers.json <-> ~/.server-connections/ssh.py
```

The AI never needs raw credentials. It only uses aliases and the local CLI.

## Important Files

- `README.md`: product overview and install flow
- `CLAUDE.md`: project rules, architecture, security, workflow
- `GEMINI.md`: Gemini-native project contract
- `tools/ssh.py`: CLI entry point used by AI tools
- `tools/skill-ssh.md`: current Claude `/ssh` instructions
- `core/claude_setup.py`: installer for shared CLI files plus Claude/Codex/Gemini skill deployment
- `build.py`: auto-deploys `ssh.py`, `encryption.py`, Claude skill, Codex skill, and Gemini skill after builds

## Architectural Shape

- `core/server_store.py`: encrypted storage, CRUD, observers, backups
- `core/connection_factory.py`: type-to-client factory with lazy imports
- `core/*_client.py`: protocol-specific backends
- `gui/app.py`: tab registry, conditional tabs by server type
- `gui/tabs/`: protocol-specific GUI surfaces

## Existing Local Agent Integration

Current setup installs:

- `~/.server-connections/ssh.py`
- `~/.server-connections/encryption.py`
- `~/.claude/commands/ssh.md`
- `~/.codex/skills/server-manager/`
- `~/.gemini/skills/server-manager/`
- `~/.agents/skills/server-manager/` (cross-tool mirror)
- `~/.server-connections/codex-ssh` or `codex-ssh.cmd`
- `~/.server-connections/gemini-ssh` or `gemini-ssh.cmd`
- a `~/.claude/CLAUDE.md` guidance block
- a `~/.gemini/GEMINI.md` guidance block

The Gemini skill mirrors the same safety model:

- use aliases only
- use the shared local CLI
- never read credentials directly
- choose commands by server type

## Local Findings

- `ssh.py` is executable and uses a `python3` shebang, so Gemini does not need a `python` alias.
- `ssh.py` has no `--help`; use `--list`, `--info`, and `--status` for safe discovery.
- The Unix wrapper path covers both Linux and macOS through `gemini-ssh-wrapper.sh`.
- Windows-native Gemini wrapper support exists through `gemini-ssh-wrapper.cmd`.
.gemini/skills/server-manager/scripts/gemini-ssh-wrapper.cmd (new file, 11 lines)
@@ -0,0 +1,11 @@
@echo off
setlocal
set "SHARED_DIR=%SERVER_MANAGER_SHARED_DIR%"
if "%SHARED_DIR%"=="" set "SHARED_DIR=%USERPROFILE%\.server-connections"
set "SSH_SCRIPT=%SHARED_DIR%\ssh.py"
if not exist "%SSH_SCRIPT%" (
    echo error: missing ssh.py at %SSH_SCRIPT% 1>&2
    echo hint: install ServerManager's shared CLI files first 1>&2
    exit /b 1
)
"%SSH_SCRIPT%" %*
.gemini/skills/server-manager/scripts/gemini-ssh-wrapper.sh (new file, 13 lines)
@@ -0,0 +1,13 @@
#!/usr/bin/env bash
set -euo pipefail

shared_dir="${SERVER_MANAGER_SHARED_DIR:-$HOME/.server-connections}"
ssh_script="${shared_dir}/ssh.py"

if [[ ! -x "$ssh_script" ]]; then
    echo "error: missing executable ssh.py at ${ssh_script}" >&2
    echo "hint: install ServerManager's shared CLI files first" >&2
    exit 1
fi

exec "$ssh_script" "$@"
.gemini/skills/server-manager/scripts/server-manager-gemini-doctor.cmd (new file, 39 lines)
@@ -0,0 +1,39 @@
@echo off
setlocal
set "SHARED_DIR=%SERVER_MANAGER_SHARED_DIR%"
if "%SHARED_DIR%"=="" set "SHARED_DIR=%USERPROFILE%\.server-connections"
set "SSH_SCRIPT=%SHARED_DIR%\ssh.py"
set "ENCRYPTION=%SHARED_DIR%\encryption.py"
set "WRAPPER=%SHARED_DIR%\gemini-ssh.cmd"
set "SKILL=%USERPROFILE%\.gemini\skills\server-manager\SKILL.md"
set "STATUS=0"

if exist "%ENCRYPTION%" (
    echo [ok] file %ENCRYPTION%
) else (
    echo [missing] file %ENCRYPTION% 1>&2
    set "STATUS=1"
)

if exist "%SSH_SCRIPT%" (
    echo [ok] file %SSH_SCRIPT%
) else (
    echo [missing] file %SSH_SCRIPT% 1>&2
    set "STATUS=1"
)

if exist "%WRAPPER%" (
    echo [ok] file %WRAPPER%
) else (
    echo [missing] file %WRAPPER% 1>&2
    set "STATUS=1"
)

if exist "%SKILL%" (
    echo [ok] file %SKILL%
) else (
    echo [missing] file %SKILL% 1>&2
    set "STATUS=1"
)

exit /b %STATUS%
.gemini/skills/server-manager/scripts/server-manager-gemini-doctor.sh (new file, 37 lines)
@@ -0,0 +1,37 @@
#!/usr/bin/env bash
set -euo pipefail

shared_dir="${SERVER_MANAGER_SHARED_DIR:-$HOME/.server-connections}"
ssh_script="${shared_dir}/ssh.py"
encryption_module="${shared_dir}/encryption.py"
wrapper="${shared_dir}/gemini-ssh"
skill_dir="$HOME/.gemini/skills/server-manager"

status=0

check_file() {
    local path="$1"
    if [[ -f "$path" ]]; then
        printf '[ok] file %s\n' "$path"
    else
        printf '[missing] file %s\n' "$path" >&2
        status=1
    fi
}

check_exec() {
    local path="$1"
    if [[ -x "$path" ]]; then
        printf '[ok] executable %s\n' "$path"
    else
        printf '[missing] executable %s\n' "$path" >&2
        status=1
    fi
}

check_file "$encryption_module"
check_exec "$ssh_script"
check_exec "$wrapper"
check_file "$skill_dir/SKILL.md"

exit "$status"
BUG_REPORT_CLAUDE_CODE_PNG_CRASH.md (new file, 142 lines)
@@ -0,0 +1,142 @@
# Bug Report: Claude Code CLI crashes when reading large image files

## Summary

The `Read` tool in Claude Code CLI fails when reading images larger than ~25K base64 tokens (~150KB file size). Small images work fine. The root cause is in the `DP1` image compression pipeline: when a large image goes through compression, the resulting API content block ends up with `source: {type: "base64"}` but **missing both `data` and `media_type` fields**. This causes an unrecoverable API 400 error.

## Environment

- **Claude Code CLI:** `@anthropic-ai/claude-code@2.1.70`
- **OS:** Windows 10 Pro for Workstations 10.0.19045
- **Node.js:** v24.13.1
- **sharp:** 0.34.5 (manually installed, works correctly)

## Root Cause Analysis

### The Size Threshold

Images are read by `Nv8()`, which calls `q01()` to create the result. After `q01()`, a size check runs:

```javascript
if (Math.ceil($.file.base64.length * 0.125) > q) // q = Tv8() = 25000 tokens
```

- **Small images** (< ~150KB file / < 25K base64 tokens) skip `DP1` and are returned directly from `q01()` → **WORKS**
- **Large images** (> ~150KB file / > 25K base64 tokens) enter the `DP1` compression path → **CRASHES**
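The ~150KB file-size figure follows from the check itself. A quick sanity-check sketch (the 25,000-token limit and the 0.125 tokens-per-character factor come from the decompiled code above; the derived byte figures are my own arithmetic):

```javascript
// The size check estimates 1 token per 8 base64 characters (0.125).
const TOKEN_LIMIT = 25000;                  // q = Tv8()
const maxBase64Chars = TOKEN_LIMIT / 0.125; // 200,000 base64 chars allowed

// Base64 encodes 3 raw bytes as 4 characters, so the raw-file threshold is:
const maxFileBytes = (maxBase64Chars * 3) / 4; // 150,000 bytes ≈ 150KB

// A ~751KB PNG (~1,001,333 base64 chars) lands far past the limit:
const tokens = Math.ceil(1001333 * 0.125); // 125,167 tokens
console.log(maxFileBytes, tokens > TOKEN_LIMIT); // 150000 true
```

This matches the observed boundary: every crashing file in the test matrix is above 150KB, every working file below it.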
### What happens in the DP1 path

When the image exceeds the token limit, `DP1()` is called to compress it. `DP1` uses sharp to resize/recompress and returns `{base64, mediaType, originalSize}`. The code then returns:

```javascript
return {type: "image", file: {base64: H.base64, type: H.mediaType, originalSize: z}}
```

In isolation, this looks correct. `H.mediaType` is `"image/jpeg"` (from `vp6()` inside `DP1`).

### Where it actually breaks

The tool result mapper converts this to an API content block:

```javascript
case "image": return {
  tool_use_id: q,
  type: "tool_result",
  content: [{
    type: "image",
    source: {type: "base64", data: A.file.base64, media_type: A.file.type}
  }]
};
```

**However**, between the mapper output and the actual API request, the image content block gets **stripped**. The API receives:

```json
{"type": "image", "source": {"type": "base64"}}
```

Both `data` and `media_type` are absent. `JSON.stringify` silently drops `undefined` properties, so if both become `undefined` at any point, the serialized JSON omits them entirely.
|
||||
### Evidence from transcript analysis

The session transcript (`.jsonl` output) captured the exact message content sent to the API:

```json
{
  "type": "user",
  "content": [{
    "tool_use_id": "toolu_01NmuSjPErhBfbtoV8RBrJip",
    "type": "tool_result",
    "content": [{"type": "image", "source": {"type": "base64"}}]
  }]
}
```

This confirms that `data` and `media_type` are both missing at the API call level.
### The actual root cause (suspected)

The image data stripping likely occurs in the **message normalization/storage layer** between the tool result mapper and the API call. When conversation messages are stored in memory (the internal `D` array or conversation state), large base64 image data may be:

1. Stripped for memory efficiency
2. Moved to a separate image attachment store (referenced by `imagePasteIds`)
3. Lost during `structuredClone` or message serialization

The reconstruction step that should restore the image data before the API call **fails for tool_result image blocks**, possibly because it only handles top-level image blocks (from user pastes) but not images nested inside `tool_result.content[]`.
## Test Results

| File | Size | Base64 tokens | DP1 path | Result |
|------|------|---------------|----------|--------|
| photo.jpg | 25KB | ~4,250 | No | **Works** |
| test_tiny.png | 98B | ~16 | No | **Works** |
| test_medium.png | 751KB | ~125,000 | Yes | **Crashes** |
| screenshot_gui.png | 387KB | ~64,500 | Yes | **Crashes** |
## Severity: Critical

- **Session-killing:** the corrupted message poisons the entire conversation context
- **No recovery:** every subsequent API call fails with 400
- **Affects subagents too:** the Agent tool crashes, but the main session survives
- **Size-dependent:** only images > ~150KB trigger the bug
## Patches Applied

### Patch 1: Nv8 try/catch wrapper (`PATCHED_NV8_SAFE_IMAGE_READ`)

Wraps the entire `Nv8` function in try/catch. On failure, returns a text error message instead of corrupted binary. Also adds a `||"image/png"` fallback on `H.mediaType` in the DP1 path.

### Patch 2: Image mapper media_type fallback (`PATCHED_IMAGE_MEDIA_TYPE`)

Adds a `||"image/png"` fallback to `media_type` in the tool result mapper. Prevents `undefined` from being serialized as an absent field.

### Effectiveness

- Patches only take effect after restarting Claude Code (cli.js is loaded once at startup)
- Patches fix the `media_type` issue but may NOT fix the missing `data` issue
- The underlying cause (image data being stripped from stored messages) needs an upstream fix
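The intent of both patches can be condensed into one defensive rule: never emit an image block with a missing field. A minimal sketch in Python (a hypothetical helper, not the actual patch code, which is JavaScript inside cli.js):

```python
def safe_image_block(base64_data, media_type):
    """Return an API-safe content block; degrade to text instead of emitting a broken block."""
    if not base64_data:
        # Missing data is unrecoverable here: report it as text rather than poison the session.
        return {"type": "text", "text": "[image read failed: empty data]"}
    return {
        "type": "image",
        # Mirrors the ||"image/png" fallback from Patch 2.
        "source": {"type": "base64", "data": base64_data, "media_type": media_type or "image/png"},
    }
```

The first branch is what Patch 1 approximates with its try/catch; the fallback in the second is exactly what Patch 2 adds.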
## Patcher Tool

```bash
node tools/patch_claude_code.js           # Apply all patches
node tools/patch_claude_code.js --check   # Check status
node tools/patch_claude_code.js --revert  # Revert to backup
```

After updating Claude Code (`npm update -g @anthropic-ai/claude-code`), re-run the patcher.
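The patcher's core loop is marker-based search and replace with a one-time backup. A simplified sketch of that flow (assumed behavior; the function and backup names here are illustrative, not taken from `patch_claude_code.js`):

```python
import shutil
from pathlib import Path

def apply_patch(cli: Path, marker: str, old: str, new: str) -> bool:
    """Apply one patch to cli.js, keeping a pristine backup for --revert."""
    text = cli.read_text(encoding="utf-8")
    if marker in text:
        return False  # already patched; safe to re-run
    backup = cli.parent / (cli.name + ".bak")
    if not backup.exists():
        shutil.copy2(cli, backup)  # one-time backup of the unpatched file
    cli.write_text(text.replace(old, f"/*{marker}*/{new}"), encoding="utf-8")
    return True
```

The marker check is why re-running the tool after an update is safe: an already-patched file is left untouched, while a freshly reinstalled cli.js gets patched and backed up again.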
## Workarounds

1. **Use a subagent for ALL image reading** — it crashes in isolation, and the main session survives
2. **Resize large images before reading** — keep them under ~150KB
3. **Inspect images only via the Bash tool** — `file screenshot.png` for metadata, avoiding the actual content
## Files Referenced

- **Patcher:** `tools/patch_claude_code.js`
- **CLI entry:** `node_modules/@anthropic-ai/claude-code/cli.js` (minified, ~13K lines)
- **Key functions:** `Nv8` (image reader), `DP1` (compressor), `q01` (result builder), `ig` (sharp wrapper), `mapToolResultToToolResultBlockParam` (API mapper)
## Report Info

- **Date:** 2026-03-06
- **Version:** Claude Code 2.1.70
- **Reproducible:** 100% on Windows with any image > ~150KB
10
CHANGELOG.md
@@ -1,5 +1,15 @@
# Changelog

## [Unreleased]

### Added
- Add Codex integration to Setup tab with dedicated install buttons and status rows
- Add packaged Codex skill deployment and local `codex-ssh` wrapper installation
- Add Windows `.cmd` wrappers for Codex skill installation/runtime

### Changed
- Extend `build.py` auto-deploy to sync Codex artifacts alongside Claude artifacts
- Update docs and i18n strings to describe Claude Code + Codex setup flow

## [1.8.24] - 2026-02-24
23
CLAUDE.md
@@ -4,6 +4,20 @@
ServerManager is a **cross-platform** desktop GUI (CustomTkinter) for managing remote servers of **any type**. It runs on **Windows and Linux**. Claude Code manages servers through the `/ssh` skill, knowing only aliases.

## CRITICAL: Read images ONLY through Agent

**Claude Code CLI v2.1.70 bug:** the Read tool crashes the session on images >150KB.
Large images go through DP1 compression, where `data` and `media_type` are lost → API 400 → the session is dead for good.

**RULES:**
- **NEVER** call `Read` directly on .png/.jpg/.jpeg/.gif/.webp
- **NEVER** take screenshots via `mcp__chrome-devtools__take_screenshot`: the result is also an image and crashes the same way
- **ALWAYS** read images through the `Agent` tool (subagent): it fails in isolation and the main session survives
- Small images (<150KB) work, but **do not risk it**: route everything through the agent
- Do NOT test image fixes in the main context, even after a "fix"

Details: `BUG_REPORT_CLAUDE_CODE_PNG_CRASH.md`

## CROSS-PLATFORM: MANDATORY

The application builds and runs on **Windows** and **Linux**. For any code changes:
@@ -26,7 +40,7 @@ ServerManager — **кроссплатформенное** Desktop GUI (CustomTk
|
||||
| grafana | `grafana_client.py` (requests) | Dashboards, Info, Setup | `--grafana-dashboards`, `--grafana-alerts` |
|
||||
| prometheus | `prometheus_client.py` (requests) | Metrics, Info, Setup | `--prom-query`, `--prom-targets`, `--prom-alerts` |
|
||||
| winrm | `winrm_client.py` (pywinrm) | PowerShell, Info, Setup | `--ps`, `--cmd` |
|
||||
| s3 | `s3_client.py` (boto3) | Objects, Info, Setup | `--s3-buckets`, `--s3-ls`, `--s3-upload`, `--s3-download`, `--s3-delete` |
|
||||
| s3 | `s3_client.py` (boto3) | Objects, Info, Setup | `--s3-buckets`, `--s3-ls`, `--s3-upload`, `--s3-download`, `--s3-delete`, `--s3-url` |
|
||||
| rdp/vnc | `remote_desktop.py` | Launch, Info, Setup | — (запуск внешнего клиента) |
|
||||
|
||||
## SECURITY
@@ -139,6 +153,13 @@ tools/
|
||||
/ssh --redis ALIAS "GET key" # Redis-команда
|
||||
/ssh --redis-info ALIAS # Redis INFO
|
||||
/ssh --redis-keys ALIAS "pattern" # SCAN ключей
|
||||
# S3 / MinIO
|
||||
/ssh --s3-buckets ALIAS # Список бакетов
|
||||
/ssh --s3-ls ALIAS bucket[/prefix] # Список объектов
|
||||
/ssh --s3-upload ALIAS local bucket/key # Upload файла
|
||||
/ssh --s3-download ALIAS bucket/key local # Download файла
|
||||
/ssh --s3-delete ALIAS bucket/key # Удалить объект
|
||||
/ssh --s3-url ALIAS bucket/key [SEC] # Presigned URL (по умолчанию 1 час)
|
||||
# Grafana / Prometheus
|
||||
/ssh --grafana-dashboards ALIAS # Дашборды
|
||||
/ssh --prom-query ALIAS "up" # PromQL
|
||||
|
||||
409
CODEX_SKILL_SETUP.md
Normal file
@@ -0,0 +1,409 @@
# Deploying the Codex Skill for ServerManager

This document describes the current state of the `ServerManager -> Codex` integration: automatic and manual deployment, verification, and all known edge cases.

Supported deployment targets for this integration:

- Linux
- macOS
- Windows

## What exactly gets deployed

The Codex integration consists of three layers:

1. Shared local backend:
   - `~/.server-connections/ssh.py`
   - `~/.server-connections/encryption.py`
   - `~/.server-connections/servers.json`
2. Codex skill package:
   - `~/.codex/skills/server-manager/`
3. A safe wrapper for calling the backend from Codex:
   - `~/.server-connections/codex-ssh` on Linux/macOS
   - `~/.server-connections/codex-ssh.cmd` on Windows

The skill sources live here in the repository:

- [SKILL.md](/home/code/Desktop/CODING/server-manager/.codex/skills/server-manager/SKILL.md)
- [command-matrix.md](/home/code/Desktop/CODING/server-manager/.codex/skills/server-manager/references/command-matrix.md)
- [project.md](/home/code/Desktop/CODING/server-manager/.codex/skills/server-manager/references/project.md)
- [server-manager-doctor.sh](/home/code/Desktop/CODING/server-manager/.codex/skills/server-manager/scripts/server-manager-doctor.sh)
- [server-manager-doctor.cmd](/home/code/Desktop/CODING/server-manager/.codex/skills/server-manager/scripts/server-manager-doctor.cmd)
- [codex-ssh-wrapper.sh](/home/code/Desktop/CODING/server-manager/.codex/skills/server-manager/scripts/codex-ssh-wrapper.sh)
- [codex-ssh-wrapper.cmd](/home/code/Desktop/CODING/server-manager/.codex/skills/server-manager/scripts/codex-ssh-wrapper.cmd)
## How it works

The security model is the same as for the Claude integration:

```text
Codex skill -> ~/.server-connections/codex-ssh -> ~/.server-connections/ssh.py -> encrypted servers.json
```

The key idea:

- Codex sees only aliases and safe command results.
- `ssh.py` reads credentials from the local encrypted store itself.
- Codex must not read `servers.json`, `settings.json`, or `encryption.py` directly.
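In Python terms, the wrapper amounts to little more than resolving the backend path and exec'ing it. A hypothetical sketch (the shipped wrapper is a shell/`.cmd` script; this only illustrates the chain above):

```python
import os
import sys

def backend_path(home: str) -> str:
    """Resolve the shared ssh.py backend inside a user's home."""
    return os.path.join(home, ".server-connections", "ssh.py")

def run(argv: list) -> None:
    backend = backend_path(os.path.expanduser("~"))
    if not os.path.isfile(backend):
        sys.exit("codex-ssh: ssh.py not found; run Setup -> Install Everything")
    # ssh.py carries an executable bit and a python3 shebang, so exec it directly;
    # credentials never pass through this process's arguments or output.
    os.execv(backend, [backend] + list(argv))
```

Because the wrapper only forwards arguments and never reads the encrypted store, nothing secret can leak into the Codex transcript.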
## What is already automated

The Codex integration is now part of the product setup flow:

- `core/claude_setup.py` installs `ssh.py`, `encryption.py`, `~/.claude/commands/ssh.md`, `~/.codex/skills/server-manager/`, the `~/.server-connections/codex-ssh` wrapper, and a block in `~/.claude/CLAUDE.md`
- the `Setup` tab in the GUI shows separate status rows for the Claude skill, the Codex skill, and the Codex wrapper
- after a build, `build.py` automatically syncs the Claude and Codex artifacts into the local runtime

The platform split is:

- Linux/macOS: `codex-ssh-wrapper.sh` and `server-manager-doctor.sh`
- Windows: `codex-ssh-wrapper.cmd` and `server-manager-doctor.cmd`

Manual installation remains useful as a fallback path for targeted repair or offline debugging.
## Prerequisites

Before installing the Codex skill, the following must already exist or be installed via `Setup`:

1. `~/.server-connections/ssh.py`
2. `~/.server-connections/encryption.py`
3. `~/.server-connections/servers.json`
4. the `codex` CLI

Verification:

```bash
ls -la ~/.server-connections
codex --help
```

If `~/.server-connections/ssh.py` is missing:

1. Open the ServerManager GUI
2. Go to `Setup`
3. Click `Install Everything`

This installs the backend, the Claude skill, and the Codex skill in full.
## Recommended path: install via the GUI

1. Open ServerManager
2. Go to `Setup`
3. Click `Install Everything`
4. Verify that green statuses appear for:
   - `ssh.py`
   - `Encryption module`
   - `Claude /ssh skill`
   - `Codex skill`
   - `Codex wrapper`
   - `SSH key`

For targeted repair, use the separate `Claude skill` and `Codex skill` buttons on the same tab.
## Ручная установка Codex Skill
|
||||
|
||||
### 1. Скопировать skill package в глобальный Codex home
|
||||
|
||||
```bash
|
||||
mkdir -p ~/.codex/skills
|
||||
cp -R .codex/skills/server-manager ~/.codex/skills/server-manager
|
||||
```
|
||||
|
||||
### 2. Установить wrapper в shared runtime directory
|
||||
|
||||
Linux/macOS:
|
||||
|
||||
```bash
|
||||
install -m 755 .codex/skills/server-manager/scripts/codex-ssh-wrapper.sh ~/.server-connections/codex-ssh
|
||||
```
|
||||
|
||||
Windows:
|
||||
|
||||
```bat
|
||||
copy .codex\skills\server-manager\scripts\codex-ssh-wrapper.cmd %USERPROFILE%\.server-connections\codex-ssh.cmd
|
||||
```
|
||||
|
||||
### 3. Проверить doctor script
|
||||
|
||||
Linux/macOS:
|
||||
|
||||
```bash
|
||||
~/.codex/skills/server-manager/scripts/server-manager-doctor.sh
|
||||
```
|
||||
|
||||
Windows:
|
||||
|
||||
```bat
|
||||
%USERPROFILE%\.codex\skills\server-manager\scripts\server-manager-doctor.cmd
|
||||
```
|
||||
|
||||
Ожидается:
|
||||
|
||||
- `ssh.py` найден
|
||||
- `encryption.py` найден
|
||||
- `codex-ssh` executable
|
||||
|
||||
### 4. Проверить wrapper без раскрытия credentials
|
||||
|
||||
Linux/macOS:
|
||||
|
||||
```bash
|
||||
~/.server-connections/codex-ssh --list
|
||||
```
|
||||
|
||||
Windows:
|
||||
|
||||
```bat
|
||||
%USERPROFILE%\.server-connections\codex-ssh.cmd --list
|
||||
```
|
||||
|
||||
Это безопасная базовая проверка. Она должна вывести список алиасов и типов серверов.
|
||||
|
||||
### 5. Перезапустить Codex
|
||||
|
||||
Если у вас уже была открыта интерактивная Codex session, её нужно перезапустить. Новый skill обычно подхватывается новым процессом Codex, а не уже живой сессией.
|
||||
|
||||
## How to verify Codex actually sees the skill

The most reliable way:

```bash
codex exec --skip-git-repo-check -s read-only -C /tmp \
  "A user asks: Using the locally installed ServerManager integration, what is the safest first command to enumerate configured servers? Reply with only the command."
```

If the skill was picked up correctly, Codex should read `~/.codex/skills/server-manager/SKILL.md` on its own and answer:

```bash
$HOME/.server-connections/codex-ssh --list
```
## What Codex should do through the skill

The correct workflow for any server operation:

1. Run `--list` first
2. Read the `Type` column
3. Choose the command strictly by server type
4. Perform exactly one connection/one action
5. Return the result without IPs, logins, passwords, or ports

Safe discovery commands:

```bash
$HOME/.server-connections/codex-ssh --list
$HOME/.server-connections/codex-ssh --info ALIAS
$HOME/.server-connections/codex-ssh --status
```
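Step 3 above ("choose the command strictly by server type") can be illustrated with a small lookup table. The flags below are ones documented elsewhere in this repo; the helper itself is hypothetical and not part of `ssh.py`:

```python
# Read-only first probes per server type; anything unknown falls back to --info.
SAFE_FIRST_COMMAND = {
    "redis": "--redis-info",
    "s3": "--s3-buckets",
    "grafana": "--grafana-dashboards",
    "prometheus": "--prom-targets",
}

def safe_first_command(server_type: str) -> str:
    """Map a Type column value to a safe, non-mutating first command."""
    return SAFE_FIRST_COMMAND.get(server_type, "--info")
```

The point of the fallback is that `--info ALIAS` is always safe: it never connects and never prints credentials, so an unrecognized type degrades to discovery rather than to a guessed action.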
## Sources of truth for the integration

If the integration's behavior changes, check in this order:

1. [tools/ssh.py](/home/code/Desktop/CODING/server-manager/tools/ssh.py)
2. [tools/skill-ssh.md](/home/code/Desktop/CODING/server-manager/tools/skill-ssh.md)
3. [core/claude_setup.py](/home/code/Desktop/CODING/server-manager/core/claude_setup.py)
4. [build.py](/home/code/Desktop/CODING/server-manager/build.py)
5. [SKILL.md](/home/code/Desktop/CODING/server-manager/.codex/skills/server-manager/SKILL.md)
6. [command-matrix.md](/home/code/Desktop/CODING/server-manager/.codex/skills/server-manager/references/command-matrix.md)

If the semantics of `ssh.py` change, update both the Claude skill and the Codex skill.
## Edge Cases

### 1. No `python` alias

In this environment `python` is absent, but `ssh.py` has the `#!/usr/bin/env python3` shebang and the executable bit.

The wrapper therefore invokes `ssh.py` directly:

```bash
~/.server-connections/codex-ssh --list
```

This is deliberately more robust than depending on `python ~/.server-connections/ssh.py`.

### 2. `ssh.py --help` is not supported

`ssh.py` has no full `--help`. Calling `--help` returns the list of available aliases, not usage text.

The safe checks are therefore:

- `--list`
- `--info ALIAS`
- `--status`
### 3. Skill present in the repo but not installed globally

Having `.codex/skills/server-manager/` inside the repository is useful as the source of truth, but a new Codex process looks for global skills in `~/.codex/skills` by default.

If the skill exists only in the repo:

- the in-project documentation stays in place
- the global Codex may not see it

For reliability, a global install into `~/.codex/skills/server-manager` is required.

### 4. Wrapper missing while the skill is installed

In this case Codex will read the skill but will not be able to run the recommended `$HOME/.server-connections/codex-ssh ...` command.

Check:

```bash
~/.codex/skills/server-manager/scripts/server-manager-doctor.sh
```

Fix:

```bash
install -m 755 .codex/skills/server-manager/scripts/codex-ssh-wrapper.sh ~/.server-connections/codex-ssh
```
### 5. Backend missing

If `~/.server-connections/ssh.py` or `encryption.py` is absent, the skill is useless: it knows the workflow but has no local transport layer.

Fix:

1. Launch ServerManager
2. `Setup -> Install Everything`

### 6. Interactive Codex started before the skill was installed

A new interactive session will normally see the skill; an old one may not.

Fix:

- close the old Codex session
- start a new `codex` process
### 7. `codex exec` cannot verify the skill due to sandbox/network policy

During a non-interactive check, Codex may run into the environment's network policy rather than a skill problem:

- the websocket backend is blocked
- the sandbox forbids the connection

Symptom:

```text
failed to connect to websocket ... Operation not permitted
```

This does not mean the skill is wrong. It means the check hit the runtime policy of the Codex backend.
### 8. Reinstalling over an existing skill

The GUI installer and `install_codex_skill()` sync the skill tree over the existing directory without fully deleting it. This is safe for normal updates.

A manual `cp -R`, however, can leave stale state if files were removed from the repo while the old global copy kept them.

A full manual resync is only needed if you deliberately want to clear out old files:

1. delete the old copy deliberately
2. copy the skill again in full

Example:

```bash
rm -rf ~/.codex/skills/server-manager
cp -R .codex/skills/server-manager ~/.codex/skills/server-manager
```

Do this only if you are sure you want to rebuild the global copy from scratch.
### 9. `tools/ssh.py` changed but the global install is stale

This is the most likely operational drift.

What can go stale:

- `~/.server-connections/ssh.py`
- `~/.codex/skills/server-manager/*`
- `~/.server-connections/codex-ssh`

After changing `tools/ssh.py` or the skill docs, resync:

```bash
cp tools/ssh.py ~/.server-connections/ssh.py
cp core/encryption.py ~/.server-connections/encryption.py
rm -rf ~/.codex/skills/server-manager
cp -R .codex/skills/server-manager ~/.codex/skills/server-manager
install -m 755 .codex/skills/server-manager/scripts/codex-ssh-wrapper.sh ~/.server-connections/codex-ssh
```
### 10. Windows / macOS / Linux split

The runtime path is intentionally split by platform:

- Linux/macOS: shell wrapper `codex-ssh-wrapper.sh`
- Windows: native wrapper `codex-ssh-wrapper.cmd`

On Windows the installer places the wrapper as:

- `~/.server-connections/codex-ssh.cmd`

On Linux/macOS the installer places it as:

- `~/.server-connections/codex-ssh`

What this covers:

- a Linux/macOS path without platform-specific branching in the skill
- launch via `cmd.exe`
- launch from PowerShell
- no bash dependency in the standard Windows deployment path

One limitation remains: in the current environment the end-to-end smoke test ran only on Linux. The macOS and Windows paths are prepared in the installer/docs but were not smoke-tested here, because matching runners were unavailable.
### 11. `ssh.py` must never read secrets into AI context

This is intentional, not a bug. Even if opening `servers.json` seems easier, it must not be done.

The skill deliberately forbids:

- `cat ~/.server-connections/servers.json`
- `cat ~/.server-connections/settings.json`
- `python -c "...read servers.json..."`
- `--list-full`

### 12. fail2ban / anti-bruteforce edge case

Repeated failed connections are dangerous. The skill is therefore pinned to:

- at most 1 attempt per action
- on timeout/error, stop and report to the user

This is a mandatory rule, not a recommendation.
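The one-attempt rule translates directly into code: run the action once, and on timeout or failure return a report instead of retrying. A sketch (a hypothetical helper, not part of `ssh.py`):

```python
import subprocess

def run_once(cmd, timeout_s: int = 30) -> str:
    """Execute one action exactly once; never retry against a fail2ban-protected host."""
    try:
        out = subprocess.run(cmd, capture_output=True, text=True, timeout=timeout_s)
    except subprocess.TimeoutExpired:
        return "timeout: stopping without retry (anti-bruteforce rule), reporting to user"
    if out.returncode != 0:
        return f"failed (rc={out.returncode}): stopping without retry, reporting to user"
    return out.stdout
```

The deliberate absence of any retry loop is the whole design: a second failed attempt is exactly what fail2ban counts.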
## Recommended Update Workflow

After any change touching the Codex integration:

1. Update the sources in the repo:
   - `tools/ssh.py`
   - `.codex/skills/server-manager/*`
   - this document
2. Resync the global install
3. Run the doctor script
4. Run `~/.server-connections/codex-ssh --list`
5. Run a fresh `codex exec` smoke test

## Minimal Smoke Test

```bash
~/.codex/skills/server-manager/scripts/server-manager-doctor.sh
~/.server-connections/codex-ssh --list
codex exec --skip-git-repo-check -s read-only -C /tmp \
  "A user asks: Using the locally installed ServerManager integration, what is the safest first command to enumerate configured servers? Reply with only the command."
```
## What is still worth automating later

To make the integration production-complete, it would be useful to add:

1. a dedicated smoke test script inside the repository for the Codex integration specifically
2. an e2e smoke test on a Windows runner
3. an e2e smoke test on a macOS runner
4. an optional PowerShell-native wrapper, if richer Windows logging is ever needed
52
GEMINI.md
Normal file
@@ -0,0 +1,52 @@
# Gemini Project Contract

This repository is **ServerManager** — a cross-platform desktop GUI for managing remote servers and services through one encrypted local inventory.

Use this file as the native Gemini contract for sessions started inside this repository.

## First Read

Read these files first when relevant:

- `README.md`
- `CLAUDE.md`
- `CHANGELOG.md`
- `core/claude_setup.py`
- `tools/ssh.py`

## Default Role

- Gemini is a secondary implementation and review helper, not the owner of the project state.
- Prefer minimal safe changes that preserve Claude and Codex integration behavior.
- When changing AI integration code, keep Claude `/ssh`, Codex `server-manager`, and Gemini `server-manager` behavior aligned.
## Project-Specific Rules

- Never read or print secrets from `~/.server-connections/servers.json`, `settings.json`, or `encryption.py`.
- Treat `tools/ssh.py` as the shared transport layer for Claude, Codex, and Gemini.
- Keep cross-platform behavior explicit for Linux, macOS, and Windows.
- Prefer shared installer logic in `core/claude_setup.py` over duplicated per-tool logic.
- If command semantics change in `tools/ssh.py`, update all relevant user-facing skill docs.

## Native Gemini Entry Points

- Project contract: `GEMINI.md`
- Gemini settings: `.gemini/settings.json`
- Workspace skill: `.gemini/skills/server-manager/`

## Safe Server Workflow

When the user asks to operate on a server already configured in ServerManager:

1. Use the installed ServerManager Gemini skill.
2. First enumerate aliases safely.
3. Determine the endpoint type before choosing a command.
4. Use the shared CLI wrapper, not raw credentials.

Preferred discovery commands:

```bash
$HOME/.server-connections/gemini-ssh --list
$HOME/.server-connections/gemini-ssh --info ALIAS
$HOME/.server-connections/gemini-ssh --status
```
86
GEMINI_SKILL_SETUP.md
Normal file
@@ -0,0 +1,86 @@
# Deploying the Gemini Skill for ServerManager

This document describes how ServerManager integrates with the Gemini CLI.

## What gets installed

For each target home, the following are installed:

1. Shared backend:
   - `~/.server-connections/ssh.py`
   - `~/.server-connections/encryption.py`
2. Gemini skill package:
   - `~/.gemini/skills/server-manager/`
3. Safe runtime wrapper:
   - `~/.server-connections/gemini-ssh`
4. Global Gemini context:
   - `~/.gemini/GEMINI.md`

## Skill workflow

Gemini should start discovery with:

```bash
$HOME/.server-connections/gemini-ssh --list
```

Then:

- determine the `Type`
- choose the command strictly by type
- perform exactly one action
- never expose IPs, logins, passwords, or ports
## Recommended installation

### Via the GUI

The `Setup` tab can now install:

- the Claude skill
- the Codex skill
- the Gemini skill
- the shared backend and wrappers

### Via the Python installer

```bash
python3 tools/install_ai_integrations.py
python3 tools/install_ai_integrations.py --target-home /root
python3 tools/install_ai_integrations.py --all-users
```

### Via the shell installer (Linux/macOS)

```bash
bash tools/install.sh --source-dir /path/to/server-manager
bash tools/install.sh --source-dir /path/to/server-manager --target-home /root
bash tools/install.sh --source-dir /path/to/server-manager --all-users
```
## Verification

### 1. Check skill discovery

```bash
gemini skills list
```

### 2. Check the wrapper

```bash
$HOME/.server-connections/gemini-ssh --list
```

### 3. Run the doctor script

```bash
$HOME/.gemini/skills/server-manager/scripts/server-manager-gemini-doctor.sh
```

## Important notes

- `servers.json` is not replicated automatically in `--all-users` mode; this is deliberate, so that credentials are not copied between users.
- For root / service accounts, run a separate install into the required target home.
- The Gemini skill source lives in `.gemini/skills/server-manager/` in the repository.
- An additional mirror can be installed into `~/.agents/skills/server-manager/`, but it is off by default so that Gemini does not complain about a duplicate skill conflict.
89
README.md
@@ -2,7 +2,7 @@
<p align="center">
  <strong>Desktop GUI for managing remote servers</strong><br>
  CustomTkinter + Paramiko | Dark Theme | Claude Code Integration
  CustomTkinter + Paramiko | Dark Theme | Claude Code + Codex + Gemini Integration
</p>

<p align="center">
@@ -22,7 +22,7 @@
- **SFTP Transfer** — upload/download files with progress bar
- **SSH Keys** — generate ed25519, install on server, copy to clipboard
- **Status Monitor** — background check every 60 sec (online/offline badges)
- **Claude Code Integration** — one-click setup, shared config with `/ssh` skill
- **Claude Code + Codex + Gemini Integration** — one-click setup, shared config with `/ssh`, Codex skill, and Gemini skill
- **TOTP / 2FA** — Google Authenticator compatible codes with live countdown, one-click copy
- **Encryption** — servers.json encrypted with Fernet (passwords never stored in plaintext)
- **Backups** — manual and automatic backups with one-click restore
@@ -62,23 +62,25 @@ Output goes to `releases/ServerManager-vX.Y.Z-{platform}.exe`
3. **Terminal** — select server → Terminal tab → type command → Run
4. **Files** — select server → Files tab → set paths → Upload/Download
5. **Keys** — Keys tab → Generate Key → Install on Server
6. **Setup** — Setup tab → "Install Everything" → Claude Code ready
6. **Setup** — Setup tab → "Install Everything" → Claude Code, Codex, and Gemini ready
7. Status badges update automatically (green = online, red = offline)

### Claude Code Integration
### Claude Code + Codex + Gemini Integration

ServerManager and Claude Code share the same config file: `~/.server-connections/servers.json`
ServerManager, Claude Code, and Codex share the same config file: `~/.server-connections/servers.json`

For Codex deployment and operational edge cases, see [`CODEX_SKILL_SETUP.md`](CODEX_SKILL_SETUP.md).

**How it works:**
```
ServerManager GUI ←→ ~/.server-connections/servers.json ←→ ssh.py (Claude Code)
ServerManager GUI ←→ ~/.server-connections/servers.json ←→ ssh.py backend
        ↕                                         ↕
 Add/edit/delete                             /ssh skill
 servers in GUI                           executes commands
 Add/edit/delete               Claude /ssh + Codex + Gemini skill
 servers in GUI                           execute commands
```

- Add a server in GUI → Claude Code sees it immediately via `/ssh list`
- Both use the same `ssh.py` + `servers.json`
- Add a server in GUI → Claude Code, Codex, and Gemini see it immediately
- Both agents use the same `ssh.py` + `servers.json`
- Passwords **never** pass through the AI API

**New SSH Commands:**
@@ -93,12 +95,17 @@ ServerManager GUI ←→ ~/.server-connections/servers.json ←→ ssh.py (C
**Setup on a new machine:**
1. Install ServerManager (clone repo or download binary)
2. Open Setup tab → click "Install Everything"
3. Done. Claude Code now has `/ssh` skill and access to your servers
3. Done. Claude Code now has `/ssh`, Codex now has the `server-manager` skill, and Gemini now has the `server-manager` skill with access to your servers

The Setup tab installs:
- `ssh.py` → `~/.server-connections/` (SSH utility)
- `encryption.py` → `~/.server-connections/` (encryption module for CLI)
- `/ssh` skill → `~/.claude/commands/ssh.md` (Claude Code skill)
- `server-manager` skill → `~/.codex/skills/server-manager/` (Codex skill package)
- `server-manager` skill → `~/.gemini/skills/server-manager/` (Gemini skill package)
- optional mirror → `~/.agents/skills/server-manager/` (off by default to avoid Gemini duplicate-skill warnings)
- `codex-ssh` wrapper → `~/.server-connections/` (safe Codex entry point)
- `gemini-ssh` wrapper → `~/.server-connections/` (safe Gemini entry point)
- SSH key (ed25519) — if not exists
- Checks for duplicates — safe to run multiple times
@@ -140,7 +147,7 @@ App executes: sudo -S -p '' bash -c 'systemctl restart nginx'
|
||||
- Passwords stored locally only, **never sent to any AI/API**
|
||||
- SSH keys (ed25519) — recommended auth method
|
||||
- sudo password sent via stdin (not visible in process list)
|
||||
- When used with Claude Code: only alias + command are passed through the AI API, passwords stay in the local encrypted file
|
||||
- When used with Claude Code or Codex: only alias + command are passed through the AI API, passwords stay in the local encrypted file
|
||||
- Automatic pre-encryption backup on first migration
|
||||
|
||||
### Project Structure
|
||||
@@ -154,7 +161,7 @@ ServerManager/
|
||||
│ ├── server_store.py # CRUD + encrypted JSON + observer + backups
|
||||
│ ├── encryption.py # Fernet encryption module
|
||||
│ ├── ssh_client.py # Paramiko SSH/SFTP wrapper
|
||||
│ ├── claude_setup.py # Claude Code integration installer
|
||||
│ ├── claude_setup.py # Claude Code + Codex + Gemini integration installer
|
||||
│ ├── status_checker.py # Background monitoring
|
||||
│ ├── totp.py # TOTP/2FA module (pyotp)
|
||||
│ ├── logger.py # Rotating file logger
|
||||
@@ -166,7 +173,7 @@ ServerManager/
|
||||
│ ├── tabs/ # Terminal, Files, Info, Keys, Setup
|
||||
│ └── widgets/ # StatusBadge
|
||||
├── tools/ # CLI tools (installed to ~/.server-connections/)
|
||||
│ ├── ssh.py # SSH utility for Claude Code
|
||||
│ ├── ssh.py # SSH utility for Claude Code / Codex
|
||||
│ └── skill-ssh.md # /ssh skill template
|
||||
├── config/ # Example configs
|
||||
├── releases/ # Built executables
|
||||
@@ -187,7 +194,7 @@ pip install -r requirements.txt
|
||||
python main.py
|
||||
# → Setup tab → Install Everything
|
||||
# → Add your servers via + Add
|
||||
# → Done! Both GUI and Claude Code are ready
|
||||
# → Done! GUI, Claude Code, and Codex are ready
|
||||
```
|
||||
|
||||
---
|
||||
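The shared-config flow above can be exercised from any local script the same way the agents do it. A minimal sketch, assuming only the documented `--list` flag of the shared `ssh.py`; the helper name `list_servers` is hypothetical:

```python
import os
import subprocess

# Path the Setup tab installs ssh.py to, per the README above.
SSH_PY = os.path.expanduser("~/.server-connections/ssh.py")


def list_servers() -> str:
    """Return the alias/type/notes inventory, or "" when ssh.py is absent.

    Only the inventory table comes back on stdout; credentials stay inside
    ssh.py's encrypted store and never reach the caller.
    """
    if not os.path.exists(SSH_PY):
        return ""
    result = subprocess.run(
        ["python3", SSH_PY, "--list"],
        capture_output=True, text=True, timeout=30,
    )
    return result.stdout


if __name__ == "__main__":
    print(list_servers() or "ssh.py not installed yet")
```

This mirrors the security model described above: only the alias table crosses the process boundary.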
@@ -201,7 +208,7 @@ python main.py
 - **SFTP** — file upload and download with a progress bar
 - **SSH keys** — generate ed25519, install on the server, copy
 - **Monitoring** — background check every 60 s (online/offline badges)
-- **Claude Code integration** — one-click install, shared config with the `/ssh` skill
+- **Claude Code + Codex + Gemini integration** — one-click install, shared config with the `/ssh` skill, Codex skill, and Gemini skill
 - **TOTP / 2FA** — Google Authenticator codes with a countdown, one-click copy
 - **Encryption** — servers.json is encrypted with Fernet (passwords are not stored in plain text)
 - **Backups** — manual and automatic, with one-click restore
@@ -241,23 +248,23 @@ python build.py
 3. **Terminal** — select a server → Terminal tab → enter a command → Run
 4. **Files** — select a server → Files tab → set paths → Upload/Download
 5. **Keys** — Keys tab → Generate Key → Install on Server
-6. **Claude setup** — Setup tab → "Install Everything" → Claude Code is ready
+6. **Setup** — Setup tab → "Install Everything" → Claude Code, Codex, and Gemini are ready
 7. Status badges update automatically (green = online, red = offline)

-### Claude Code Integration
+### Claude Code + Codex + Gemini Integration

-ServerManager and Claude Code use **the same config file**: `~/.server-connections/servers.json`
+ServerManager, Claude Code, Codex, and Gemini use **the same config file**: `~/.server-connections/servers.json`

 **How it works:**
 ```
-ServerManager GUI ←→ ~/.server-connections/servers.json ←→ ssh.py (Claude Code)
+ServerManager GUI ←→ ~/.server-connections/servers.json ←→ backend ssh.py
         ↕                              ↕
-  Add/edit a                    /ssh skill
-  server in GUI                 executes commands
+  Add/edit a                    Claude /ssh + Codex + Gemini skill
+  server in GUI                 execute commands
 ```

-- Add a server in GUI → Claude Code sees it immediately via `/ssh list`
-- Both use the same `ssh.py` + `servers.json`
+- Add a server in GUI → Claude Code, Codex, and Gemini see it immediately
+- Both agents use the same `ssh.py` + `servers.json`
 - Passwords **never** pass through the AI API

 **New SSH commands:**

@@ -272,12 +279,16 @@ ServerManager GUI ←→ ~/.server-connections/servers.json ←→ ssh.py (Claude Code)
 **Setup on a new machine:**
 1. Install ServerManager (clone the repo or download the binary)
 2. Open the Setup tab → click "Install Everything"
-3. Done. Claude Code now has the `/ssh` skill and access to the servers
+3. Done. Claude Code gets the `/ssh` skill, while Codex and Gemini get the `server-manager` skill and access to the servers

 The Setup tab installs:
 - `ssh.py` → `~/.server-connections/` (SSH utility)
 - `encryption.py` → `~/.server-connections/` (encryption module for the CLI)
 - `/ssh` skill → `~/.claude/commands/ssh.md` (Claude Code skill)
+- `server-manager` skill → `~/.codex/skills/server-manager/` (Codex skill)
+- `server-manager` skill → `~/.gemini/skills/server-manager/` (Gemini skill)
+- `codex-ssh` wrapper → `~/.server-connections/` (safe entry point for Codex)
+- `gemini-ssh` wrapper → `~/.server-connections/` (safe entry point for Gemini)
 - SSH key (ed25519) — if not created yet
 - Checks for duplicates — safe to run repeatedly

@@ -319,7 +330,7 @@ ServerManager GUI ←→ ~/.server-connections/servers.json ←→ ssh.py (Claude Code)
 - Passwords are stored locally only, **never sent to any AI/API**
 - SSH keys (ed25519) — the recommended authentication method
 - The sudo password is passed via stdin (not visible in the process list)
-- When used with Claude Code: only the alias + command pass through the AI API, passwords stay in the encrypted local file
+- When used with Claude Code or Codex: only the alias + command pass through the AI API, passwords stay in the encrypted local file
 - Automatic pre-encryption backup on first migration

 ### Deploying on a New Machine

@@ -336,7 +347,7 @@ pip install -r requirements.txt
 python main.py
 # → Setup tab → Install Everything
 # → Add servers via + Add
-# → Done! GUI and Claude Code work from one config
+# → Done! GUI, Claude Code, Codex, and Gemini work from one config
 ```

 ---
@@ -350,7 +361,7 @@ python main.py
 - **SFTP transfer** — file upload/download with a progress bar
 - **SSH keys** — generate ed25519, install on the server, copy to clipboard
 - **Status monitoring** — background check every 60 s (online/offline badges)
-- **Claude Code integration** — one-click setup, shared config with the `/ssh` skill
+- **Claude Code + Codex + Gemini integration** — one-click setup, shared config with the `/ssh` skill, Codex skill, and Gemini skill
 - **TOTP / 2FA** — Google Authenticator-compatible codes, live countdown, one-click copy
 - **Encryption** — servers.json is encrypted with Fernet (passwords are no longer stored in plain text)
 - **Backups** — manual and automatic, one-click restore
@@ -390,34 +401,36 @@ python build.py
 3. **Terminal** — select a server → Terminal tab → enter a command → Run
 4. **Files** — select a server → Files tab → set paths → Upload/Download
 5. **Keys** — Keys tab → Generate Key → Install on Server
-6. **Claude setup** — Setup tab → "Install Everything" → Claude Code is ready
+6. **Setup** — Setup tab → "Install Everything" → Claude Code and Codex are ready
 7. Status badges update automatically (green = online, red = offline)

-### Claude Code Integration
+### Claude Code + Codex + Gemini Integration

-ServerManager and Claude Code share **the same config file**: `~/.server-connections/servers.json`
+ServerManager, Claude Code, and Codex share **the same config file**: `~/.server-connections/servers.json`

 **How it works:**
 ```
-ServerManager GUI ←→ ~/.server-connections/servers.json ←→ ssh.py (Claude Code)
+ServerManager GUI ←→ ~/.server-connections/servers.json ←→ ssh.py backend
         ↕                              ↕
-  Add/edit                      /ssh skill
+  Add/edit                      Claude /ssh + Codex + Gemini skill
   servers in GUI                executes commands
 ```

-- Add a server in the GUI → Claude Code sees it immediately via `/ssh list`
-- Both use the same `ssh.py` + `servers.json`
+- Add a server in the GUI → both Claude Code and Codex see it immediately
+- Claude Code, Codex, and Gemini all use the same `ssh.py` + `servers.json`
 - Passwords are **never** passed through the AI API

 **Setup on a new machine:**
 1. Install ServerManager (clone the repo or download the binary)
 2. Open the Setup tab → click "Install Everything"
-3. Done. Claude Code now has the `/ssh` skill and can access your servers
+3. Done. Claude Code now has the `/ssh` skill, and Codex now has the `server-manager` skill, with access to your servers

 The Setup tab installs:
 - `ssh.py` → `~/.server-connections/` (SSH utility)
 - `encryption.py` → `~/.server-connections/` (CLI encryption module)
 - `/ssh` skill → `~/.claude/commands/ssh.md` (Claude Code skill)
+- `server-manager` skill → `~/.codex/skills/server-manager/` (Codex skill package)
+- `codex-ssh` wrapper → `~/.server-connections/` (safe Codex entry point)
 - SSH key (ed25519) — if it does not exist
 - Checks for duplicates — safe to run repeatedly

@@ -459,7 +472,7 @@ The Setup tab installs:
 - Passwords are stored locally only, **never sent to any AI/API**
 - SSH keys (ed25519) — the recommended authentication method
 - The sudo password is passed via stdin (not visible in the process list)
-- When used with Claude Code: only the alias and command pass through the AI API, passwords stay in the encrypted local file
+- When used with Claude Code or Codex: only the alias and command pass through the AI API, passwords stay in the encrypted local file
 - Automatic pre-encryption backup on first migration

 ### Deploying on a New Machine

@@ -476,7 +489,7 @@ pip install -r requirements.txt
 python main.py
 # → Setup tab → Install Everything
 # → Add servers via + Add
-# → Done! GUI and Claude Code share one config
+# → Done! GUI, Claude Code, and Codex share one config
 ```

 ---
build.py: 50 changed lines
@@ -114,7 +114,12 @@ def build():
         "--add-data", f"config/servers.example.json{os.pathsep}config",
         "--add-data", f"tools/ssh.py{os.pathsep}tools",
         "--add-data", f"tools/skill-ssh.md{os.pathsep}tools",
+        "--add-data", f"tools/install_ai_integrations.py{os.pathsep}tools",
         "--add-data", f"core/encryption.py{os.pathsep}core",
+        "--add-data", f".codex/skills/server-manager{os.pathsep}.codex/skills/server-manager",
+        "--add-data", f".gemini/skills/server-manager{os.pathsep}.gemini/skills/server-manager",
+        "--add-data", f".gemini/settings.json{os.pathsep}.gemini",
+        "--add-data", f"GEMINI.md{os.pathsep}.",
     ]

     # PNG icons for GUI (Material Design)
@@ -133,6 +138,7 @@ def build():
     cmd_parts.extend([
         "--hidden-import", "customtkinter",
         "--hidden-import", "PIL",
         "--hidden-import", "PIL._tkinter_finder",
         "--hidden-import", "pyotp",
         "--hidden-import", "pyte",
         "--hidden-import", "psutil",
@@ -349,34 +355,34 @@ def cleanup_old_releases():


 def deploy_shared_files():
-    """Auto-deploy ssh.py, encryption.py, skill to shared dirs after build.
-
-    Ensures Claude Code /ssh skill always uses the latest version.
-    Without this, editing tools/ssh.py updates the exe but NOT the live
-    ~/.server-connections/ssh.py that Claude Code actually calls.
-    """
-    shared_dir = os.path.expanduser("~/.server-connections")
-    skill_dir = os.path.expanduser("~/.claude/commands")
-
-    deploy_map = [
-        (os.path.join(PROJECT_DIR, "tools", "ssh.py"),
-         os.path.join(shared_dir, "ssh.py")),
-        (os.path.join(PROJECT_DIR, "core", "encryption.py"),
-         os.path.join(shared_dir, "encryption.py")),
-        (os.path.join(PROJECT_DIR, "tools", "skill-ssh.md"),
-         os.path.join(skill_dir, "ssh.md")),
-    ]
-
-    deployed = []
-    for src, dst in deploy_map:
-        if not os.path.exists(src):
-            continue
-        os.makedirs(os.path.dirname(dst), exist_ok=True)
-        shutil.copy2(src, dst)
-        deployed.append(os.path.basename(dst))
+    """Auto-deploy shared CLI files and local agent integrations after build."""
+    from core.claude_setup import (
+        install_claude_skill,
+        install_codex_skill,
+        install_gemini_skill,
+        install_ssh_script,
+    )
+
+    deploy_steps = [
+        install_ssh_script,
+        install_claude_skill,
+        install_codex_skill,
+        install_gemini_skill,
+    ]
+
+    deployed = []
+    for step in deploy_steps:
+        try:
+            result = step()
+            if result:
+                deployed.append(result.replace("\n", "; "))
+        except Exception as exc:
+            print(f"WARNING: auto-deploy step failed ({step.__name__}): {exc}")

     if deployed:
-        print(f"Auto-deployed to local: {', '.join(deployed)}")
+        print("Auto-deployed to local:")
+        for item in deployed:
+            print(f"- {item}")


 if __name__ == "__main__":
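The new `deploy_steps` loop above trades a copy map for a list of installer callables: each step returns a status string and a failing step is logged and skipped rather than aborting the build. The pattern can be exercised in isolation; a minimal sketch with hypothetical step names:

```python
from typing import Callable, List


def run_deploy_steps(steps: List[Callable[[], str]]) -> List[str]:
    """Run each installer, collect its status line, and keep going on failure."""
    deployed = []
    for step in steps:
        try:
            result = step()
            if result:
                # Installers may return multi-line reports; flatten for one summary line.
                deployed.append(result.replace("\n", "; "))
        except Exception as exc:
            print(f"WARNING: auto-deploy step failed ({step.__name__}): {exc}")
    return deployed


def ok_step() -> str:
    return "ssh.py installed\nencryption.py installed"


def failing_step() -> str:
    raise RuntimeError("source missing")


print(run_deploy_steps([ok_step, failing_step]))
# → ['ssh.py installed; encryption.py installed']
```

The design choice matters for packaged builds: one broken integration (say, a missing Gemini skill source) no longer blocks deployment of the others.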
core/claude_setup.py

@@ -1,15 +1,16 @@
 """
-Claude Code integration setup.
-Installs ssh.py, encryption.py, /ssh skill, SSH key — everything needed
-for Claude Code to manage servers via the shared servers.json.
+Local AI agent integration setup.
+Installs the shared ssh.py/encryption.py backend, Claude /ssh command,
+Codex/Gemini skill packages, platform-specific wrappers, and SSH key material.
 """

 import os
-import sys
 import re
 import shutil
-from core.logger import log
+import subprocess
+import sys

-SHARED_DIR = os.path.expanduser("~/.server-connections")
+from core.logger import log

 # PyInstaller: bundled data is in sys._MEIPASS; otherwise use project dir
 if getattr(sys, "frozen", False) and hasattr(sys, "_MEIPASS"):
@@ -19,54 +20,276 @@ else:

 SSH_SCRIPT_SRC = os.path.join(_BASE_DIR, "tools", "ssh.py")
 ENCRYPTION_SRC = os.path.join(_BASE_DIR, "core", "encryption.py")
-SKILL_SRC = os.path.join(_BASE_DIR, "tools", "skill-ssh.md")
+CLAUDE_SKILL_SRC = os.path.join(_BASE_DIR, "tools", "skill-ssh.md")
+GEMINI_CONTRACT_SRC = os.path.join(_BASE_DIR, "GEMINI.md")

-SKILL_DST_DIR = os.path.expanduser("~/.claude/commands")
-SKILL_DST = os.path.join(SKILL_DST_DIR, "ssh.md")
-SSH_KEY_PATH = os.path.expanduser("~/.ssh/id_ed25519")
-GLOBAL_CLAUDE_MD = os.path.expanduser("~/.claude/CLAUDE.md")
+CODEX_SKILL_SRC_DIR = os.path.join(_BASE_DIR, ".codex", "skills", "server-manager")
+CODEX_WRAPPER_SRC_SH = os.path.join(CODEX_SKILL_SRC_DIR, "scripts", "codex-ssh-wrapper.sh")
+CODEX_WRAPPER_SRC_CMD = os.path.join(CODEX_SKILL_SRC_DIR, "scripts", "codex-ssh-wrapper.cmd")
+GEMINI_SKILL_SRC_DIR = os.path.join(_BASE_DIR, ".gemini", "skills", "server-manager")
+GEMINI_WRAPPER_SRC_SH = os.path.join(GEMINI_SKILL_SRC_DIR, "scripts", "gemini-ssh-wrapper.sh")
+GEMINI_WRAPPER_SRC_CMD = os.path.join(GEMINI_SKILL_SRC_DIR, "scripts", "gemini-ssh-wrapper.cmd")

 _BLOCK_START = "<!-- server-manager:start -->"
 _BLOCK_END = "<!-- server-manager:end -->"
+_GEMINI_BLOCK_START = "<!-- server-manager-gemini:start -->"
+_GEMINI_BLOCK_END = "<!-- server-manager-gemini:end -->"

 GLOBAL_CLAUDE_MD_BLOCK = f"""{_BLOCK_START}
-## Server Manager — server management
+## Servers — ONLY via /ssh

-**ALWAYS** use the server manager to connect to servers. Never use `ssh`, `sshpass`, or direct connections.
-All server operations go **ONLY through the `/ssh` skill** or directly through `ssh.py`:
-
-- Skill: `/ssh ALIAS "command"` — run a command on a server
-- Server list: `python3 ~/.server-connections/ssh.py --list`
-- Documentation: `~/.claude/commands/ssh.md`
-- Memory bank: project `global-infrastructure` → `techContext.md`
-- Infrastructure: https://git.sensey24.ru/aibot777/infrastructure-docs
-
-**Forbidden:** using `ssh`, `sshpass`, reading `~/.server-connections/` directly, revealing IPs/passwords/ports.
+**NEVER use raw `ssh` commands.** NEVER read `~/.ssh/config` to look up servers.
+
+```bash
+python ~/.server-connections/ssh.py --list        # server list (alias, type, notes)
+python ~/.server-connections/ssh.py --info ALIAS  # info (no creds)
+python ~/.server-connections/ssh.py --status      # online/offline
+```
+
+When asked about a server — run `--list` FIRST, find the right alias by its notes, and CHECK THE TYPE.
+The `ssh.py` script reads credentials from the encrypted store itself. Claude does NOT see IPs, logins, or passwords.
+
+### CRITICAL — commands depend on the server type
+
+**`ALIAS "command"` (shell) — ONLY for types `ssh` and `telnet`!**
+
+| Type | Commands |
+|------|----------|
+| `ssh`/`telnet` | `ALIAS "cmd"`, `--upload ALIAS local remote`, `--download ALIAS remote local` |
+| `s3` (MinIO etc.) | `--s3-buckets ALIAS`, `--s3-ls ALIAS bucket/prefix`, `--s3-upload ALIAS local bucket/key`, `--s3-download ALIAS bucket/key local`, `--s3-delete ALIAS bucket/key`, `--s3-url ALIAS bucket/key [SEC]`, `--s3-create-bucket ALIAS name` |
+| `mariadb`/`mssql`/`postgresql` | `--sql ALIAS "SELECT ..."`, `--sql-databases ALIAS`, `--sql-tables ALIAS [db]` |
+| `redis` | `--redis ALIAS "GET key"`, `--redis-info ALIAS`, `--redis-keys ALIAS "pattern"` |
+| `grafana` | `--grafana-dashboards ALIAS`, `--grafana-alerts ALIAS` |
+| `prometheus` | `--prom-query ALIAS "up"`, `--prom-targets ALIAS`, `--prom-alerts ALIAS` |
+| `winrm` | `--ps ALIAS "Get-Process"`, `--cmd ALIAS "dir"` |
+
+**Format: `python ~/.server-connections/ssh.py COMMAND ALIAS ARGUMENTS`** — the alias ALWAYS comes second, right after the command.
+
+**S3 rule:** before `--s3-upload/download/delete` — run `--s3-buckets ALIAS` and `--s3-ls ALIAS bucket/` FIRST to learn the real buckets and paths. Do NOT GUESS bucket names!
+
+**Forbidden:** using `ssh`/`sshpass`, reading `~/.server-connections/` directly, revealing IPs/passwords/ports.
 {_BLOCK_END}
 """

+GLOBAL_GEMINI_MD_BLOCK = f"""{_GEMINI_BLOCK_START}
+## ServerManager — use the installed skill
+
+When a user asks about a server managed by ServerManager, use the installed `server-manager` skill first.
+
+Preferred discovery commands:
+
+```bash
+$HOME/.server-connections/gemini-ssh --list
+$HOME/.server-connections/gemini-ssh --info ALIAS
+$HOME/.server-connections/gemini-ssh --status
+```
+
+Rules:
+
+- Never read `~/.server-connections/servers.json`, `settings.json`, or `encryption.py` directly.
+- Never use `--list-full`.
+- Never use raw `ssh`, `scp`, `redis-cli`, `psql`, `mysql`, `mc`, or cloud CLIs unless the user explicitly asks to bypass ServerManager.
+- Choose commands strictly by the endpoint type reported by `--list`.
+- Use exactly one connection attempt per action and stop on timeout/failure.
+{_GEMINI_BLOCK_END}
+"""
+
+
+def _target_home() -> str:
+    override = os.environ.get("SERVER_MANAGER_TARGET_HOME", "").strip()
+    if override:
+        return os.path.abspath(os.path.expanduser(override))
+    return os.path.expanduser("~")
+
+
+def _shared_dir() -> str:
+    return os.path.join(_target_home(), ".server-connections")
+
+
+def _gemini_dir() -> str:
+    return os.path.join(_target_home(), ".gemini")
+
+
+def _claude_skill_dst_dir() -> str:
+    return os.path.join(_target_home(), ".claude", "commands")
+
+
+def _claude_skill_dst() -> str:
+    return os.path.join(_claude_skill_dst_dir(), "ssh.md")
+
+
+def _ssh_key_path() -> str:
+    return os.path.join(_target_home(), ".ssh", "id_ed25519")
+
+
+def _global_claude_md() -> str:
+    return os.path.join(_target_home(), ".claude", "CLAUDE.md")
+
+
+def _global_gemini_md() -> str:
+    return os.path.join(_gemini_dir(), "GEMINI.md")
+
+
+def _codex_skill_dst_root() -> str:
+    return os.path.join(_target_home(), ".codex", "skills")
+
+
+def _codex_skill_dst_dir() -> str:
+    return os.path.join(_codex_skill_dst_root(), "server-manager")
+
+
+def _codex_skill_entry() -> str:
+    return os.path.join(_codex_skill_dst_dir(), "SKILL.md")
+
+
+def _codex_wrapper_dst() -> str:
+    return os.path.join(
+        _shared_dir(),
+        "codex-ssh.cmd" if sys.platform == "win32" else "codex-ssh",
+    )
+
+
+def _gemini_skill_dst_root() -> str:
+    return os.path.join(_gemini_dir(), "skills")
+
+
+def _gemini_skill_dst_dir() -> str:
+    return os.path.join(_gemini_skill_dst_root(), "server-manager")
+
+
+def _gemini_skill_entry() -> str:
+    return os.path.join(_gemini_skill_dst_dir(), "SKILL.md")
+
+
+def _agents_skill_dst_root() -> str:
+    return os.path.join(_target_home(), ".agents", "skills")
+
+
+def _agents_skill_dst_dir() -> str:
+    return os.path.join(_agents_skill_dst_root(), "server-manager")
+
+
+def _gemini_wrapper_dst() -> str:
+    return os.path.join(
+        _shared_dir(),
+        "gemini-ssh.cmd" if sys.platform == "win32" else "gemini-ssh",
+    )
+
+
+def _ensure_executable(path: str):
+    if sys.platform == "win32" or not os.path.exists(path):
+        return
+    mode = os.stat(path).st_mode
+    os.chmod(path, mode | 0o755)
+
+
+def _copy_file(src: str, dst: str, executable: bool = False) -> str:
+    os.makedirs(os.path.dirname(dst), exist_ok=True)
+    shutil.copy2(src, dst)
+    if executable:
+        _ensure_executable(dst)
+    return dst
+
+
+def _copy_tree(src: str, dst: str) -> str:
+    os.makedirs(dst, exist_ok=True)
+    shutil.copytree(src, dst, dirs_exist_ok=True)
+    return dst
+
+
+def _install_wrapper(src: str, dst: str) -> str:
+    return _copy_file(src, dst, executable=(sys.platform != "win32"))
+
+
+def _skill_script_names() -> list[str]:
+    if sys.platform == "win32":
+        return [
+            os.path.join("scripts", "server-manager-doctor.cmd"),
+            os.path.join("scripts", "codex-ssh-wrapper.cmd"),
+            os.path.join("scripts", "server-manager-gemini-doctor.cmd"),
+            os.path.join("scripts", "gemini-ssh-wrapper.cmd"),
+        ]
+    return [
+        os.path.join("scripts", "server-manager-doctor.sh"),
+        os.path.join("scripts", "codex-ssh-wrapper.sh"),
+        os.path.join("scripts", "server-manager-gemini-doctor.sh"),
+        os.path.join("scripts", "gemini-ssh-wrapper.sh"),
+    ]
+
+
+def _ensure_skill_scripts(skill_dir: str):
+    for rel_path in _skill_script_names():
+        _ensure_executable(os.path.join(skill_dir, rel_path))
+
+
+def _iter_all_user_homes() -> list[str]:
+    homes: list[str] = []
+
+    def add(path: str):
+        expanded = os.path.abspath(os.path.expanduser(path))
+        if os.path.isdir(expanded) and expanded not in homes:
+            homes.append(expanded)
+
+    add(_target_home())
+
+    if sys.platform == "win32":
+        users_root = os.path.join(os.environ.get("SystemDrive", "C:"), "Users")
+        skip = {"public", "default", "default user", "all users"}
+        if os.path.isdir(users_root):
+            for name in sorted(os.listdir(users_root)):
+                if name.lower() in skip:
+                    continue
+                add(os.path.join(users_root, name))
+    elif sys.platform == "darwin":
+        add("/var/root")
+        users_root = "/Users"
+        if os.path.isdir(users_root):
+            for name in sorted(os.listdir(users_root)):
+                if name.startswith("."):
+                    continue
+                add(os.path.join(users_root, name))
+    else:
+        add("/root")
+        users_root = "/home"
+        if os.path.isdir(users_root):
+            for name in sorted(os.listdir(users_root)):
+                if name.startswith("."):
+                    continue
+                add(os.path.join(users_root, name))
+
+    return homes
+
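The new `_target_home()` indirection above lets every destination path be redirected at once via `SERVER_MANAGER_TARGET_HOME`, which is what makes installs into another user's home (or a test sandbox) possible. A minimal standalone sketch of that resolution, taking the environment as a parameter instead of reading `os.environ` so it is easy to test:

```python
import os


def target_home(env: dict) -> str:
    """Mirror _target_home(): honor SERVER_MANAGER_TARGET_HOME, else the real home."""
    override = env.get("SERVER_MANAGER_TARGET_HOME", "").strip()
    if override:
        return os.path.abspath(os.path.expanduser(override))
    return os.path.expanduser("~")


def shared_dir(env: dict) -> str:
    # Every destination helper derives from target_home(), so one override
    # moves the whole install tree.
    return os.path.join(target_home(env), ".server-connections")


print(shared_dir({"SERVER_MANAGER_TARGET_HOME": "/tmp/otheruser"}))
```

Passing `env` explicitly is a sketch-level convenience; the real module reads `os.environ` directly.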
 def check_status() -> dict:
     """Check what's installed and what's missing."""
+    shared_dir = _shared_dir()
+    ssh_key_path = _ssh_key_path()
     return {
-        "shared_dir": os.path.exists(SHARED_DIR),
-        "servers_json": os.path.exists(os.path.join(SHARED_DIR, "servers.json")),
-        "ssh_script": os.path.exists(os.path.join(SHARED_DIR, "ssh.py")),
-        "encryption": os.path.exists(os.path.join(SHARED_DIR, "encryption.py")),
-        "skill_installed": os.path.exists(SKILL_DST),
-        "ssh_key_exists": os.path.exists(SSH_KEY_PATH),
-        "ssh_key_pub": os.path.exists(SSH_KEY_PATH + ".pub"),
+        "target_home": _target_home(),
+        "shared_dir": os.path.exists(shared_dir),
+        "servers_json": os.path.exists(os.path.join(shared_dir, "servers.json")),
+        "ssh_script": os.path.exists(os.path.join(shared_dir, "ssh.py")),
+        "encryption": os.path.exists(os.path.join(shared_dir, "encryption.py")),
+        "claude_skill_installed": os.path.exists(_claude_skill_dst()),
+        "codex_skill_installed": os.path.exists(_codex_skill_entry()),
+        "codex_wrapper_installed": os.path.exists(_codex_wrapper_dst()),
+        "gemini_skill_installed": os.path.exists(_gemini_skill_entry()),
+        "gemini_wrapper_installed": os.path.exists(_gemini_wrapper_dst()),
+        "ssh_key_exists": os.path.exists(ssh_key_path),
+        "ssh_key_pub": os.path.exists(ssh_key_path + ".pub"),
     }


 def install_ssh_script() -> str:
     """Copy ssh.py and encryption.py to shared dir."""
-    os.makedirs(SHARED_DIR, exist_ok=True)
+    shared_dir = _shared_dir()
+    os.makedirs(shared_dir, exist_ok=True)
     results = []

     # Copy ssh.py
-    dst = os.path.join(SHARED_DIR, "ssh.py")
+    dst = os.path.join(shared_dir, "ssh.py")
     if os.path.exists(SSH_SCRIPT_SRC):
-        shutil.copy2(SSH_SCRIPT_SRC, dst)
+        _copy_file(SSH_SCRIPT_SRC, dst, executable=True)
         log.info(f"ssh.py installed: {dst}")
         results.append(f"ssh.py installed: {dst}")
     elif os.path.exists(dst):
@@ -74,10 +297,9 @@ def install_ssh_script() -> str:
     else:
         results.append("ERROR: ssh.py source not found")

     # Copy encryption.py
-    enc_dst = os.path.join(SHARED_DIR, "encryption.py")
+    enc_dst = os.path.join(shared_dir, "encryption.py")
     if os.path.exists(ENCRYPTION_SRC):
-        shutil.copy2(ENCRYPTION_SRC, enc_dst)
+        _copy_file(ENCRYPTION_SRC, enc_dst)
         log.info(f"encryption.py installed: {enc_dst}")
         results.append(f"encryption.py installed: {enc_dst}")
     elif os.path.exists(enc_dst):
@@ -88,40 +310,125 @@ def install_ssh_script() -> str:
     return "\n".join(results)
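`check_status()` now returns a flat dict mixing booleans (installed/missing flags) and strings (the resolved target home), which a Setup tab or doctor script can render directly. A minimal sketch of such a renderer; `format_status` is a hypothetical name, not part of the module:

```python
def format_status(status: dict) -> str:
    """Render a check_status()-style dict as an OK/MISSING checklist."""
    lines = []
    for key, value in status.items():
        if isinstance(value, bool):
            # Boolean entries are presence checks.
            lines.append(f"[{'x' if value else ' '}] {key}")
        else:
            # Non-boolean entries (like target_home) are informational.
            lines.append(f"    {key}: {value}")
    return "\n".join(lines)


sample = {
    "target_home": "/home/user",
    "ssh_script": True,
    "codex_skill_installed": False,
}
print(format_status(sample))
```

Keeping the status surface a plain dict means any front end (GUI, doctor script, CI check) can consume it without importing GUI code.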
def install_skill() -> str:
|
||||
def install_claude_skill() -> str:
|
||||
"""Install /ssh skill for Claude Code."""
|
||||
os.makedirs(SKILL_DST_DIR, exist_ok=True)
|
||||
if os.path.exists(SKILL_SRC):
|
||||
shutil.copy2(SKILL_SRC, SKILL_DST)
|
||||
log.info(f"Skill installed: {SKILL_DST}")
|
||||
return f"Skill installed: {SKILL_DST}"
|
||||
# Fallback: check existing
|
||||
if os.path.exists(SKILL_DST):
|
||||
return f"Skill already exists: {SKILL_DST}"
|
||||
# Generate minimal skill
|
||||
claude_skill_dst_dir = _claude_skill_dst_dir()
|
||||
claude_skill_dst = _claude_skill_dst()
|
||||
os.makedirs(claude_skill_dst_dir, exist_ok=True)
|
||||
if os.path.exists(CLAUDE_SKILL_SRC):
|
||||
_copy_file(CLAUDE_SKILL_SRC, claude_skill_dst)
|
||||
log.info(f"Claude skill installed: {claude_skill_dst}")
|
||||
return f"Claude skill installed: {claude_skill_dst}"
|
||||
if os.path.exists(claude_skill_dst):
|
||||
return f"Claude skill already exists: {claude_skill_dst}"
|
||||
|
||||
skill_content = _generate_skill_content()
|
||||
with open(SKILL_DST, "w", encoding="utf-8") as f:
|
||||
with open(claude_skill_dst, "w", encoding="utf-8") as f:
|
||||
f.write(skill_content)
|
||||
log.info(f"Skill generated: {SKILL_DST}")
|
||||
return f"Skill generated: {SKILL_DST}"
|
||||
log.info(f"Claude skill generated: {claude_skill_dst}")
|
||||
return f"Claude skill generated: {claude_skill_dst}"
|
||||
|
||||
|
||||
def install_codex_skill() -> str:
|
||||
"""Install ServerManager skill package for Codex and the local wrapper."""
|
||||
results = []
|
||||
codex_skill_dst_dir = _codex_skill_dst_dir()
|
||||
codex_skill_entry = _codex_skill_entry()
|
||||
codex_wrapper_dst = _codex_wrapper_dst()
|
||||
|
||||
if os.path.isdir(CODEX_SKILL_SRC_DIR):
|
||||
_copy_tree(CODEX_SKILL_SRC_DIR, codex_skill_dst_dir)
|
||||
_ensure_skill_scripts(codex_skill_dst_dir)
|
||||
log.info(f"Codex skill installed: {codex_skill_dst_dir}")
|
||||
results.append(f"Codex skill installed: {codex_skill_dst_dir}")
|
||||
elif os.path.exists(codex_skill_entry):
|
||||
results.append(f"Codex skill already exists: {codex_skill_dst_dir}")
|
||||
else:
|
||||
results.append("ERROR: Codex skill source not found")
|
||||
|
||||
wrapper_src = CODEX_WRAPPER_SRC_CMD if sys.platform == "win32" else CODEX_WRAPPER_SRC_SH
|
||||
if os.path.exists(wrapper_src):
|
||||
_copy_file(wrapper_src, codex_wrapper_dst, executable=(sys.platform != "win32"))
|
||||
log.info(f"Codex wrapper installed: {codex_wrapper_dst}")
|
||||
results.append(f"Codex wrapper installed: {codex_wrapper_dst}")
|
||||
elif os.path.exists(codex_wrapper_dst):
|
||||
results.append(f"Codex wrapper already exists: {codex_wrapper_dst}")
|
||||
else:
|
||||
results.append("ERROR: Codex wrapper source not found")
|
||||
|
||||
return "\n".join(results)
|
||||
|
||||
|
||||
def install_gemini_skill() -> str:
    """Install ServerManager skill package for Gemini."""
    results = []
    gemini_skill_dst_dir = _gemini_skill_dst_dir()
    gemini_skill_entry = _gemini_skill_entry()
    agents_skill_dst_dir = _agents_skill_dst_dir()
    gemini_wrapper_dst = _gemini_wrapper_dst()
    install_generic_mirror = os.environ.get(
        "SERVER_MANAGER_INSTALL_GENERIC_SKILL_MIRROR", ""
    ).strip() == "1"

    if os.path.isdir(GEMINI_SKILL_SRC_DIR):
        _copy_tree(GEMINI_SKILL_SRC_DIR, gemini_skill_dst_dir)
        _ensure_skill_scripts(gemini_skill_dst_dir)
        log.info(f"Gemini skill installed: {gemini_skill_dst_dir}")
        results.append(f"Gemini skill installed: {gemini_skill_dst_dir}")

        if install_generic_mirror:
            _copy_tree(GEMINI_SKILL_SRC_DIR, agents_skill_dst_dir)
            _ensure_skill_scripts(agents_skill_dst_dir)
            log.info(f"Generic agents skill mirror installed: {agents_skill_dst_dir}")
            results.append(f"Generic agents skill mirror installed: {agents_skill_dst_dir}")
        elif os.path.exists(agents_skill_dst_dir):
            shutil.rmtree(agents_skill_dst_dir, ignore_errors=True)
            log.info(f"Removed generic agents skill mirror to avoid Gemini conflicts: {agents_skill_dst_dir}")
            results.append(
                f"Removed generic agents skill mirror to avoid Gemini conflicts: {agents_skill_dst_dir}"
            )
    elif os.path.exists(gemini_skill_entry):
        results.append(f"Gemini skill already exists: {gemini_skill_dst_dir}")
    else:
        results.append("ERROR: Gemini skill source not found")

    wrapper_src = GEMINI_WRAPPER_SRC_CMD if sys.platform == "win32" else GEMINI_WRAPPER_SRC_SH
    if not os.path.exists(wrapper_src):
        wrapper_src = CODEX_WRAPPER_SRC_CMD if sys.platform == "win32" else CODEX_WRAPPER_SRC_SH

    if os.path.exists(wrapper_src):
        _install_wrapper(wrapper_src, gemini_wrapper_dst)
        log.info(f"Gemini wrapper installed: {gemini_wrapper_dst}")
        results.append(f"Gemini wrapper installed: {gemini_wrapper_dst}")
    elif os.path.exists(gemini_wrapper_dst):
        results.append(f"Gemini wrapper already exists: {gemini_wrapper_dst}")
    else:
        results.append("ERROR: Gemini wrapper source not found")

    return "\n".join(results)


def install_skill() -> str:
    """Backward-compatible alias for the Claude /ssh skill installer."""
    return install_claude_skill()

def generate_ssh_key() -> str:
    """Generate ed25519 SSH key if not exists."""
-    if os.path.exists(SSH_KEY_PATH):
-        return f"Key already exists: {SSH_KEY_PATH}"
+    ssh_key_path = _ssh_key_path()
+    if os.path.exists(ssh_key_path):
+        return f"Key already exists: {ssh_key_path}"

-    os.makedirs(os.path.dirname(SSH_KEY_PATH), exist_ok=True)
+    os.makedirs(os.path.dirname(ssh_key_path), exist_ok=True)

    import subprocess
    try:
        subprocess.run(
-            ["ssh-keygen", "-t", "ed25519", "-f", SSH_KEY_PATH,
+            ["ssh-keygen", "-t", "ed25519", "-f", ssh_key_path,
             "-N", "", "-C", "server-manager"],
            check=True, capture_output=True, timeout=15
        )
-        log.info(f"SSH key generated: {SSH_KEY_PATH}")
-        return f"Key generated: {SSH_KEY_PATH}"
+        log.info(f"SSH key generated: {ssh_key_path}")
+        return f"Key generated: {ssh_key_path}"
    except FileNotFoundError:
        hint = "enable OpenSSH optional feature" if sys.platform == "win32" else "install openssh-client"
        msg = f"ERROR: ssh-keygen not found — {hint}"
@@ -134,16 +441,13 @@ def generate_ssh_key() -> str:


def install_global_claude_md() -> str:
-    """Add/update server manager section in global ~/.claude/CLAUDE.md.
-
-    Uses start/end markers to safely replace existing block without duplication.
-    """
-    import re
-    os.makedirs(os.path.dirname(GLOBAL_CLAUDE_MD), exist_ok=True)
+    """Add/update server manager section in global ~/.claude/CLAUDE.md."""
+    global_claude_md = _global_claude_md()
+    os.makedirs(os.path.dirname(global_claude_md), exist_ok=True)

    existing = ""
-    if os.path.exists(GLOBAL_CLAUDE_MD):
-        with open(GLOBAL_CLAUDE_MD, encoding="utf-8") as f:
+    if os.path.exists(global_claude_md):
+        with open(global_claude_md, encoding="utf-8") as f:
            existing = f.read()

    pattern = re.compile(
@@ -152,43 +456,97 @@ def install_global_claude_md() -> str:
    )

    if pattern.search(existing):
-        # Block already present — replace it with the current version
        updated = pattern.sub(GLOBAL_CLAUDE_MD_BLOCK.strip(), existing)
-        with open(GLOBAL_CLAUDE_MD, "w", encoding="utf-8") as f:
+        with open(global_claude_md, "w", encoding="utf-8") as f:
            f.write(updated)
-        log.info(f"Global CLAUDE.md block updated: {GLOBAL_CLAUDE_MD}")
-        return f"Global CLAUDE.md block updated: {GLOBAL_CLAUDE_MD}"
-    else:
-        # Block missing — append it to the end
-        with open(GLOBAL_CLAUDE_MD, "a", encoding="utf-8") as f:
-            if existing and not existing.endswith("\n"):
-                f.write("\n")
-            f.write("\n" + GLOBAL_CLAUDE_MD_BLOCK)
-        log.info(f"Global CLAUDE.md block added: {GLOBAL_CLAUDE_MD}")
-        return f"Global CLAUDE.md block added: {GLOBAL_CLAUDE_MD}"
+        log.info(f"Global CLAUDE.md block updated: {global_claude_md}")
+        return f"Global CLAUDE.md block updated: {global_claude_md}"
+
+    with open(global_claude_md, "a", encoding="utf-8") as f:
+        if existing and not existing.endswith("\n"):
+            f.write("\n")
+        f.write("\n" + GLOBAL_CLAUDE_MD_BLOCK)
+    log.info(f"Global CLAUDE.md block added: {global_claude_md}")
+    return f"Global CLAUDE.md block added: {global_claude_md}"
+
+
+def install_global_gemini_md() -> str:
+    """Add/update server manager section in global ~/.gemini/GEMINI.md."""
+    global_gemini_md = _global_gemini_md()
+    os.makedirs(os.path.dirname(global_gemini_md), exist_ok=True)
+
+    existing = ""
+    if os.path.exists(global_gemini_md):
+        with open(global_gemini_md, encoding="utf-8") as f:
+            existing = f.read()
+
+    pattern = re.compile(
+        re.escape(_GEMINI_BLOCK_START) + r".*?" + re.escape(_GEMINI_BLOCK_END),
+        re.DOTALL
+    )
+
+    if pattern.search(existing):
+        updated = pattern.sub(GLOBAL_GEMINI_MD_BLOCK.strip(), existing)
+        with open(global_gemini_md, "w", encoding="utf-8") as f:
+            f.write(updated)
+        log.info(f"Global GEMINI.md block updated: {global_gemini_md}")
+        return f"Global GEMINI.md block updated: {global_gemini_md}"
+
+    with open(global_gemini_md, "a", encoding="utf-8") as f:
+        if existing and not existing.endswith("\n"):
+            f.write("\n")
+        f.write("\n" + GLOBAL_GEMINI_MD_BLOCK)
+    log.info(f"Global GEMINI.md block added: {global_gemini_md}")
+    return f"Global GEMINI.md block added: {global_gemini_md}"

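The marker-delimited upsert used by both installers above can be exercised on its own. This is a minimal sketch of the same pattern; the marker strings and helper name are illustrative, not the project's actual `_GEMINI_BLOCK_START`/`_GEMINI_BLOCK_END` values:

```python
import re

START = "<!-- server-manager:start -->"
END = "<!-- server-manager:end -->"


def upsert_block(text: str, block: str) -> str:
    """Replace the marker-delimited block if present, else append it once."""
    pattern = re.compile(re.escape(START) + r".*?" + re.escape(END), re.DOTALL)
    if pattern.search(text):
        return pattern.sub(block.strip(), text)
    sep = "" if (not text or text.endswith("\n")) else "\n"
    return text + sep + "\n" + block


block = f"{START}\nuse codex-ssh --list\n{END}\n"
doc = upsert_block("# notes\n", block)                      # appended
doc2 = upsert_block(doc, f"{START}\nupdated\n{END}\n")      # replaced in place
```

Re-running the installer therefore never duplicates the block, which is exactly why the functions above search for the markers before appending.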
def install_all() -> list[str]:
-    """Full setup — install everything."""
-    results = []
-
-    steps = [
+    """Full setup — install everything for Claude Code, Codex, and Gemini."""
+    all_users = os.environ.get("SERVER_MANAGER_INSTALL_ALL_USERS", "").strip() == "1"
+    base_steps = [
        ("ssh_script", install_ssh_script),
-        ("skill", install_skill),
-        ("ssh_key", generate_ssh_key),
+        ("claude_skill", install_claude_skill),
+        ("codex_skill", install_codex_skill),
+        ("gemini_skill", install_gemini_skill),
        ("global_claude_md", install_global_claude_md),
+        ("global_gemini_md", install_global_gemini_md),
    ]

-    for name, func in steps:
-        try:
-            log.info(f"install_all: running {name}")
-            result = func()
-            results.append(result)
-        except Exception as e:
-            msg = f"ERROR ({name}): {e}"
-            log.error(msg)
-            results.append(msg)
+    if not all_users:
+        steps = base_steps[:3] + [("ssh_key", generate_ssh_key)] + base_steps[3:]
+        results = []
+        for name, func in steps:
+            try:
+                log.info(f"install_all: running {name}")
+                result = func()
+                results.append(result)
+            except Exception as e:
+                msg = f"ERROR ({name}): {e}"
+                log.error(msg)
+                results.append(msg)
+        return results
+
+    results = []
+    original_target = os.environ.get("SERVER_MANAGER_TARGET_HOME")
+    for home in _iter_all_user_homes():
+        os.environ["SERVER_MANAGER_TARGET_HOME"] = home
+        results.append(f"[target_home] {home}")
+        for name, func in base_steps:
+            try:
+                log.info(f"install_all(all_users): running {name} for {home}")
+                result = func()
+                results.append(result)
+            except Exception as e:
+                msg = f"ERROR ({name}, {home}): {e}"
+                log.error(msg)
+                results.append(msg)
+
+    if original_target is None:
+        os.environ.pop("SERVER_MANAGER_TARGET_HOME", None)
+    else:
+        os.environ["SERVER_MANAGER_TARGET_HOME"] = original_target
+
+    results.append("INFO: SSH key generation skipped in SERVER_MANAGER_INSTALL_ALL_USERS=1 mode")
    return results

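The per-step error isolation in `install_all` can be sketched independently: each step runs under its own try/except, so one failing installer is recorded as an error string instead of aborting the remaining steps. The step functions below are stand-ins for the real installers:

```python
def run_steps(steps):
    """Run (name, func) pairs; a failing step is recorded, not fatal."""
    results = []
    for name, func in steps:
        try:
            results.append(func())
        except Exception as e:
            results.append(f"ERROR ({name}): {e}")
    return results


def ok():
    return "ok"


def boom():
    raise RuntimeError("disk full")


out = run_steps([("good", ok), ("bad", boom), ("good2", ok)])
# → ["ok", "ERROR (bad): disk full", "ok"]
```

This shape is why a broken Gemini install still leaves the Claude and Codex skills in place.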
@@ -20,19 +20,25 @@ class GrafanaClient:
        Initialize the Grafana client.

        Args:
-            server: dict with keys: ip, port, api_token, use_ssl
+            server: dict with keys: ip, port, api_token (or user+password), use_ssl
        """
        self.ip: str = server["ip"]
        self.port: int = int(server["port"])
-        self.api_token: str = server["api_token"]
+        self.api_token: str = server.get("api_token", "")
+        self.user: str = server.get("user", "")
+        self.password: str = server.get("password", "")
        self.use_ssl: bool = bool(server.get("use_ssl", False))

        scheme = "https" if self.use_ssl else "http"
        self.base_url: str = f"{scheme}://{self.ip}:{self.port}"
-        self.headers: dict[str, str] = {
-            "Authorization": f"Bearer {self.api_token}",
-            "Content-Type": "application/json",
-        }
+        self.headers: dict[str, str] = {"Content-Type": "application/json"}
+        self.auth: tuple[str, str] | None = None
+
+        if self.api_token:
+            self.headers["Authorization"] = f"Bearer {self.api_token}"
+        elif self.user and self.password:
+            self.auth = (self.user, self.password)

        self.timeout: int = 10

    def _get(self, path: str, params: dict | None = None) -> Any:
@@ -42,7 +48,7 @@ class GrafanaClient:
        url = f"{self.base_url}{path}"
        log.debug("Grafana GET %s", url)
        resp = requests.get(
-            url, headers=self.headers, params=params, timeout=self.timeout
+            url, headers=self.headers, params=params, auth=self.auth, timeout=self.timeout
        )
        resp.raise_for_status()
        return resp.json()
@@ -54,7 +60,7 @@ class GrafanaClient:
        url = f"{self.base_url}{path}"
        log.debug("Grafana POST %s", url)
        resp = requests.post(
-            url, headers=self.headers, json=json_data, timeout=self.timeout
+            url, headers=self.headers, json=json_data, auth=self.auth, timeout=self.timeout
        )
        resp.raise_for_status()
        return resp.json()
@@ -132,6 +138,16 @@ class GrafanaClient:
        log.error("Grafana list_alerts failed: %s", exc)
        return []

+    def get_active_alerts(self) -> list[dict]:
+        """List active (firing) alerts via AlertManager endpoint."""
+        try:
+            results = self._get("/api/alertmanager/grafana/api/v2/alerts")
+            log.info("Grafana: %d active alerts", len(results))
+            return results
+        except Exception as exc:
+            log.error("Grafana get_active_alerts failed: %s", exc)
+            return []

    def list_datasources(self) -> list[dict]:
        """
        List all datasources via GET /api/datasources.

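The token-or-basic-auth selection added to `GrafanaClient.__init__` can be checked without any network calls. This sketch mirrors only that branch of the constructor (field names follow the diff above; the helper itself is illustrative):

```python
def build_auth(server: dict):
    """Return (headers, auth) the way the patched __init__ does:
    a bearer token wins; otherwise fall back to HTTP basic auth."""
    headers = {"Content-Type": "application/json"}
    auth = None
    token = server.get("api_token", "")
    user, password = server.get("user", ""), server.get("password", "")
    if token:
        headers["Authorization"] = f"Bearer {token}"
    elif user and password:
        auth = (user, password)
    return headers, auth


h1, a1 = build_auth({"api_token": "abc"})                      # token path
h2, a2 = build_auth({"user": "admin", "password": "secret"})   # basic-auth path
```

Passing `auth=None` to `requests.get`/`requests.post` is a no-op, so the same call sites serve both modes.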
186 core/i18n.py
@@ -46,7 +46,7 @@ _EN = {
    "about_desc": (
        "Desktop application for managing remote servers.\n"
        "SSH terminal, SFTP file transfer, key management,\n"
-        "encrypted credentials, and Claude Code integration."
+        "encrypted credentials, and Claude Code / Codex integration."
    ),
    "about_features_title": "⚡ Features",
    "about_features": (
@@ -56,13 +56,13 @@ _EN = {
        "• TOTP / 2FA (Google Authenticator)\n"
        "• Encrypted credentials (Fernet)\n"
        "• Automatic backups\n"
-        "• Claude Code integration"
+        "• Claude Code and Codex integration"
    ),
    "about_howto_title": "🚀 Quick Start",
    "about_howto": (
        "1. Click \"+ Add\" to add a server\n"
        "2. Select server → Terminal / Files\n"
-        "3. Setup tab → Claude Code integration"
+        "3. Setup tab → Claude Code / Codex integration"
    ),
    "version": "Version",
    "author": "Author",
@@ -111,6 +111,10 @@ _EN = {
    "term_connecting": "Connecting to {alias}...",
    "term_connected": "Connected to {alias}",
    "term_disconnected": "Disconnected",
+    "term_off": "OFFLINE",
+    "ctx_disconnect": "Disconnect",
+    "term_click_to_connect": "Double-click to connect to {alias}",
+    "sftp_click_to_connect": "Double-click server to browse files",
    "term_reconnecting": "Reconnecting ({n}/{max})...",
    "term_connect_failed": "Connection failed: {error}",
    "term_reconnect_fail": "Disconnected (reconnect failed)",
@@ -153,6 +157,12 @@ _EN = {
    "no_public_key": "[!] No public key to copy",

    # Setup
+    "agent_integration": "AI Agent Integration",
+    "agent_desc": (
+        "Setup everything so Claude Code, Codex, and Gemini can manage your servers via shared local skills.\n"
+        "ServerManager, Claude Code, Codex, and Gemini share the same servers.json — add a server here,\n"
+        "all agents see it immediately."
+    ),
    "claude_integration": "Claude Code Integration",
    "claude_desc": (
        "Setup everything so Claude Code can manage your servers via /ssh skill.\n"
@@ -165,11 +175,19 @@ _EN = {
    "status_ssh_script": "ssh.py (CLI tool)",
    "status_encryption": "Encryption module",
-    "status_skill": "/ssh skill for Claude Code",
+    "status_claude_skill": "/ssh skill for Claude Code",
+    "status_codex_skill": "ServerManager skill for Codex",
+    "status_codex_wrapper": "Codex wrapper (codex-ssh)",
+    "status_gemini_skill": "ServerManager skill for Gemini",
+    "status_gemini_wrapper": "Gemini wrapper (gemini-ssh)",
    "status_ssh_key": "SSH key (ed25519)",
    "install_everything": "Install Everything",
    "installing_all": "Installing...",
    "install_ssh_py": "ssh.py",
-    "install_skill": "/ssh skill",
+    "install_claude_skill": "Claude skill",
+    "install_codex_skill": "Codex skill",
+    "install_gemini_skill": "Gemini skill",
    "install_ssh_key": "SSH key",
    "refresh": "Refresh",
    "configuration": "Configuration",
@@ -179,7 +197,7 @@ _EN = {
    "select_backup": "Select backup...",
    "no_backups": "No backups",
    "restore": "Restore",
-    "install_done": "Done! Claude Code can now use /ssh to manage your servers.",
+    "install_done": "Done! Claude Code, Codex, and Gemini can now use ServerManager to manage your servers.",
    "config_changed": "Config path changed: {path}",
    "backup_created": "Backup created: {name}",
    "backup_failed": "Backup failed: {e}",
@@ -390,6 +408,12 @@ _EN = {
    "s3_uploading_n": "Uploading {count} files...",
    "s3_uploaded_n": "Uploaded {count} files",
    "s3_upload_partial": "Uploaded {ok}/{total} files",
+    "s3_create_bucket": "Create Bucket",
+    "s3_bucket_name_prompt": "Bucket name:",
+    "s3_delete_bucket": "Delete Bucket",
+    "s3_delete_bucket_confirm": "Delete bucket \"{name}\"? It must be empty.",
+    "s3_bucket_created": "Bucket \"{name}\" created",
+    "s3_bucket_deleted": "Bucket \"{name}\" deleted",
    "s3_new_folder": "New Folder",
    "s3_folder_name_prompt": "Folder name:",
    "s3_creating_folder": "Creating folder...",
@@ -419,6 +443,19 @@ _EN = {
    "grafana_connected": "Connected to {alias}",
    "grafana_no_dashboards": "No dashboards found",
    "grafana_no_alerts": "No alerts",
+    "grafana_loading": "Loading...",
+    "grafana_loaded": "{dashboards} dashboards, {alerts} alerts, {datasources} datasources",
+    "grafana_no_server": "No server selected",
+    "grafana_open_browser": "Open Grafana",
+    "grafana_datasources": "Datasources",
+    "grafana_ds_name": "Name",
+    "grafana_ds_type": "Type",
+    "grafana_ds_default": "Default",
+    "grafana_dash_title": "Title",
+    "grafana_dash_folder": "Folder",
+    "grafana_alert_state": "State",
+    "grafana_alert_name": "Name",
+    "grafana_alert_severity": "Severity",

    # Prometheus tab
    "prom_refresh": "Refresh",
@@ -434,6 +471,23 @@ _EN = {
    "prom_no_targets": "No targets",
    "prom_no_alerts": "No alerts",
    "prom_placeholder": "up",
+    "prom_loading": "Loading...",
+    "prom_loaded": "{targets} targets, {alerts} alerts, {rules} rules",
+    "prom_no_server": "No server selected",
+    "prom_executing": "Executing...",
+    "prom_results": "Results",
+    "prom_query_placeholder": "e.g. up, node_cpu_seconds_total",
+    "prom_metrics_browser": "Metrics",
+    "prom_filter_metrics": "Filter metrics...",
+    "prom_rules": "Rules",
+    "prom_rule_type": "Type",
+    "prom_rule_name": "Name",
+    "prom_rule_group": "Group",
+    "prom_rule_health": "Health",
+    "prom_target_job": "Job",
+    "prom_target_instance": "Instance",
+    "prom_target_health": "Health",
+    "prom_target_scrape": "Last Scrape",

    # PowerShell tab
    "ps_execute": "Execute",

@@ -563,7 +617,7 @@ _RU = {
    "about_desc": (
        "Настольное приложение для управления удалёнными серверами.\n"
        "SSH-терминал, SFTP-передача файлов, управление ключами,\n"
-        "шифрование паролей и интеграция с Claude Code."
+        "шифрование паролей и интеграция с Claude Code / Codex."
    ),
    "about_features_title": "⚡ Возможности",
    "about_features": (
@@ -573,13 +627,13 @@ _RU = {
        "• TOTP / 2FA (Google Authenticator)\n"
        "• Шифрование паролей (Fernet)\n"
        "• Автоматические бэкапы\n"
-        "• Интеграция с Claude Code"
+        "• Интеграция с Claude Code и Codex"
    ),
    "about_howto_title": "🚀 Быстрый старт",
    "about_howto": (
        "1. Нажмите \"+ Добавить\" для добавления сервера\n"
        "2. Выберите сервер → Терминал / Файлы\n"
-        "3. Вкладка Настройка → интеграция Claude Code"
+        "3. Вкладка Настройка → интеграция Claude Code / Codex"
    ),
    "version": "Версия",
    "author": "Автор",
@@ -628,6 +682,10 @@ _RU = {
    "term_connecting": "Подключение к {alias}...",
    "term_connected": "Подключено к {alias}",
    "term_disconnected": "Отключено",
+    "term_off": "ОТКЛЮЧЕНО",
+    "ctx_disconnect": "Отключиться",
+    "term_click_to_connect": "Двойной клик для подключения к {alias}",
+    "sftp_click_to_connect": "Двойной клик для просмотра файлов",
    "term_reconnecting": "Переподключение ({n}/{max})...",
    "term_connect_failed": "Ошибка подключения: {error}",
    "term_reconnect_fail": "Отключено (не удалось переподключиться)",
@@ -670,23 +728,37 @@ _RU = {
    "no_public_key": "[!] Нет публичного ключа",

    # Setup
+    "agent_integration": "Интеграция AI-агентов",
    "claude_integration": "Интеграция с Claude Code",
    "claude_desc": (
        "Настройте всё, чтобы Claude Code мог управлять серверами через скилл /ssh.\n"
        "GUI и Claude Code используют один servers.json — добавьте сервер здесь,\n"
        "Claude увидит его сразу."
    ),
+    "agent_desc": (
+        "Настройте всё, чтобы Claude Code, Codex и Gemini могли управлять серверами через локальные skills.\n"
+        "ServerManager, Claude Code, Codex и Gemini используют один и тот же servers.json — добавьте сервер здесь,\n"
+        "все агенты увидят его сразу."
+    ),
    "status": "Статус",
    "status_shared_dir": "Общий каталог (~/.server-connections)",
    "status_servers_json": "servers.json",
    "status_ssh_script": "ssh.py (CLI-утилита)",
    "status_encryption": "Модуль шифрования",
-    "status_skill": "Скилл /ssh для Claude Code",
+    "status_claude_skill": "Скилл /ssh для Claude Code",
+    "status_codex_skill": "Скилл ServerManager для Codex",
+    "status_codex_wrapper": "Обёртка Codex (codex-ssh)",
+    "status_gemini_skill": "Скилл ServerManager для Gemini",
+    "status_gemini_wrapper": "Обёртка Gemini (gemini-ssh)",
    "status_ssh_key": "SSH-ключ (ed25519)",
    "install_everything": "Установить всё",
    "installing_all": "Установка...",
    "install_ssh_py": "ssh.py",
-    "install_skill": "Скилл /ssh",
+    "install_claude_skill": "Скилл Claude",
+    "install_codex_skill": "Скилл Codex",
+    "install_gemini_skill": "Скилл Gemini",
    "install_ssh_key": "SSH-ключ",
    "refresh": "Обновить",
    "configuration": "Конфигурация",
@@ -696,7 +768,7 @@ _RU = {
    "select_backup": "Выберите бэкап...",
    "no_backups": "Нет бэкапов",
    "restore": "Восстановить",
-    "install_done": "Готово! Claude Code теперь может использовать /ssh для управления серверами.",
+    "install_done": "Готово! Claude Code, Codex и Gemini теперь могут использовать ServerManager для управления серверами.",
    "config_changed": "Путь конфига изменён: {path}",
    "backup_created": "Бэкап создан: {name}",
    "backup_failed": "Ошибка бэкапа: {e}",
@@ -907,6 +979,12 @@ _RU = {
    "s3_uploading_n": "Загрузка {count} файлов...",
    "s3_uploaded_n": "Загружено {count} файлов",
    "s3_upload_partial": "Загружено {ok}/{total} файлов",
+    "s3_create_bucket": "Создать бакет",
+    "s3_bucket_name_prompt": "Имя бакета:",
+    "s3_delete_bucket": "Удалить бакет",
+    "s3_delete_bucket_confirm": "Удалить бакет \"{name}\"? Он должен быть пустым.",
+    "s3_bucket_created": "Бакет \"{name}\" создан",
+    "s3_bucket_deleted": "Бакет \"{name}\" удалён",
    "s3_new_folder": "Новая папка",
    "s3_folder_name_prompt": "Имя папки:",
    "s3_creating_folder": "Создание папки...",
@@ -936,6 +1014,19 @@ _RU = {
    "grafana_connected": "Подключено к {alias}",
    "grafana_no_dashboards": "Дашборды не найдены",
    "grafana_no_alerts": "Нет оповещений",
+    "grafana_loading": "Загрузка...",
+    "grafana_loaded": "{dashboards} дашб., {alerts} оповещ., {datasources} источн.",
+    "grafana_no_server": "Сервер не выбран",
+    "grafana_open_browser": "Открыть Grafana",
+    "grafana_datasources": "Источники данных",
+    "grafana_ds_name": "Имя",
+    "grafana_ds_type": "Тип",
+    "grafana_ds_default": "По умолч.",
+    "grafana_dash_title": "Название",
+    "grafana_dash_folder": "Папка",
+    "grafana_alert_state": "Состояние",
+    "grafana_alert_name": "Имя",
+    "grafana_alert_severity": "Серьёзность",

    # Prometheus tab
    "prom_refresh": "Обновить",
@@ -951,6 +1042,23 @@ _RU = {
    "prom_no_targets": "Нет целей",
    "prom_no_alerts": "Нет оповещений",
    "prom_placeholder": "up",
+    "prom_loading": "Загрузка...",
+    "prom_loaded": "{targets} целей, {alerts} оповещ., {rules} правил",
+    "prom_no_server": "Сервер не выбран",
+    "prom_executing": "Выполнение...",
+    "prom_results": "Результаты",
+    "prom_query_placeholder": "напр. up, node_cpu_seconds_total",
+    "prom_metrics_browser": "Метрики",
+    "prom_filter_metrics": "Фильтр метрик...",
+    "prom_rules": "Правила",
+    "prom_rule_type": "Тип",
+    "prom_rule_name": "Имя",
+    "prom_rule_group": "Группа",
+    "prom_rule_health": "Здоровье",
+    "prom_target_job": "Job",
+    "prom_target_instance": "Инстанс",
+    "prom_target_health": "Здоровье",
+    "prom_target_scrape": "Последний опрос",

    # PowerShell tab
    "ps_execute": "Выполнить",

@@ -1080,7 +1188,7 @@ _ZH = {
    "about_desc": (
        "用于管理远程服务器的桌面应用程序。\n"
        "SSH终端、SFTP文件传输、密钥管理、\n"
-        "凭据加密以及Claude Code集成。"
+        "凭据加密以及Claude Code / Codex集成。"
    ),
    "about_features_title": "⚡ 功能特点",
    "about_features": (
@@ -1090,13 +1198,13 @@ _ZH = {
        "• TOTP / 2FA(Google Authenticator)\n"
        "• 凭据加密(Fernet)\n"
        "• 自动备份\n"
-        "• Claude Code集成"
+        "• Claude Code 和 Codex 集成"
    ),
    "about_howto_title": "🚀 快速开始",
    "about_howto": (
        "1. 点击\"+ 添加\"来添加服务器\n"
        "2. 选择服务器 → 终端 / 文件\n"
-        "3. 设置标签 → Claude Code集成"
+        "3. 设置标签 → Claude Code / Codex 集成"
    ),
    "version": "版本",
    "author": "作者",
@@ -1145,6 +1253,10 @@ _ZH = {
    "term_connecting": "正在连接 {alias}...",
    "term_connected": "已连接到 {alias}",
    "term_disconnected": "已断开",
+    "term_off": "未连接",
+    "ctx_disconnect": "断开连接",
+    "term_click_to_connect": "双击连接 {alias}",
+    "sftp_click_to_connect": "双击服务器浏览文件",
    "term_reconnecting": "重新连接中 ({n}/{max})...",
    "term_connect_failed": "连接失败:{error}",
    "term_reconnect_fail": "已断开(重连失败)",
@@ -1187,6 +1299,7 @@ _ZH = {
    "no_public_key": "[!] 没有公钥可复制",

    # Setup
+    "agent_integration": "AI代理集成",
    "claude_integration": "Claude Code集成",
    "claude_desc": (
        "设置一切以便Claude Code通过/ssh技能管理您的服务器。\n"
@@ -1198,12 +1311,25 @@ _ZH = {
    "status_servers_json": "servers.json",
    "status_ssh_script": "ssh.py(CLI工具)",
    "status_encryption": "加密模块",
+    "agent_desc": (
+        "完成设置后,Claude Code、Codex 和 Gemini 都可以通过共享的本地 skills 管理您的服务器。\n"
+        "ServerManager、Claude Code、Codex 和 Gemini 共用同一个 servers.json — 在此添加服务器后,\n"
+        "所有代理都会立即看到。"
+    ),
-    "status_skill": "Claude Code的/ssh技能",
+    "status_claude_skill": "Claude Code 的 /ssh 技能",
+    "status_codex_skill": "Codex 的 ServerManager 技能",
+    "status_codex_wrapper": "Codex 包装器(codex-ssh)",
+    "status_gemini_skill": "Gemini 的 ServerManager 技能",
+    "status_gemini_wrapper": "Gemini 包装器(gemini-ssh)",
    "status_ssh_key": "SSH密钥(ed25519)",
    "install_everything": "全部安装",
    "installing_all": "安装中...",
    "install_ssh_py": "ssh.py",
-    "install_skill": "/ssh技能",
+    "install_claude_skill": "Claude 技能",
+    "install_codex_skill": "Codex 技能",
+    "install_gemini_skill": "Gemini 技能",
    "install_ssh_key": "SSH密钥",
    "refresh": "刷新",
    "configuration": "配置",
@@ -1213,7 +1339,7 @@ _ZH = {
    "select_backup": "选择备份...",
    "no_backups": "无备份",
    "restore": "恢复",
-    "install_done": "完成!Claude Code现在可以使用/ssh来管理您的服务器。",
+    "install_done": "完成!Claude Code、Codex 和 Gemini 现在都可以使用 ServerManager 来管理您的服务器。",
    "config_changed": "配置路径已更改:{path}",
    "backup_created": "备份已创建:{name}",
    "backup_failed": "备份失败:{e}",
@@ -1424,6 +1550,12 @@ _ZH = {
    "s3_uploading_n": "正在上传 {count} 个文件...",
    "s3_uploaded_n": "已上传 {count} 个文件",
    "s3_upload_partial": "已上传 {ok}/{total} 个文件",
+    "s3_create_bucket": "创建存储桶",
+    "s3_bucket_name_prompt": "存储桶名称:",
+    "s3_delete_bucket": "删除存储桶",
+    "s3_delete_bucket_confirm": "删除存储桶 \"{name}\"?必须为空。",
+    "s3_bucket_created": "存储桶 \"{name}\" 已创建",
+    "s3_bucket_deleted": "存储桶 \"{name}\" 已删除",
    "s3_new_folder": "新建文件夹",
    "s3_folder_name_prompt": "文件夹名称:",
    "s3_creating_folder": "创建文件夹中...",
@@ -1453,6 +1585,19 @@ _ZH = {
    "grafana_connected": "已连接到 {alias}",
    "grafana_no_dashboards": "未找到仪表盘",
    "grafana_no_alerts": "无告警",
+    "grafana_loading": "加载中...",
+    "grafana_loaded": "{dashboards}仪表盘, {alerts}告警, {datasources}数据源",
+    "grafana_no_server": "未选择服务器",
+    "grafana_open_browser": "打开Grafana",
+    "grafana_datasources": "数据源",
+    "grafana_ds_name": "名称",
+    "grafana_ds_type": "类型",
+    "grafana_ds_default": "默认",
+    "grafana_dash_title": "标题",
+    "grafana_dash_folder": "文件夹",
+    "grafana_alert_state": "状态",
+    "grafana_alert_name": "名称",
+    "grafana_alert_severity": "严重程度",

    # Prometheus tab
    "prom_refresh": "刷新",
@@ -1468,6 +1613,23 @@ _ZH = {
    "prom_no_targets": "无目标",
    "prom_no_alerts": "无告警",
    "prom_placeholder": "up",
+    "prom_loading": "加载中...",
+    "prom_loaded": "{targets}目标, {alerts}告警, {rules}规则",
+    "prom_no_server": "未选择服务器",
+    "prom_executing": "执行中...",
+    "prom_results": "结果",
+    "prom_query_placeholder": "例如 up, node_cpu_seconds_total",
+    "prom_metrics_browser": "指标",
+    "prom_filter_metrics": "过滤指标...",
+    "prom_rules": "规则",
+    "prom_rule_type": "类型",
+    "prom_rule_name": "名称",
+    "prom_rule_group": "组",
+    "prom_rule_health": "健康",
+    "prom_target_job": "任务",
+    "prom_target_instance": "实例",
+    "prom_target_health": "健康",
+    "prom_target_scrape": "最后抓取",

    # PowerShell tab
    "ps_execute": "执行",

@@ -210,6 +210,7 @@ CTX_ICONS = {
    "ctx_open_browser": "browser",
    "ctx_check_status": "status_check",
    "ctx_copy_alias": "copy",
+    "ctx_disconnect": "close",
    "edit": "edit",
    "delete": "delete",
}

@@ -518,3 +518,29 @@ class S3Client:
            return resp.get("ContentLength", 0)
        except Exception:
            return 0

+    def create_bucket(self, bucket_name: str) -> bool:
+        """Create a new S3 bucket."""
+        if not self._ensure_connected():
+            return False
+        try:
+            self._client.create_bucket(Bucket=bucket_name)
+            self._last_ok = time.time()
+            log.info("S3 bucket created: %s", bucket_name)
+            return True
+        except Exception as exc:
+            log.error("S3 create_bucket failed: %s", exc)
+            return False
+
+    def delete_bucket(self, bucket_name: str) -> bool:
+        """Delete an empty S3 bucket."""
+        if not self._ensure_connected():
+            return False
+        try:
+            self._client.delete_bucket(Bucket=bucket_name)
+            self._last_ok = time.time()
+            log.info("S3 bucket deleted: %s", bucket_name)
+            return True
+        except Exception as exc:
+            log.error("S3 delete_bucket failed: %s", exc)
+            return False

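Both new methods follow the same guard/try/log/return-bool shape, so callers never have to catch exceptions. A sketch with a stub client pins down that contract; `StubClient` is a stand-in for the real boto3 client, not part of the project:

```python
class StubClient:
    """Fake S3 client: records calls, optionally raises like boto3 would."""
    def __init__(self, fail=False):
        self.fail = fail
        self.calls = []

    def create_bucket(self, Bucket):
        self.calls.append(Bucket)
        if self.fail:
            raise RuntimeError("BucketAlreadyExists")


def create_bucket(client, name) -> bool:
    """Mirror of S3Client.create_bucket: never raises, returns a success flag."""
    try:
        client.create_bucket(Bucket=name)
        return True
    except Exception:
        return False


ok = create_bucket(StubClient(), "logs")
bad = create_bucket(StubClient(fail=True), "logs")
```

The GUI layer can then branch on the boolean (`s3_bucket_created` vs. an error toast) without any S3-specific error handling of its own.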
@@ -256,3 +256,13 @@ class SessionPool:
        if has_active:
            active.append(alias)
        return active

+    def has_active_session(self, alias: str) -> bool:
+        with self._lock:
+            sd = self._sessions.get(alias)
+            if not sd:
+                return False
+            return bool(
+                (sd.shell_session and sd.shell_session.connected) or
+                (sd.sftp_session and sd.sftp_session.connected)
+            )

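`has_active_session` treats a session as live if either the shell or the SFTP channel is still connected. The truth table can be pinned down with simple stand-ins; the classes below are illustrative substitutes for the real session objects:

```python
from dataclasses import dataclass


@dataclass
class Chan:
    connected: bool


def is_active(shell, sftp) -> bool:
    """Same predicate as SessionPool.has_active_session, minus the lock."""
    return bool((shell and shell.connected) or (sftp and sftp.connected))


r1 = is_active(Chan(True), None)         # live shell, no SFTP → active
r2 = is_active(None, Chan(False))        # only a dead SFTP channel → inactive
r3 = is_active(None, None)               # no channels at all → inactive
```

The `bool(...)` wrapper matters: without it the `and`/`or` chain could return `None` or a channel object instead of a plain boolean.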
113 gui/app.py
@@ -2,6 +2,7 @@
|
||||
Main application window — sidebar + tabview layout.
|
||||
"""
|
||||
|
||||
import sys
|
||||
import tkinter
|
||||
import customtkinter as ctk
|
||||
from tkinter import messagebox
|
||||
@@ -91,7 +92,7 @@ class App(ctk.CTk):
|
||||
|
||||
# Restore saved window geometry or use default
|
||||
saved_geo = self.store._window_geometry
|
||||
if saved_geo:
|
||||
if saved_geo and self._is_valid_geometry(saved_geo):
|
||||
self.geometry(saved_geo)
|
||||
else:
|
||||
self.geometry("1100x700")
|
||||
@@ -118,6 +119,32 @@ class App(ctk.CTk):
|
||||
# Cleanup on close
|
||||
self.protocol("WM_DELETE_WINDOW", self._on_close)
|
||||
|
||||
# Win32: restore window when stuck minimized after Win+D
|
||||
self._restore_check_id = None
|
||||
if sys.platform == "win32":
|
||||
self.after(3000, self._start_restore_watchdog)
|
||||
|
||||
def _start_restore_watchdog(self):
|
||||
"""Start periodic check for stuck minimized state (Windows only)."""
|
||||
try:
|
||||
import ctypes
|
||||
self._user32 = ctypes.windll.user32
|
||||
self._hwnd = int(self.wm_frame(), 16)
|
||||
self._check_restore()
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
def _check_restore(self):
|
||||
"""If window is iconic but user clicked taskbar, force restore."""
|
||||
try:
|
||||
if self._user32.IsIconic(self._hwnd):
|
||||
fg = self._user32.GetForegroundWindow()
|
||||
if fg == self._hwnd:
|
||||
self._user32.ShowWindow(self._hwnd, 9) # SW_RESTORE
|
||||
except Exception:
|
||||
pass
|
||||
self._restore_check_id = self.after(500, self._check_restore)
|
||||
|
||||
def _build_layout(self):
|
||||
# PanedWindow — resizable sidebar | main area
|
||||
self._paned = tkinter.PanedWindow(
|
||||
@@ -127,7 +154,7 @@ class App(ctk.CTk):
|
||||
self._paned.pack(fill="both", expand=True)
|
||||
|
||||
# Sidebar
|
||||
self.sidebar = Sidebar(self._paned, self.store, on_select=self._on_server_select, session_pool=self.session_pool)
|
||||
self.sidebar = Sidebar(self._paned, self.store, on_select=self._on_server_select, on_double_click=self._on_server_connect, session_pool=self.session_pool)
|
||||
self._paned.add(self.sidebar, minsize=180, width=self.store._sidebar_width)
|
||||
self.sidebar.add_callback = self._add_server
|
||||
self.sidebar.edit_callback = self._edit_server
|
||||
@@ -136,20 +163,19 @@ class App(ctk.CTk):
         self.sidebar.open_tab_callback = self._context_open_tab
         self.sidebar.check_status_callback = self._context_check_status
         self.sidebar.open_browser_callback = self._context_open_browser
+        self.sidebar.disconnect_callback = self._on_server_disconnect
 
         # Main area
         self._main_frame = ctk.CTkFrame(self._paned, fg_color="transparent")
         self._paned.add(self._main_frame, minsize=500)
 
-        # Header bar (language + about)
-        header_bar = ctk.CTkFrame(self._main_frame, fg_color="transparent", height=40)
-        header_bar.pack(fill="x", padx=10, pady=(8, 0))
-        header_bar.pack_propagate(False)
+        # Header controls — overlay frame placed on top of tabview's tab row
+        self._header_controls = ctk.CTkFrame(self._main_frame, fg_color="transparent", height=30)
 
         # Language selector
         _lang_img = ctk_icon("globe", 18)
         self._lang_icon = ctk.CTkLabel(
-            header_bar, text="" if _lang_img else "\U0001f310",
+            self._header_controls, text="" if _lang_img else "\U0001f310",
             image=_lang_img, font=ctk.CTkFont(size=14), width=20,
         )
         self._lang_icon.pack(side="right", padx=(5, 0))
@@ -157,17 +183,17 @@ class App(ctk.CTk):
         current_display = LANGUAGES.get(i18n.get_language(), "English")
         self._lang_var = ctk.StringVar(value=current_display)
         self.lang_menu = ctk.CTkOptionMenu(
-            header_bar, values=lang_values, variable=self._lang_var,
-            width=110, height=30, command=self._change_language
+            self._header_controls, values=lang_values, variable=self._lang_var,
+            width=110, height=26, command=self._change_language
         )
         self.lang_menu.pack(side="right", padx=(5, 0))
 
         # Check Updates button
         _sync_img = ctk_icon("refresh", 18)
         self._update_check_btn = ctk.CTkButton(
-            header_bar, text="" if _sync_img else "\u21bb",
-            image=_sync_img, width=30, height=30,
-            corner_radius=15, fg_color="#6b7280", hover_color="#4b5563",
+            self._header_controls, text="" if _sync_img else "\u21bb",
+            image=_sync_img, width=26, height=26,
+            corner_radius=13, fg_color="#6b7280", hover_color="#4b5563",
             command=self._check_updates_manual,
         )
         self._update_check_btn.pack(side="right", padx=(5, 0))
@@ -175,12 +201,12 @@ class App(ctk.CTk):
         # About button
         _info_img = ctk_icon("info", 18)
         self.about_btn = ctk.CTkButton(
-            header_bar, text="" if _info_img else "ⓘ",
-            image=_info_img, width=30, height=30,
-            corner_radius=15, fg_color="#6b7280", hover_color="#4b5563",
+            self._header_controls, text="" if _info_img else "ⓘ",
+            image=_info_img, width=26, height=26,
+            corner_radius=13, fg_color="#6b7280", hover_color="#4b5563",
             command=self._show_about
         )
-        self.about_btn.pack(side="right", padx=(5, 5))
+        self.about_btn.pack(side="right", padx=(5, 0))
 
         # Update banner (hidden by default)
         self._update_banner = None
@@ -222,7 +248,14 @@ class App(ctk.CTk):
 
         # Create new tabview
         self.tabview = ctk.CTkTabview(self._main_frame, command=self._on_tab_changed)
-        self.tabview.pack(fill="both", expand=True, padx=10, pady=10)
+        self.tabview._outer_spacing = 0
+        self.tabview._outer_button_overhang = 0
+        self.tabview._configure_grid()
+        self.tabview.pack(fill="both", expand=True, padx=10, pady=(4, 10))
+
+        # Overlay header controls on top-right of tabview (same row as tab buttons)
+        self._header_controls.lift()
+        self._header_controls.place(in_=self.tabview, relx=1.0, y=0, anchor="ne")
 
         for key in self._tab_keys:
             self.tabview.add(_tab_label(key))
@@ -237,6 +270,11 @@ class App(ctk.CTk):
             widget.pack(fill="both", expand=True)
             self._tab_instances[key] = widget
 
+        # Wire disconnect callback for terminal toolbar button
+        terminal = self._tab_instances.get("terminal")
+        if terminal and hasattr(terminal, "_on_disconnect_callback"):
+            terminal._on_disconnect_callback = self._on_server_disconnect
+
         # Restore previously active tab if still available
         if restore_tab_key and restore_tab_key in self._tab_keys:
             try:
@@ -280,6 +318,20 @@ class App(ctk.CTk):
         # Update session indicators after a short delay (connection is async)
         self.after(1500, self.sidebar.update_session_indicators)
 
+    def _on_server_connect(self, alias: str):
+        """Double-click: connect interactive tabs (terminal, files, powershell)."""
+        for key, widget in self._tab_instances.items():
+            if hasattr(widget, "connect"):
+                widget.connect()
+
+    def _on_server_disconnect(self, alias: str):
+        """Disconnect all sessions for a server."""
+        for key, widget in self._tab_instances.items():
+            if hasattr(widget, "disconnect"):
+                widget.disconnect()
+        self.session_pool.disconnect_session(alias)
+        self.after(500, self.sidebar.update_session_indicators)
+
     def _add_server(self):
         dialog = ServerDialog(self, self.store)
         self.wait_window(dialog)
@@ -329,6 +381,10 @@ class App(ctk.CTk):
             self.tabview.set(_tab_label(tab_key))
         except Exception:
             pass
+        # Connect the target tab if it supports explicit connection
+        widget = self._tab_instances.get(tab_key)
+        if widget and hasattr(widget, "connect"):
+            widget.connect()
 
     def _context_check_status(self, alias: str):
         """Context menu: check single server status in background."""
@@ -667,10 +723,31 @@ class App(ctk.CTk):
         except Exception:
             pass
 
+    @staticmethod
+    def _is_valid_geometry(geo: str) -> bool:
+        """Reject geometry with offscreen coordinates (e.g. minimized -32000)."""
+        try:
+            # format: WxH+X+Y or WxH-X-Y
+            import re
+            m = re.match(r"(\d+)x(\d+)([+-]\d+)([+-]\d+)", geo)
+            if not m:
+                return False
+            x, y = int(m.group(3)), int(m.group(4))
+            return -100 < x < 10000 and -100 < y < 10000
+        except Exception:
+            return False
+
     def _on_close(self):
+        # Cancel restore watchdog
+        try:
+            if self._restore_check_id:
+                self.after_cancel(self._restore_check_id)
+        except Exception:
+            pass
         # Save window geometry (size + position) and sidebar width
         try:
-            self.store._window_geometry = self.geometry()
+            geo = self.geometry()
+            self.store._window_geometry = geo if self._is_valid_geometry(geo) else None
             # Save sidebar width from PanedWindow sash position
             try:
                 sash_pos = self._paned.sash_coord(0)
@@ -20,7 +20,7 @@ FIELD_MAP = {
     "mssql": ["user", "password", "database"],
     "postgresql": ["user", "password", "database"],
     "redis": ["password", "db_index", "use_ssl"],
-    "grafana": ["api_token", "use_ssl"],
+    "grafana": ["user", "password", "api_token", "use_ssl"],
     "prometheus": ["use_ssl"],
     "rdp": ["user", "password", "rdp_resolution", "rdp_quality", "rdp_clipboard", "rdp_drives", "rdp_printers"],
     "vnc": ["password"],
@@ -34,10 +34,11 @@ _CONTEXT_ACTIONS = {
 
 
 class Sidebar(ctk.CTkFrame):
-    def __init__(self, master, store, on_select=None, session_pool=None):
+    def __init__(self, master, store, on_select=None, on_double_click=None, session_pool=None):
         super().__init__(master, width=250, corner_radius=0)
         self.store = store
         self.on_select = on_select
+        self.on_double_click = on_double_click
         self.session_pool = session_pool
         self._selected_alias: str | None = None
         self._server_frames: dict[str, ctk.CTkFrame] = {}
@@ -96,6 +97,7 @@ class Sidebar(ctk.CTkFrame):
         self.open_tab_callback = None       # (alias, tab_key) → select server + switch tab
         self.check_status_callback = None   # (alias) → check single server
         self.open_browser_callback = None   # (alias) → open server URL in browser
+        self.disconnect_callback = None     # (alias) → disconnect all sessions
 
         # Subscribe to store changes
         self.store.subscribe(self._refresh_list)
@@ -272,6 +274,7 @@ class Sidebar(ctk.CTkFrame):
         # Click handlers
         for widget in [frame, info, name_label, detail_label, badge, type_badge, session_ind]:
             widget.bind("<Button-1>", lambda e, a=alias: self._select(a))
+            widget.bind("<Double-Button-1>", lambda e, a=alias: self._on_double_click(a))
             widget.bind("<Button-3>", lambda e, a=alias: self._show_context_menu(e, a))
 
         self._server_frames[alias] = frame
@@ -371,6 +374,11 @@ class Sidebar(ctk.CTkFrame):
         if self.on_select:
             self.on_select(alias)
 
+    def _on_double_click(self, alias: str):
+        self._select(alias)
+        if self.on_double_click:
+            self.on_double_click(alias)
+
     def _highlight_selected(self):
         for alias, frame in self._server_frames.items():
             if alias == self._selected_alias:
@@ -460,6 +468,18 @@ class Sidebar(ctk.CTkFrame):
         if actions:
             menu.add_separator()
 
+        # Dynamic disconnect if session is active
+        if self.session_pool and self.session_pool.has_active_session(alias):
+            dc_icon = icon(CTX_ICONS.get("ctx_disconnect", ""))
+            dc_label = f"{dc_icon} {t('ctx_disconnect')}" if dc_icon else t("ctx_disconnect")
+            menu.add_command(
+                label=dc_label,
+                command=lambda a=alias: (
+                    self.disconnect_callback(a) if self.disconnect_callback else None
+                ),
+            )
+            menu.add_separator()
+
         # "Move to Group" submenu
         groups = self.store.get_groups()
         if groups:
@@ -307,13 +307,21 @@ class FilesTab(ctk.CTkFrame):
             stored_path, stored_sudo = self.session_pool.get_sftp_state(alias)
             if stored_path != "/":
                 self._remote_path = stored_path
             # The stored sudo mode will be applied when the connection is established
-            self._connect_sftp()
+            self._remote_status.configure(text=t("sftp_click_to_connect"))
         else:
             self._remote_list.populate([])
             self._remote_status.configure(text=t("connect_to_browse"))
             self._set_remote_buttons_state("disabled")
 
+    def connect(self):
+        """Explicitly connect SFTP (double-click or context menu)."""
+        if self._current_alias and not self._sftp:
+            self._connect_sftp()
+
+    def disconnect(self):
+        """Disconnect SFTP and update UI (called by app)."""
+        self._disconnect_sftp()
+
     # ── SFTP connection ──
 
     def _connect_sftp(self):
@@ -1,5 +1,5 @@
 """
-Grafana tab — dashboards browser and alerts overview.
+Grafana tab — dashboards browser, active alerts, and datasources overview.
 """
 
 import threading
@@ -9,7 +9,7 @@ from tkinter import ttk
 import customtkinter as ctk
 from core.grafana_client import GrafanaClient
 from core.i18n import t
-from core.icons import icon_text, make_icon_button, reconfigure_icon_button
+from core.icons import make_icon_button, reconfigure_icon_button
 from gui.tabs.query_tab import apply_dark_scrollbar_style
 
@@ -25,18 +25,23 @@ class GrafanaTab(ctk.CTkFrame):
 
     def _build_ui(self):
         apply_dark_scrollbar_style()
-        # ── Header + Refresh ──
+        # ── Header + buttons ──
         header_frame = ctk.CTkFrame(self, fg_color="transparent")
         header_frame.pack(fill="x", padx=15, pady=(15, 5))
 
-        title = ctk.CTkLabel(header_frame, text=t("grafana_title"),
+        title = ctk.CTkLabel(header_frame, text="Grafana",
                              font=ctk.CTkFont(size=18, weight="bold"))
         title.pack(side="left")
 
-        self._refresh_btn = make_icon_button(header_frame, "refresh", t("grafana_refresh"), width=110,
+        self._refresh_btn = make_icon_button(header_frame, "refresh", t("grafana_refresh"), width=100,
                                              command=self._refresh)
         self._refresh_btn.pack(side="right")
 
+        self._open_btn = make_icon_button(header_frame, "browser", t("grafana_open_browser"), width=130,
+                                          fg_color="#6b7280", hover_color="#4b5563",
+                                          command=self._open_grafana)
+        self._open_btn.pack(side="right", padx=(0, 5))
+
         # ── Dashboards section ──
         dash_label = ctk.CTkLabel(self, text=t("grafana_dashboards"),
                                   font=ctk.CTkFont(size=14, weight="bold"), anchor="w")
@@ -47,7 +52,7 @@ class GrafanaTab(ctk.CTkFrame):
 
         columns = ("uid", "title", "folder")
         self._dash_tree = ttk.Treeview(dash_frame, columns=columns, show="headings",
-                                       selectmode="browse", height=8)
+                                       selectmode="browse", height=6)
         self._dash_tree.heading("uid", text="UID")
         self._dash_tree.heading("title", text=t("grafana_dash_title"))
         self._dash_tree.heading("folder", text=t("grafana_dash_folder"))
@@ -59,7 +64,6 @@ class GrafanaTab(ctk.CTkFrame):
         dash_scroll = ttk.Scrollbar(dash_frame, orient="vertical", command=self._dash_tree.yview, style="Dark.Vertical.TScrollbar")
         dash_scroll.pack(side="right", fill="y")
         self._dash_tree.configure(yscrollcommand=dash_scroll.set)
-
         self._dash_tree.bind("<Double-1>", self._on_dashboard_click)
 
         # ── Alerts section ──
@@ -72,7 +76,7 @@ class GrafanaTab(ctk.CTkFrame):
 
         alert_columns = ("state", "name", "severity")
         self._alerts_tree = ttk.Treeview(alerts_frame, columns=alert_columns, show="headings",
-                                         selectmode="browse", height=6)
+                                         selectmode="browse", height=5)
         self._alerts_tree.heading("state", text=t("grafana_alert_state"))
         self._alerts_tree.heading("name", text=t("grafana_alert_name"))
         self._alerts_tree.heading("severity", text=t("grafana_alert_severity"))
@@ -85,6 +89,31 @@ class GrafanaTab(ctk.CTkFrame):
         alerts_scroll.pack(side="right", fill="y")
         self._alerts_tree.configure(yscrollcommand=alerts_scroll.set)
 
+        # ── Datasources section ──
+        ds_label = ctk.CTkLabel(self, text=t("grafana_datasources"),
+                                font=ctk.CTkFont(size=14, weight="bold"), anchor="w")
+        ds_label.pack(fill="x", padx=15, pady=(10, 3))
+
+        ds_frame = ctk.CTkFrame(self, fg_color="transparent")
+        ds_frame.pack(fill="both", expand=True, padx=15, pady=(0, 5))
+
+        ds_columns = ("name", "type", "url", "default")
+        self._ds_tree = ttk.Treeview(ds_frame, columns=ds_columns, show="headings",
+                                     selectmode="browse", height=4)
+        self._ds_tree.heading("name", text=t("grafana_ds_name"))
+        self._ds_tree.heading("type", text=t("grafana_ds_type"))
+        self._ds_tree.heading("url", text="URL")
+        self._ds_tree.heading("default", text=t("grafana_ds_default"))
+        self._ds_tree.column("name", width=150, minwidth=100)
+        self._ds_tree.column("type", width=120, minwidth=80)
+        self._ds_tree.column("url", width=250, minwidth=120)
+        self._ds_tree.column("default", width=60, minwidth=40)
+        self._ds_tree.pack(side="left", fill="both", expand=True)
+
+        ds_scroll = ttk.Scrollbar(ds_frame, orient="vertical", command=self._ds_tree.yview, style="Dark.Vertical.TScrollbar")
+        ds_scroll.pack(side="right", fill="y")
+        self._ds_tree.configure(yscrollcommand=ds_scroll.set)
+
         # ── Status bar ──
         self._status_bar = ctk.CTkLabel(self, text=t("grafana_no_server"), anchor="w",
                                         font=ctk.CTkFont(size=11), text_color="#9ca3af")
@@ -93,7 +122,6 @@ class GrafanaTab(ctk.CTkFrame):
     # ── Public API ──
 
     def set_server(self, alias: str | None):
         """Called when user selects a server in sidebar."""
         self._current_alias = alias
         self._client = None
         self._dashboards.clear()
@@ -109,24 +137,26 @@ class GrafanaTab(ctk.CTkFrame):
 
     def _refresh(self):
         if not self._current_alias:
-            self._set_status(t("no_server_selected"), "#ef4444")
+            self._set_status(t("grafana_no_server"), "#ef4444")
             return
 
-        self._refresh_btn.configure(state="disabled", text=t("grafana_loading"))
+        self._refresh_btn.configure(state="disabled")
         self._set_status(t("grafana_loading"), "#ccaa00")
 
         def _do():
             try:
                 client = self._get_client()
 
                 dashboards = client.list_dashboards()
-                alerts = client.list_alerts()
+                alerts = client.get_active_alerts()
+                datasources = client.list_datasources()
 
                 self.after(0, lambda: self._populate_dashboards(dashboards))
                 self.after(0, lambda: self._populate_alerts(alerts))
+                self.after(0, lambda: self._populate_datasources(datasources))
                 self.after(0, lambda: self._set_status(
                     t("grafana_loaded").format(
-                        dashboards=len(dashboards), alerts=len(alerts)
+                        dashboards=len(dashboards), alerts=len(alerts),
+                        datasources=len(datasources)
                     ), "#22c55e"))
             except Exception as e:
                 self.after(0, lambda: self._set_status(f"(error) {e}", "#ef4444"))
@@ -140,7 +170,10 @@ class GrafanaTab(ctk.CTkFrame):
 
     def _get_client(self) -> GrafanaClient:
         if self._client is None:
-            self._client = GrafanaClient(self._current_alias, self.store)
+            server = self.store.get_server(self._current_alias)
+            if not server:
+                raise ValueError(f"Server '{self._current_alias}' not found")
+            self._client = GrafanaClient(server)
         return self._client
 
     # ── Table population ──
@@ -157,28 +190,37 @@ class GrafanaTab(ctk.CTkFrame):
     def _populate_alerts(self, alerts: list[dict]):
         self._alerts_tree.delete(*self._alerts_tree.get_children())
         for a in alerts:
-            state = a.get("state", a.get("status", "unknown"))
-            name = a.get("name", a.get("title", ""))
-            severity = a.get("severity", a.get("labels", {}).get("severity", "—"))
+            status = a.get("status", {})
+            state = status.get("state", "unknown") if isinstance(status, dict) else str(status)
+            labels = a.get("labels", {})
+            name = labels.get("alertname", a.get("name", ""))
+            severity = labels.get("severity", "---")
             tag = ""
-            if state in ("alerting", "firing"):
+            if state in ("active", "firing", "alerting"):
                 tag = "alerting"
-            elif state in ("ok", "normal", "inactive"):
+            elif state in ("suppressed", "resolved", "inactive"):
                 tag = "ok"
             self._alerts_tree.insert("", "end", values=(state, name, severity), tags=(tag,))
 
         # Color-code alert states
         self._alerts_tree.tag_configure("alerting", foreground="#ef4444")
         self._alerts_tree.tag_configure("ok", foreground="#22c55e")
 
+    def _populate_datasources(self, datasources: list[dict]):
+        self._ds_tree.delete(*self._ds_tree.get_children())
+        for ds in datasources:
+            name = ds.get("name", "")
+            ds_type = ds.get("type", "")
+            url = ds.get("url", "")
+            is_default = "Yes" if ds.get("isDefault", False) else ""
+            self._ds_tree.insert("", "end", values=(name, ds_type, url, is_default))
+
     def _clear_tables(self):
         self._dash_tree.delete(*self._dash_tree.get_children())
         self._alerts_tree.delete(*self._alerts_tree.get_children())
+        self._ds_tree.delete(*self._ds_tree.get_children())
 
     # ── Events ──
 
     def _on_dashboard_click(self, _event):
         """Open dashboard URL in browser on double-click."""
         selection = self._dash_tree.selection()
         if not selection:
             return
@@ -186,21 +228,26 @@ class GrafanaTab(ctk.CTkFrame):
         uid = item["values"][0] if item["values"] else None
         if not uid:
             return
 
         # Find the dashboard data to get the URL
         for d in self._dashboards:
             if d.get("uid") == uid:
                 url = d.get("url", "")
                 if url:
                     try:
                         client = self._get_client()
-                        full_url = client.get_dashboard_url(url)
-                        webbrowser.open(full_url)
+                        webbrowser.open(f"{client.base_url}{url}")
                     except Exception:
                         # Fallback: just open relative URL
                         webbrowser.open(url)
                 break
 
+    def _open_grafana(self):
+        if not self._current_alias:
+            return
+        try:
+            client = self._get_client()
+            webbrowser.open(client.base_url)
+        except Exception:
+            pass
+
     # ── Helpers ──
 
     def _set_status(self, text: str, color: str = "#9ca3af"):
 
@@ -97,7 +97,18 @@ class PowershellTab(ctk.CTkFrame):
             self._set_status(t("ps_disconnected"), "#888888")
             return
 
-        self._connect(alias)
+        self._set_status(t("term_click_to_connect").format(alias=alias), "#f59e0b")
+
+    def connect(self):
+        """Explicitly connect WinRM (double-click or context menu)."""
+        if self._current_alias and not self._client:
+            self._connect(self._current_alias)
+
+    def disconnect(self):
+        """Disconnect WinRM and update UI (called by app)."""
+        self._disconnect()
+        if self._current_alias:
+            self._set_status(t("term_click_to_connect").format(alias=self._current_alias), "#f59e0b")
 
     # ── Connection ───────────────────────────────────────────────────
 
@@ -1,5 +1,5 @@
 """
-Prometheus tab — PromQL query executor, targets overview, and alerts.
+Prometheus tab — PromQL query executor, targets overview, alerts, and rules.
 """
 
 import threading
@@ -8,7 +8,7 @@ from tkinter import ttk
 import customtkinter as ctk
 from core.prometheus_client import PrometheusClient
 from core.i18n import t
-from core.icons import icon_text, make_icon_button, reconfigure_icon_button
+from core.icons import make_icon_button, reconfigure_icon_button
 from gui.tabs.query_tab import apply_dark_scrollbar_style
 
@@ -41,19 +41,35 @@ class PrometheusTab(ctk.CTkFrame):
                                           command=self._execute_query)
         self._exec_btn.pack(side="left")
 
+        # ── Quick query buttons ──
+        quick_frame = ctk.CTkFrame(self, fg_color="transparent")
+        quick_frame.pack(fill="x", padx=15, pady=(0, 5))
+
+        for label, query in [("up", "up"), ("CPU", "process_cpu_seconds_total"),
+                             ("Goroutines", "go_goroutines")]:
+            btn = make_icon_button(quick_frame, "metrics", label, width=80,
+                                   fg_color="#6b7280", hover_color="#4b5563",
+                                   command=lambda q=query: self._run_quick(q))
+            btn.pack(side="left", padx=(0, 5))
+
+        self._metrics_btn = make_icon_button(quick_frame, "search", t("prom_metrics_browser"), width=100,
+                                             fg_color="#6b7280", hover_color="#4b5563",
+                                             command=self._open_metrics_browser)
+        self._metrics_btn.pack(side="left", padx=(0, 5))
+
         # ── Query results ──
         results_label = ctk.CTkLabel(self, text=t("prom_results"),
                                      font=ctk.CTkFont(size=12, weight="bold"), anchor="w")
-        results_label.pack(fill="x", padx=15, pady=(10, 3))
+        results_label.pack(fill="x", padx=15, pady=(5, 3))
 
-        self._results_box = ctk.CTkTextbox(self, height=150,
+        self._results_box = ctk.CTkTextbox(self, height=120,
                                            font=ctk.CTkFont(family="Consolas", size=12),
                                            state="disabled")
         self._results_box.pack(fill="x", padx=15, pady=(0, 5))
 
         # ── Targets section ──
         targets_header = ctk.CTkFrame(self, fg_color="transparent")
-        targets_header.pack(fill="x", padx=15, pady=(10, 3))
+        targets_header.pack(fill="x", padx=15, pady=(5, 3))
 
         targets_label = ctk.CTkLabel(targets_header, text=t("prom_targets"),
                                      font=ctk.CTkFont(size=14, weight="bold"), anchor="w")
@@ -68,7 +84,7 @@ class PrometheusTab(ctk.CTkFrame):
 
         target_columns = ("job", "instance", "health", "last_scrape")
         self._targets_tree = ttk.Treeview(targets_frame, columns=target_columns, show="headings",
-                                          selectmode="browse", height=6)
+                                          selectmode="browse", height=5)
         self._targets_tree.heading("job", text=t("prom_target_job"))
         self._targets_tree.heading("instance", text=t("prom_target_instance"))
         self._targets_tree.heading("health", text=t("prom_target_health"))
@@ -88,13 +104,39 @@ class PrometheusTab(ctk.CTkFrame):
         # ── Alerts section ──
         alerts_label = ctk.CTkLabel(self, text=t("prom_alerts"),
                                     font=ctk.CTkFont(size=14, weight="bold"), anchor="w")
-        alerts_label.pack(fill="x", padx=15, pady=(10, 3))
+        alerts_label.pack(fill="x", padx=15, pady=(5, 3))
 
-        self._alerts_box = ctk.CTkTextbox(self, height=100,
+        self._alerts_box = ctk.CTkTextbox(self, height=80,
                                           font=ctk.CTkFont(family="Consolas", size=12),
                                           state="disabled")
         self._alerts_box.pack(fill="x", padx=15, pady=(0, 5))
 
+        # ── Rules section ──
+        rules_label = ctk.CTkLabel(self, text=t("prom_rules"),
+                                   font=ctk.CTkFont(size=14, weight="bold"), anchor="w")
+        rules_label.pack(fill="x", padx=15, pady=(5, 3))
+
+        rules_frame = ctk.CTkFrame(self, fg_color="transparent")
+        rules_frame.pack(fill="both", expand=True, padx=15, pady=(0, 5))
+
+        rules_columns = ("type", "name", "group", "health")
+        self._rules_tree = ttk.Treeview(rules_frame, columns=rules_columns, show="headings",
+                                        selectmode="browse", height=5)
+        self._rules_tree.heading("type", text=t("prom_rule_type"))
+        self._rules_tree.heading("name", text=t("prom_rule_name"))
+        self._rules_tree.heading("group", text=t("prom_rule_group"))
+        self._rules_tree.heading("health", text=t("prom_rule_health"))
+        self._rules_tree.column("type", width=80, minwidth=60)
+        self._rules_tree.column("name", width=250, minwidth=120)
+        self._rules_tree.column("group", width=150, minwidth=80)
+        self._rules_tree.column("health", width=80, minwidth=60)
+        self._rules_tree.pack(side="left", fill="both", expand=True)
+
+        rules_scroll = ttk.Scrollbar(rules_frame, orient="vertical", command=self._rules_tree.yview,
+                                     style="Dark.Vertical.TScrollbar")
+        rules_scroll.pack(side="right", fill="y")
+        self._rules_tree.configure(yscrollcommand=rules_scroll.set)
+
         # ── Status bar ──
         self._status_bar = ctk.CTkLabel(self, text=t("prom_no_server"), anchor="w",
                                         font=ctk.CTkFont(size=11), text_color="#9ca3af")
@@ -103,7 +145,6 @@ class PrometheusTab(ctk.CTkFrame):
     # ── Public API ──
 
     def set_server(self, alias: str | None):
         """Called when user selects a server in sidebar."""
         self._current_alias = alias
         self._client = None
         self._clear_all()
@@ -114,6 +155,13 @@ class PrometheusTab(ctk.CTkFrame):
         else:
             self._set_status(t("prom_no_server"), "#9ca3af")
 
+    # ── Quick query ──
+
+    def _run_quick(self, query: str):
+        self._query_entry.delete(0, "end")
+        self._query_entry.insert(0, query)
+        self._execute_query()
+
     # ── PromQL execution ──
 
     def _execute_query(self):
@@ -121,7 +169,7 @@ class PrometheusTab(ctk.CTkFrame):
         if not query:
             return
         if not self._current_alias:
-            self._set_results(t("no_server_selected"))
+            self._set_results(t("prom_no_server"))
             return
 
         self._exec_btn.configure(state="disabled")
@@ -141,11 +189,9 @@ class PrometheusTab(ctk.CTkFrame):
         threading.Thread(target=_do, daemon=True).start()
 
     def _format_query_result(self, result: dict) -> str:
         """Format Prometheus query API response for display."""
         status = result.get("status", "unknown")
         if status != "success":
-            error = result.get("error", "Unknown error")
-            return f"Error: {error}"
+            return f"Error: {result.get('error', 'Unknown error')}"
 
         data = result.get("data", {})
         result_type = data.get("resultType", "")
@@ -166,7 +212,7 @@ class PrometheusTab(ctk.CTkFrame):
             elif result_type == "matrix":
                 values = item.get("values", [])
                 lines.append(f"{{{metric_str}}}")
-                for ts, val in values[-10:]:  # Show last 10 points
+                for ts, val in values[-10:]:
                     lines.append(f"  @{ts} => {val}")
                 if len(values) > 10:
                     lines.append(f"  ... ({len(values)} total points)")
@@ -175,28 +221,105 @@ class PrometheusTab(ctk.CTkFrame):
 
         return "\n".join(lines)
 
-    # ── Refresh targets & alerts ──
+    # ── Metrics browser ──
+
+    def _open_metrics_browser(self):
+        if not self._current_alias:
+            self._set_status(t("prom_no_server"), "#ef4444")
+            return
+
+        self._metrics_btn.configure(state="disabled")
+
+        def _do():
+            try:
+                client = self._get_client()
+                resp = client._get("/api/v1/label/__name__/values")
+                metrics = resp.get("data", [])
+                self.after(0, lambda: self._show_metrics_popup(metrics))
+            except Exception as e:
+                self.after(0, lambda: self._set_status(f"(error) {e}", "#ef4444"))
+            finally:
+                self.after(0, lambda: self._metrics_btn.configure(state="normal"))
+
+        threading.Thread(target=_do, daemon=True).start()
+
+    def _show_metrics_popup(self, metrics: list[str]):
+        popup = ctk.CTkToplevel(self)
+        popup.title(t("prom_metrics_browser"))
+        popup.geometry("450x500")
+        popup.transient(self.winfo_toplevel())
+
+        filter_entry = ctk.CTkEntry(popup, placeholder_text=t("prom_filter_metrics"))
+        filter_entry.pack(fill="x", padx=10, pady=(10, 5))
+
+        listbox_frame = ctk.CTkFrame(popup, fg_color="transparent")
+        listbox_frame.pack(fill="both", expand=True, padx=10, pady=(0, 10))
+
+        tree = ttk.Treeview(listbox_frame, columns=("metric",), show="headings",
+                            selectmode="browse")
+        tree.heading("metric", text="Metric Name")
+        tree.column("metric", width=400)
+        tree.pack(side="left", fill="both", expand=True)
+
+        scroll = ttk.Scrollbar(listbox_frame, orient="vertical", command=tree.yview,
+                               style="Dark.Vertical.TScrollbar")
+        scroll.pack(side="right", fill="y")
+        tree.configure(yscrollcommand=scroll.set)
+
+        all_metrics = sorted(metrics)
+
+        def populate(filter_text=""):
+            tree.delete(*tree.get_children())
+            for m in all_metrics:
+                if filter_text.lower() in m.lower():
+                    tree.insert("", "end", values=(m,))
+
+        def on_filter(*_):
+            populate(filter_entry.get())
+
+        filter_entry.bind("<KeyRelease>", on_filter)
+
+        def on_select(event):
+            sel = tree.selection()
+            if sel:
+                metric = tree.item(sel[0])["values"][0]
+                self._query_entry.delete(0, "end")
+                self._query_entry.insert(0, metric)
+                popup.destroy()
+
+        tree.bind("<Double-1>", on_select)
+        populate()
+        filter_entry.focus_set()
+
+    # ── Refresh targets, alerts & rules ──
 
     def _refresh_all(self):
         if not self._current_alias:
-            self._set_status(t("no_server_selected"), "#ef4444")
+            self._set_status(t("prom_no_server"), "#ef4444")
             return
 
-        self._refresh_btn.configure(state="disabled", text=t("prom_loading"))
+        self._refresh_btn.configure(state="disabled")
         self._set_status(t("prom_loading"), "#ccaa00")
 
         def _do():
             try:
                 client = self._get_client()
 
-                targets = client.get_targets()
-                alerts = client.get_alerts()
+                targets_resp = client.targets()
+                targets = targets_resp.get("data", {}).get("activeTargets", [])
+                alerts_resp = client.alerts()
+                alerts = alerts_resp.get("data", {}).get("alerts", [])
+                rules_resp = client.rules()
+                rule_groups = rules_resp.get("data", {}).get("groups", [])
 
                 self.after(0, lambda: self._populate_targets(targets))
                 self.after(0, lambda: self._populate_alerts(alerts))
+                self.after(0, lambda: self._populate_rules(rule_groups))
 
+                rule_count = sum(len(g.get("rules", [])) for g in rule_groups)
                 self.after(0, lambda: self._set_status(
                     t("prom_loaded").format(
-                        targets=len(targets), alerts=len(alerts)
+                        targets=len(targets), alerts=len(alerts), rules=rule_count
                     ), "#22c55e"))
             except Exception as e:
                 self.after(0, lambda: self._set_status(f"(error) {e}", "#ef4444"))
@@ -210,7 +333,10 @@ class PrometheusTab(ctk.CTkFrame):

    def _get_client(self) -> PrometheusClient:
        if self._client is None:
            self._client = PrometheusClient(self._current_alias, self.store)
            server = self.store.get_server(self._current_alias)
            if not server:
                raise ValueError(f"Server '{self._current_alias}' not found")
            self._client = PrometheusClient(server)
        return self._client

    # ── Table population ──
@@ -218,28 +344,19 @@ class PrometheusTab(ctk.CTkFrame):
    def _populate_targets(self, targets: list[dict]):
        self._targets_tree.delete(*self._targets_tree.get_children())
        for target in targets:
            job = target.get("labels", {}).get("job", "—")
            instance = target.get("labels", {}).get("instance", "—")
            job = target.get("labels", {}).get("job", "---")
            instance = target.get("labels", {}).get("instance", "---")
            health = target.get("health", "unknown")
            last_scrape = target.get("lastScrape", "—")

            tag = ""
            if health == "up":
                tag = "up"
            elif health == "down":
                tag = "down"

            last_scrape = target.get("lastScrape", "---")
            tag = "up" if health == "up" else ("down" if health == "down" else "")
            self._targets_tree.insert("", "end",
                                      values=(job, instance, health, last_scrape),
                                      tags=(tag,))

                                      values=(job, instance, health, last_scrape), tags=(tag,))
        self._targets_tree.tag_configure("up", foreground="#22c55e")
        self._targets_tree.tag_configure("down", foreground="#ef4444")

    def _populate_alerts(self, alerts: list[dict]):
        self._alerts_box.configure(state="normal")
        self._alerts_box.delete("1.0", "end")

        if not alerts:
            self._alerts_box.insert("1.0", t("prom_no_alerts"))
        else:
@@ -247,12 +364,25 @@ class PrometheusTab(ctk.CTkFrame):
            for a in alerts:
                name = a.get("labels", {}).get("alertname", a.get("name", "unknown"))
                state = a.get("state", "unknown")
                severity = a.get("labels", {}).get("severity", "—")
                severity = a.get("labels", {}).get("severity", "---")
                lines.append(f"[{state.upper()}] {name} (severity: {severity})")
            self._alerts_box.insert("1.0", "\n".join(lines))

        self._alerts_box.configure(state="disabled")

    def _populate_rules(self, groups: list[dict]):
        self._rules_tree.delete(*self._rules_tree.get_children())
        for group in groups:
            group_name = group.get("name", "")
            for rule in group.get("rules", []):
                rtype = rule.get("type", "")
                name = rule.get("name", "")
                health = rule.get("health", "")
                tag = "up" if health == "ok" else ("down" if health == "err" else "")
                self._rules_tree.insert("", "end",
                                        values=(rtype, name, group_name, health), tags=(tag,))
        self._rules_tree.tag_configure("up", foreground="#22c55e")
        self._rules_tree.tag_configure("down", foreground="#ef4444")

    # ── Helpers ──

    def _set_results(self, text: str):
@@ -263,6 +393,7 @@ class PrometheusTab(ctk.CTkFrame):

    def _clear_all(self):
        self._targets_tree.delete(*self._targets_tree.get_children())
        self._rules_tree.delete(*self._rules_tree.get_children())
        self._set_results("")
        self._alerts_box.configure(state="normal")
        self._alerts_box.delete("1.0", "end")
@@ -153,7 +153,24 @@ class S3Tab(ctk.CTkFrame):
            bucket_frame, variable=self._bucket_var, values=[""],
            width=200, command=self._on_bucket_change,
        )
        self._bucket_menu.pack(side="left", padx=(0, 15))
        self._bucket_menu.pack(side="left", padx=(0, 5))

        # Create bucket [+]
        self._create_bucket_btn = ctk.CTkButton(
            bucket_frame, text="+", width=28, height=28,
            corner_radius=6, font=ctk.CTkFont(size=14, weight="bold"),
            command=self._create_bucket,
        )
        self._create_bucket_btn.pack(side="left", padx=(0, 3))

        # Delete bucket [🗑]
        self._delete_bucket_btn = ctk.CTkButton(
            bucket_frame, text="\U0001f5d1", width=28, height=28,
            corner_radius=6, fg_color="#dc2626", hover_color="#b91c1c",
            font=ctk.CTkFont(size=13),
            command=self._delete_bucket,
        )
        self._delete_bucket_btn.pack(side="left", padx=(0, 15))

        # Path display
        self._path_label = ctk.CTkLabel(
@@ -626,6 +643,64 @@ class S3Tab(ctk.CTkFrame):

        threading.Thread(target=_do, daemon=True).start()

    def _create_bucket(self):
        """Prompt for bucket name and create it."""
        if not self._client:
            return
        dialog = ctk.CTkInputDialog(
            text=t("s3_bucket_name_prompt"),
            title=t("s3_create_bucket"),
        )
        name = dialog.get_input()
        if not name or not name.strip():
            return
        name = name.strip()
        self._status_label.configure(text="...")

        def _do():
            ok = self._client.create_bucket(name)
            self.after(0, lambda: self._on_bucket_created(ok, name))

        threading.Thread(target=_do, daemon=True).start()

    def _on_bucket_created(self, ok: bool, name: str):
        if ok:
            self._status_label.configure(
                text=t("s3_bucket_created").format(name=name))
            self._current_bucket = name
            self._load_buckets()
        else:
            self._status_label.configure(text=t("s3_folder_failed"))

    def _delete_bucket(self):
        """Delete the currently selected bucket (must be empty)."""
        if not self._client or not self._current_bucket:
            return
        from tkinter import messagebox
        ok = messagebox.askyesno(
            t("s3_delete_bucket"),
            t("s3_delete_bucket_confirm").format(name=self._current_bucket),
        )
        if not ok:
            return
        bucket_name = self._current_bucket
        self._status_label.configure(text="...")

        def _do():
            ok = self._client.delete_bucket(bucket_name)
            self.after(0, lambda: self._on_bucket_deleted(ok, bucket_name))

        threading.Thread(target=_do, daemon=True).start()

    def _on_bucket_deleted(self, ok: bool, name: str):
        if ok:
            self._status_label.configure(
                text=t("s3_bucket_deleted").format(name=name))
            self._current_bucket = ""
            self._load_buckets()
        else:
            self._status_label.configure(text=t("s3_delete_failed"))

    def _go_back(self):
        if self._nav_stack:
            self._current_prefix = self._nav_stack.pop()
@@ -1,5 +1,5 @@
"""
Setup tab — one-click installation for Claude Code integration.
Setup tab — one-click installation for local AI agent integration.
Includes configuration path management and backup/restore.
"""

@@ -8,7 +8,15 @@ import threading
from datetime import datetime
from tkinter import filedialog, messagebox
import customtkinter as ctk
from core.claude_setup import check_status, install_all, install_ssh_script, install_skill, generate_ssh_key
from core.claude_setup import (
    check_status,
    generate_ssh_key,
    install_all,
    install_claude_skill,
    install_codex_skill,
    install_gemini_skill,
    install_ssh_script,
)
from core.i18n import t
from core.icons import icon_text, make_icon_button
from core.logger import log
@@ -25,13 +33,13 @@ class SetupTab(ctk.CTkFrame):

        # Header
        self.header_label = ctk.CTkLabel(
            self._scroll, text=t("claude_integration"),
            self._scroll, text=t("agent_integration"),
            font=ctk.CTkFont(size=20, weight="bold")
        )
        self.header_label.pack(padx=20, pady=(20, 5))

        self.desc_label = ctk.CTkLabel(
            self._scroll, text=t("claude_desc"),
            self._scroll, text=t("agent_desc"),
            text_color="#9ca3af", justify="center"
        )
        self.desc_label.pack(padx=20, pady=(0, 15))
@@ -53,7 +61,11 @@ class SetupTab(ctk.CTkFrame):
            ("servers_json", "status_servers_json"),
            ("ssh_script", "status_ssh_script"),
            ("encryption", "status_encryption"),
            ("skill_installed", "status_skill"),
            ("claude_skill_installed", "status_claude_skill"),
            ("codex_skill_installed", "status_codex_skill"),
            ("codex_wrapper_installed", "status_codex_wrapper"),
            ("gemini_skill_installed", "status_gemini_skill"),
            ("gemini_wrapper_installed", "status_gemini_wrapper"),
            ("ssh_key_exists", "status_ssh_key"),
        ]
        for key, i18n_key in status_items:
@@ -82,17 +94,46 @@ class SetupTab(ctk.CTkFrame):
        ind_frame = ctk.CTkFrame(btn_frame, fg_color="transparent")
        ind_frame.pack(fill="x")

        self.ssh_py_btn = make_icon_button(ind_frame, "confirm", t("install_ssh_py"), width=110, fg_color="#6b7280",
                                           command=self._install_script)
        top_btn_row = ctk.CTkFrame(ind_frame, fg_color="transparent")
        top_btn_row.pack(fill="x", pady=(0, 5))

        self.ssh_py_btn = make_icon_button(
            top_btn_row, "confirm", t("install_ssh_py"), width=120, fg_color="#6b7280",
            command=self._install_script
        )
        self.ssh_py_btn.pack(side="left", padx=(0, 5))
        self.skill_btn = make_icon_button(ind_frame, "confirm", t("install_skill"), width=110, fg_color="#6b7280",
                                          command=self._install_skill)
        self.skill_btn.pack(side="left", padx=5)
        self.ssh_key_btn = make_icon_button(ind_frame, "confirm", t("install_ssh_key"), width=110, fg_color="#6b7280",
                                            command=self._gen_key)
        self.ssh_key_btn.pack(side="left", padx=5)
        self.refresh_btn = make_icon_button(ind_frame, "refresh", t("refresh"), width=90, fg_color="#3b82f6",
                                            command=self._refresh_status)

        self.claude_skill_btn = make_icon_button(
            top_btn_row, "confirm", t("install_claude_skill"), width=130, fg_color="#6b7280",
            command=self._install_claude_skill
        )
        self.claude_skill_btn.pack(side="left", padx=5)

        self.codex_skill_btn = make_icon_button(
            top_btn_row, "confirm", t("install_codex_skill"), width=130, fg_color="#6b7280",
            command=self._install_codex_skill
        )
        self.codex_skill_btn.pack(side="left", padx=5)

        self.gemini_skill_btn = make_icon_button(
            top_btn_row, "confirm", t("install_gemini_skill"), width=130, fg_color="#6b7280",
            command=self._install_gemini_skill
        )
        self.gemini_skill_btn.pack(side="left", padx=5)

        bottom_btn_row = ctk.CTkFrame(ind_frame, fg_color="transparent")
        bottom_btn_row.pack(fill="x")

        self.ssh_key_btn = make_icon_button(
            bottom_btn_row, "confirm", t("install_ssh_key"), width=120, fg_color="#6b7280",
            command=self._gen_key
        )
        self.ssh_key_btn.pack(side="left", padx=(0, 5))

        self.refresh_btn = make_icon_button(
            bottom_btn_row, "refresh", t("refresh"), width=90, fg_color="#3b82f6",
            command=self._refresh_status
        )
        self.refresh_btn.pack(side="right")

        # ── Monitoring section ─────────────────────────
@@ -328,8 +369,23 @@ class SetupTab(ctk.CTkFrame):
        self._log(msg)
        self._refresh_status()

    def _install_claude_skill(self):
        msg = install_claude_skill()
        self._log(msg)
        self._refresh_status()

    def _install_codex_skill(self):
        msg = install_codex_skill()
        self._log(msg)
        self._refresh_status()

    def _install_gemini_skill(self):
        msg = install_gemini_skill()
        self._log(msg)
        self._refresh_status()

    def _install_skill(self):
        msg = install_skill()
        msg = install_claude_skill()
        self._log(msg)
        self._refresh_status()
@@ -29,6 +29,18 @@ class TerminalTab(ctk.CTkFrame):
        # Import here to avoid circular issues
        from gui.widgets.terminal_widget import TerminalWidget

        self._toolbar = ctk.CTkFrame(self, fg_color="transparent", height=32)
        self._toolbar.pack(fill="x", padx=5, pady=(5, 0))
        self._toolbar.pack_propagate(False)
        self._conn_btn = ctk.CTkButton(
            self._toolbar, text=t("ctx_connect"), width=120, height=28,
            fg_color="#6b7280", hover_color="#4b5563",
            font=ctk.CTkFont(size=12), state="disabled",
            command=self._on_conn_btn_click,
        )
        self._conn_btn.pack(side="right", padx=2)
        self._connected = False

        self._terminal = TerminalWidget(
            self,
            send_callback=self._send_to_shell,
@@ -37,6 +49,15 @@ class TerminalTab(ctk.CTkFrame):
            on_font_size_changed=self._on_font_size_changed,
        )
        self._terminal.pack(fill="both", expand=True, padx=5, pady=5)

        # Overlay "OFF" label (shown when disconnected)
        self._overlay = ctk.CTkLabel(
            self._terminal, text=t("term_off"),
            font=ctk.CTkFont(size=72, weight="bold"),
            text_color=("#cccccc", "#333333"),
            fg_color="transparent",
        )
        self._overlay.place(relx=0.5, rely=0.45, anchor="center")
        self._terminal.set_status(t("term_disconnected"), "#888888")

        # Thread-safe data queue
@@ -45,6 +66,7 @@ class TerminalTab(ctk.CTkFrame):
        # Sudo auto-password detection
        self._sudo_buffer = b""  # Buffer for detecting sudo prompts
        self._sudo_sent = False  # Prevent sending password twice for same prompt
        self._on_disconnect_callback = None

    def set_server(self, alias: str | None):
        if alias == self._current_alias:
@@ -61,11 +83,48 @@ class TerminalTab(ctk.CTkFrame):

        self._current_alias = alias
        if alias:
            self._connect()
            self._set_conn_btn_disconnected()
            self._conn_btn.configure(state="normal")
            self._terminal.set_status(t("term_click_to_connect").format(alias=alias), "#f59e0b")
        else:
            self._set_conn_btn_disconnected()
            self._conn_btn.configure(state="disabled")
            self._terminal.reset()
            self._terminal.set_status(t("term_disconnected"), "#888888")

    def connect(self):
        """Explicitly connect (double-click or context menu)."""
        if self._current_alias and not self._session:
            self._connect()

    def _on_conn_btn_click(self):
        if self._connected:
            if self._on_disconnect_callback and self._current_alias:
                self._on_disconnect_callback(self._current_alias)
        else:
            self.connect()

    def _set_conn_btn_connected(self):
        self._connected = True
        self._conn_btn.configure(
            text=t("ctx_disconnect"), fg_color="#dc2626", hover_color="#b91c1c", state="normal",
        )
        self._overlay.place_forget()

    def _set_conn_btn_disconnected(self):
        self._connected = False
        self._conn_btn.configure(
            text=t("ctx_connect"), fg_color="#6b7280", hover_color="#4b5563",
        )
        self._overlay.place(relx=0.5, rely=0.45, anchor="center")

    def disconnect(self):
        """Disconnect and update UI (called by app)."""
        self._disconnect()
        self._set_conn_btn_disconnected()
        if self._current_alias:
            self._terminal.set_status(t("term_click_to_connect").format(alias=self._current_alias), "#f59e0b")

    def _connect(self):
        if not self._current_alias:
            return
@@ -135,6 +194,7 @@ class TerminalTab(ctk.CTkFrame):
                # Only grab focus if terminal tab is currently visible
                if self._terminal.winfo_ismapped():
                    self._terminal.focus_terminal()
                self._set_conn_btn_connected()
            self.after(0, _set_session)
        except Exception as e:
            self.after(0, lambda: self._terminal.set_status(
160	plans/disable-terminal-autoconnect.md	Normal file
@@ -0,0 +1,160 @@
# Disable terminal auto-connect on single click

## Context

A single click on a server in the sidebar currently makes every tab (terminal, files, powershell) connect to that server immediately. The user just wants to switch between servers without auto-connecting; a connection should be made only on double click.

## Approach

- **Single click**: select the server and update the tabs (info, setup, keys, etc.), but do NOT connect terminal/files/powershell
- **Double click**: select the server AND connect all "connecting" tabs (terminal, files, powershell)
- **Context menu** "Open Terminal" / "Browse Files": also connects

On a double click, Tkinter generates both events: `<Button-1>` (the first click) followed by `<Double-Button-1>`. This works in our favor: the first click selects the server, and the double click connects. No debounce is needed.
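The event ordering above can be sketched without the GUI. This is a minimal model of the dispatch only (hypothetical `Sidebar` stub; the real handlers live in `gui/sidebar.py` and `gui/app.py`), showing why no debounce is needed: the double-click path simply repeats the selection before firing the connect callback.

```python
# Minimal sketch of the click dispatch described above (hypothetical names).
class Sidebar:
    def __init__(self, on_select=None, on_double_click=None):
        self.on_select = on_select
        self.on_double_click = on_double_click
        self.selected = None

    def _select(self, alias):
        # <Button-1>: selection only -- no connection side effects.
        self.selected = alias
        if self.on_select:
            self.on_select(alias)

    def _on_double_click(self, alias):
        # <Double-Button-1>: Tk has already delivered <Button-1>, so the
        # server is selected; re-selecting is harmless, then connect.
        self._select(alias)
        if self.on_double_click:
            self.on_double_click(alias)


events = []
sidebar = Sidebar(
    on_select=lambda a: events.append(f"select:{a}"),
    on_double_click=lambda a: events.append(f"connect:{a}"),
)

sidebar._select("web-01")            # single click: selection only
sidebar._select("db-01")             # double click: Tk fires <Button-1> first...
sidebar._on_double_click("db-01")    # ...then <Double-Button-1>

print(events)  # ['select:web-01', 'select:db-01', 'select:db-01', 'connect:db-01']
```

The duplicate `select:db-01` entry is the expected cost of Tk's event sequence; selection is idempotent, so it is harmless.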
## Changes (6 files)

### 1. `gui/sidebar.py`: add double-click

**Line 37**: add `on_double_click` to the constructor:
```python
def __init__(self, master, store, on_select=None, on_double_click=None, session_pool=None):
    ...
    self.on_select = on_select
    self.on_double_click = on_double_click
```

**Lines 272-275**: add a `<Double-Button-1>` binding:
```python
for widget in [frame, info, name_label, detail_label, badge, type_badge, session_ind]:
    widget.bind("<Button-1>", lambda e, a=alias: self._select(a))
    widget.bind("<Double-Button-1>", lambda e, a=alias: self._on_double_click(a))
    widget.bind("<Button-3>", lambda e, a=alias: self._show_context_menu(e, a))
```

**After `_select()`** (line 372), a new method:
```python
def _on_double_click(self, alias: str):
    self._select(alias)
    if self.on_double_click:
        self.on_double_click(alias)
```

### 2. `gui/tabs/terminal_tab.py`: remove auto-connect

**Lines 49-67**, `set_server()`: replace `self._connect()` with a status message:
```python
def set_server(self, alias: str | None):
    if alias == self._current_alias:
        return
    if self._current_alias and self._session and self.session_pool:
        buf = self._terminal.get_current_buffer()
        self.session_pool.store_shell_state(self._current_alias, buf)
    self._disconnect()
    self._current_alias = alias
    if alias:
        self._terminal.set_status(t("term_click_to_connect").format(alias=alias), "#f59e0b")
    else:
        self._terminal.reset()
        self._terminal.set_status(t("term_disconnected"), "#888888")
```

**Add a public `connect()` method** after `set_server()`:
```python
def connect(self):
    """Explicitly connect (double-click or context menu)."""
    if self._current_alias and not self._session:
        self._connect()
```

### 3. `gui/tabs/files_tab.py`: remove auto-connect

**Lines 304-311**, `set_server()`: replace `self._connect_sftp()` with a status message:
```python
if alias:
    if self.session_pool:
        stored_path, stored_sudo = self.session_pool.get_sftp_state(alias)
        if stored_path != "/":
            self._remote_path = stored_path
    self._remote_status.configure(text=t("sftp_click_to_connect"))
else:
    ...
```

**Add a public `connect()` method**:
```python
def connect(self):
    """Explicitly connect SFTP (double-click or context menu)."""
    if self._current_alias and not self._sftp:
        self._connect_sftp()
```

### 4. `gui/tabs/powershell_tab.py`: remove auto-connect

**Line 100**: replace `self._connect(alias)` with a status message:
```python
if alias is None:
    self._set_status(t("ps_disconnected"), "#888888")
    return
self._set_status(t("term_click_to_connect").format(alias=alias), "#f59e0b")
```

**Add a public `connect()` method**:
```python
def connect(self):
    """Explicitly connect WinRM (double-click or context menu)."""
    if self._current_alias and not self._client:
        self._connect(self._current_alias)
```

### 5. `gui/app.py`: wire up the double click

**Line 157**: pass `on_double_click`:
```python
self.sidebar = Sidebar(self._paned, self.store,
                       on_select=self._on_server_select,
                       on_double_click=self._on_server_connect,
                       session_pool=self.session_pool)
```

**New method `_on_server_connect()`** (after `_on_server_select`):
```python
def _on_server_connect(self, alias: str):
    """Double-click: connect interactive tabs (terminal, files, powershell)."""
    for key, widget in self._tab_instances.items():
        if hasattr(widget, "connect"):
            widget.connect()
```

**Lines 350-358**, `_context_open_tab()`: add a `connect()` call:
```python
def _context_open_tab(self, alias: str, tab_key: str):
    self._on_server_select(alias)
    self.sidebar._select(alias)
    if tab_key in self._tab_keys:
        try:
            self.tabview.set(_tab_label(tab_key))
        except Exception:
            pass
    # Connect the target tab if it supports explicit connection
    widget = self._tab_instances.get(tab_key)
    if widget and hasattr(widget, "connect"):
        widget.connect()
```

### 6. `core/i18n.py`: two translation keys

Next to `term_disconnected`:

| Key | EN | RU | ZH |
|------|----|----|-----|
| `term_click_to_connect` | `Double-click to connect to {alias}` | `Двойной клик для подключения к {alias}` | `双击连接 {alias}` |
| `sftp_click_to_connect` | `Double-click server to browse files` | `Двойной клик для просмотра файлов` | `双击服务器浏览文件` |
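The new keys are consumed through `str.format`, as in the tab code above. A minimal sketch of that lookup-then-format flow, assuming a dict-backed `t()` (the real lookup lives in `core/i18n.py` and may differ):

```python
# Hypothetical dict-backed translation table; values for the two new keys
# are taken from the table above.
TRANSLATIONS = {
    "en": {
        "term_click_to_connect": "Double-click to connect to {alias}",
        "sftp_click_to_connect": "Double-click server to browse files",
    },
    "ru": {
        "term_click_to_connect": "Двойной клик для подключения к {alias}",
    },
}

_lang = "en"

def t(key: str) -> str:
    # Fall back to English when a key is missing for the active language.
    return TRANSLATIONS.get(_lang, {}).get(key, TRANSLATIONS["en"].get(key, key))

status = t("term_click_to_connect").format(alias="web-01")
print(status)  # Double-click to connect to web-01
```

Keeping the `{alias}` placeholder in every language column matters: `format()` raises `KeyError` only for unknown named fields, so a translation that drops the placeholder silently loses the server name.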
## Verification

1. `python build.py`: build the exe
2. Run the exe; single-click an SSH server: the terminal shows "Double-click to connect", the files tab shows a similar message, and the info tab works as before
3. Double-click a server: the terminal and files tabs connect
4. Right-click, "Open Terminal": the terminal connects
5. Switch between servers with single clicks: no auto-connects, fast switching
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
BIN	releases/ServerManager-v1.9.44-win-x64.exe	Normal file
Binary file not shown.
93	test_ai_setup.py	Normal file
@@ -0,0 +1,93 @@
import os
import tempfile
import unittest
from pathlib import Path

from core import claude_setup as cs


class AISetupTests(unittest.TestCase):
    def setUp(self):
        self._old_target = os.environ.get("SERVER_MANAGER_TARGET_HOME")
        self._old_all = os.environ.get("SERVER_MANAGER_INSTALL_ALL_USERS")
        self._old_iter = cs._iter_all_user_homes
        self._old_platform = cs.sys.platform

    def tearDown(self):
        if self._old_target is None:
            os.environ.pop("SERVER_MANAGER_TARGET_HOME", None)
        else:
            os.environ["SERVER_MANAGER_TARGET_HOME"] = self._old_target

        if self._old_all is None:
            os.environ.pop("SERVER_MANAGER_INSTALL_ALL_USERS", None)
        else:
            os.environ["SERVER_MANAGER_INSTALL_ALL_USERS"] = self._old_all

        cs._iter_all_user_homes = self._old_iter
        cs.sys.platform = self._old_platform

    def test_single_target_installers_create_expected_files(self):
        with tempfile.TemporaryDirectory() as tmp:
            os.environ["SERVER_MANAGER_TARGET_HOME"] = tmp

            cs.install_ssh_script()
            cs.install_claude_skill()
            cs.install_codex_skill()
            cs.install_gemini_skill()
            cs.install_global_claude_md()
            cs.install_global_gemini_md()

            self.assertTrue(Path(tmp, ".server-connections", "ssh.py").exists())
            self.assertTrue(Path(tmp, ".server-connections", "encryption.py").exists())
            self.assertTrue(Path(tmp, ".claude", "commands", "ssh.md").exists())
            self.assertTrue(Path(tmp, ".codex", "skills", "server-manager", "SKILL.md").exists())
            self.assertTrue(Path(tmp, ".gemini", "skills", "server-manager", "SKILL.md").exists())
            self.assertTrue(Path(tmp, ".server-connections", "codex-ssh").exists())
            self.assertTrue(Path(tmp, ".server-connections", "gemini-ssh").exists())
            self.assertTrue(Path(tmp, ".claude", "CLAUDE.md").exists())
            self.assertTrue(Path(tmp, ".gemini", "GEMINI.md").exists())
            self.assertFalse(Path(tmp, ".agents", "skills", "server-manager").exists())

            status = cs.check_status()
            self.assertTrue(status["claude_skill_installed"])
            self.assertTrue(status["codex_skill_installed"])
            self.assertTrue(status["gemini_skill_installed"])
            self.assertTrue(status["codex_wrapper_installed"])
            self.assertTrue(status["gemini_wrapper_installed"])

    def test_install_all_users_mode_installs_into_each_home_and_skips_ssh_key(self):
        with tempfile.TemporaryDirectory() as base:
            home1 = Path(base, "user1")
            home2 = Path(base, "user2")
            home1.mkdir()
            home2.mkdir()

            os.environ["SERVER_MANAGER_INSTALL_ALL_USERS"] = "1"
            cs._iter_all_user_homes = lambda: [str(home1), str(home2)]

            results = cs.install_all()

            self.assertIn("INFO: SSH key generation skipped", "\n".join(results))
            self.assertTrue(Path(home1, ".codex", "skills", "server-manager", "SKILL.md").exists())
            self.assertTrue(Path(home1, ".gemini", "skills", "server-manager", "SKILL.md").exists())
            self.assertTrue(Path(home2, ".codex", "skills", "server-manager", "SKILL.md").exists())
            self.assertTrue(Path(home2, ".gemini", "skills", "server-manager", "SKILL.md").exists())
            self.assertFalse(Path(home1, ".ssh", "id_ed25519").exists())
            self.assertFalse(Path(home2, ".ssh", "id_ed25519").exists())

    def test_windows_wrapper_names_are_generated_with_cmd_suffix(self):
        with tempfile.TemporaryDirectory() as tmp:
            os.environ["SERVER_MANAGER_TARGET_HOME"] = tmp
            cs.sys.platform = "win32"

            cs.install_ssh_script()
            cs.install_codex_skill()
            cs.install_gemini_skill()

            self.assertTrue(Path(tmp, ".server-connections", "codex-ssh.cmd").exists())
            self.assertTrue(Path(tmp, ".server-connections", "gemini-ssh.cmd").exists())


if __name__ == "__main__":
    unittest.main()
435	tools/install.sh
@@ -1,30 +1,40 @@
#!/usr/bin/env bash
# ─────────────────────────────────────────────────────────────────────
# ServerManager CLI Installer for Linux (headless / no-GUI)
# ServerManager AI Integration Installer for Linux/macOS (headless / no-GUI)
#
# Installs:
#   - ssh.py + encryption.py → ~/.server-connections/
#   - servers.json + settings.json → ~/.server-connections/ (if present)
#   - CLAUDE.md → ~/.claude/
#   - ssh.md (skill) → ~/.claude/commands/
#   - Python dependencies for the CLI (paramiko, cryptography, etc.)
# Installs for each target home:
#   - ssh.py + encryption.py -> ~/.server-connections/
#   - Claude /ssh skill -> ~/.claude/commands/
#   - Codex server-manager skill -> ~/.codex/skills/server-manager/
#   - Gemini server-manager skill -> ~/.gemini/skills/server-manager/
#   - codex-ssh / gemini-ssh wrappers -> ~/.server-connections/
#   - CLAUDE.md / GEMINI.md (if available) -> ~/.claude/ / ~/.gemini/
#
# Run:
#   curl -sSL https://git.sensey24.ru/aibot777/server-manager/raw/branch/master/tools/install.sh | bash
# or:
# Optional per-target local config copy:
#   - servers.json + settings.json -> ~/.server-connections/
#
# Notes:
#   - servers.json is NEVER downloaded remotely.
#   - --all-users installs code/skills/wrappers for discovered homes, but skips
#     copying servers.json to avoid replicating credentials between users.
#   - Gemini also supports ~/.agents/skills, but this installer avoids placing
#     the same skill in both ~/.gemini/skills and ~/.agents/skills by default
#     because Gemini reports that as a duplicate-skill conflict.
#
# Usage:
#   bash install.sh
# or specifying the file source:
#   bash install.sh /path/to/server-manager/
#   bash install.sh /path/to/server-manager
#   bash install.sh --source-dir /path/to/server-manager --target-home /root
#   bash install.sh --all-users --source-dir /path/to/server-manager
# ─────────────────────────────────────────────────────────────────────
set -euo pipefail

# ── Colors ──
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
CYAN='\033[0;36m'
NC='\033[0m' # No Color
NC='\033[0m'

info() { echo -e "${BLUE}[INFO]${NC} $*"; }
ok() { echo -e "${GREEN}[OK]${NC} $*"; }
@@ -32,33 +42,80 @@ warn() { echo -e "${YELLOW}[WARN]${NC} $*"; }
error() { echo -e "${RED}[ERROR]${NC} $*"; }
step() { echo -e "\n${CYAN}━━━ $* ━━━${NC}"; }

# ── Config ──
CONN_DIR="$HOME/.server-connections"
CLAUDE_DIR="$HOME/.claude"
COMMANDS_DIR="$CLAUDE_DIR/commands"
usage() {
    cat <<USAGE
ServerManager AI integration installer

Options:
  --source-dir PATH         Use local repo as source of files
  --target-home PATH        Install into a specific user's home
  --all-users               Install into all discovered user homes on this machine
  --install-agents-mirror   Also mirror Gemini skill into ~/.agents/skills
  -h, --help                Show this help

Positional compatibility:
  install.sh /path/to/server-manager   # same as --source-dir
USAGE
}

GITEA_RAW="https://git.sensey24.ru/aibot777/server-manager/raw/branch/master"
SRC_DIR=""
TARGET_HOME="${SERVER_MANAGER_TARGET_HOME:-${TARGET_HOME:-$HOME}}"
INSTALL_ALL_USERS=0
INSTALL_AGENTS_MIRROR=0

# Source directory (optional argument)
SRC_DIR="${1:-}"
while [[ $# -gt 0 ]]; do
    case "$1" in
        --source-dir)
            SRC_DIR="$2"
            shift 2
            ;;
        --target-home)
            TARGET_HOME="$2"
            shift 2
            ;;
        --all-users)
            INSTALL_ALL_USERS=1
            shift
            ;;
        --install-agents-mirror)
            INSTALL_AGENTS_MIRROR=1
            shift
            ;;
        -h|--help)
            usage
            exit 0
            ;;
        *)
            if [[ -z "$SRC_DIR" ]]; then
                SRC_DIR="$1"
                shift
            else
                error "Unknown argument: $1"
                usage
                exit 2
            fi
            ;;
    esac
done

# ── Banner ──
echo -e "${CYAN}"
echo "╔══════════════════════════════════════════════╗"
echo "║ ServerManager CLI Installer for Linux ║"
echo "║ github: git.sensey24.ru/aibot777 ║"
echo "╚══════════════════════════════════════════════╝"
echo "╔══════════════════════════════════════════════════════╗"
echo "║ ServerManager AI Integration Installer (headless) ║"
echo "║ Claude + Codex + Gemini ║"
echo "╚══════════════════════════════════════════════════════╝"
echo -e "${NC}"

# ── Step 1: Check Python ──
step "1/5 Checking Python"

PYTHON=""
for cmd in python3 python; do
    if command -v "$cmd" &>/dev/null; then
        ver=$("$cmd" --version 2>&1 | grep -oP '\d+\.\d+')
        major=$(echo "$ver" | cut -d. -f1)
        minor=$(echo "$ver" | cut -d. -f2)
        if [ "$major" -ge 3 ] && [ "$minor" -ge 8 ]; then
        if "$cmd" - <<'PY' &>/dev/null
import sys
raise SystemExit(0 if sys.version_info >= (3, 8) else 1)
PY
        then
            PYTHON="$cmd"
            ok "Python found: $($cmd --version)"
            break
@@ -66,14 +123,11 @@ for cmd in python3 python; do
    fi
done

if [ -z "$PYTHON" ]; then
    error "Python 3.8+ not found!"
    echo "  Install it: sudo apt install python3 python3-pip"
    echo "  or: sudo yum install python3 python3-pip"
if [[ -z "$PYTHON" ]]; then
    error "Python 3.8+ not found"
    exit 1
fi

# Check pip
PIP=""
for cmd in pip3 pip; do
    if command -v "$cmd" &>/dev/null; then
@@ -81,22 +135,26 @@ for cmd in pip3 pip; do
        break
    fi
done

if [ -z "$PIP" ]; then
    # Try python -m pip
if [[ -z "$PIP" ]]; then
    if $PYTHON -m pip --version &>/dev/null; then
        PIP="$PYTHON -m pip"
    else
        error "pip not found!"
        echo "  Install it: sudo apt install python3-pip"
|
||||
error "pip не найден"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
ok "pip найден: $($PIP --version 2>&1 | head -1)"
|
||||
|
||||
# ── Step 2: Install Python dependencies ──
|
||||
step "2/5 Установка Python-зависимостей"
|
||||
resolve_home() {
|
||||
"$PYTHON" - "$1" <<'PY'
|
||||
import os, sys
|
||||
print(os.path.abspath(os.path.expanduser(sys.argv[1])))
|
||||
PY
|
||||
}
|
||||
|
||||
TARGET_HOME="$(resolve_home "$TARGET_HOME")"
|
||||
|
||||
step "2/5 Установка Python-зависимостей"
|
||||
CLI_DEPS=(
|
||||
"paramiko>=3.4.0"
|
||||
"cryptography>=41.0.0"
|
||||
@@ -105,7 +163,6 @@ CLI_DEPS=(
|
||||
"redis>=5.0.0"
|
||||
"requests>=2.31.0"
|
||||
)
|
||||
|
||||
for dep in "${CLI_DEPS[@]}"; do
|
||||
pkg=$(echo "$dep" | sed 's/[>=<].*//')
|
||||
if $PYTHON -c "import $pkg" 2>/dev/null; then
|
||||
@@ -120,38 +177,26 @@ for dep in "${CLI_DEPS[@]}"; do
|
||||
fi
|
||||
done
|
||||
|
||||
# ── Step 3: Create directories ──
|
||||
step "3/5 Создание директорий"
|
||||
|
||||
mkdir -p "$CONN_DIR" "$COMMANDS_DIR"
|
||||
chmod 700 "$CONN_DIR" 2>/dev/null || true
|
||||
ok "$CONN_DIR"
|
||||
ok "$COMMANDS_DIR"
|
||||
|
||||
# ── Step 4: Copy/Download files ──
|
||||
step "4/5 Установка файлов"
|
||||
|
||||
copy_or_download() {
|
||||
local src_relative="$1"
|
||||
local dst="$2"
|
||||
local perms="$3"
|
||||
local desc="$4"
|
||||
|
||||
# Try local source first
|
||||
if [ -n "$SRC_DIR" ] && [ -f "$SRC_DIR/$src_relative" ]; then
|
||||
mkdir -p "$(dirname "$dst")"
|
||||
|
||||
if [[ -n "$SRC_DIR" && -f "$SRC_DIR/$src_relative" ]]; then
|
||||
cp "$SRC_DIR/$src_relative" "$dst"
|
||||
chmod "$perms" "$dst"
|
||||
chmod "$perms" "$dst" 2>/dev/null || true
|
||||
ok "$desc (из $SRC_DIR)"
|
||||
return 0
|
||||
fi
|
||||
|
||||
# Try download from Gitea
|
||||
local url="$GITEA_RAW/$src_relative"
|
||||
if command -v curl &>/dev/null; then
|
||||
if curl -sSL -o "$dst" "$url" 2>/dev/null; then
|
||||
# Verify not empty and not HTML error page
|
||||
if [ -s "$dst" ] && ! head -1 "$dst" | grep -qi '<!doctype\|<html'; then
|
||||
chmod "$perms" "$dst"
|
||||
if curl -fsSL -o "$dst" "$url" 2>/dev/null; then
|
||||
if [[ -s "$dst" ]] && ! head -1 "$dst" | grep -qi '<!doctype\|<html'; then
|
||||
chmod "$perms" "$dst" 2>/dev/null || true
|
||||
ok "$desc (скачан с Gitea)"
|
||||
return 0
|
||||
fi
|
||||
@@ -159,8 +204,8 @@ copy_or_download() {
|
||||
fi
|
||||
elif command -v wget &>/dev/null; then
|
||||
if wget -q -O "$dst" "$url" 2>/dev/null; then
|
||||
if [ -s "$dst" ] && ! head -1 "$dst" | grep -qi '<!doctype\|<html'; then
|
||||
chmod "$perms" "$dst"
|
||||
if [[ -s "$dst" ]] && ! head -1 "$dst" | grep -qi '<!doctype\|<html'; then
|
||||
chmod "$perms" "$dst" 2>/dev/null || true
|
||||
ok "$desc (скачан с Gitea)"
|
||||
return 0
|
||||
fi
|
||||
@@ -172,86 +217,180 @@ copy_or_download() {
|
||||
return 1
|
||||
}
|
||||
|
||||
# Core files (always install)
|
||||
copy_or_download "tools/ssh.py" "$CONN_DIR/ssh.py" "755" "ssh.py"
|
||||
copy_or_download "core/encryption.py" "$CONN_DIR/encryption.py" "644" "encryption.py"
|
||||
install_skill_tree() {
|
||||
local prefix="$1"
|
||||
local dst_root="$2"
|
||||
shift 2
|
||||
mkdir -p "$dst_root"
|
||||
local rel
|
||||
for rel in "$@"; do
|
||||
copy_or_download "$prefix/$rel" "$dst_root/$rel" 644 "$prefix/$rel" || true
|
||||
done
|
||||
find "$dst_root/scripts" -type f -name '*.sh' -exec chmod 755 {} + 2>/dev/null || true
|
||||
find "$dst_root/scripts" -type f -name '*.cmd' -exec chmod 644 {} + 2>/dev/null || true
|
||||
}
|
||||
|
||||
# Claude Code skill
|
||||
copy_or_download "tools/skill-ssh.md" "$COMMANDS_DIR/ssh.md" "644" "ssh.md (скилл /ssh)"
|
||||
discover_homes() {
|
||||
local homes=()
|
||||
local uname_s
|
||||
uname_s="$(uname -s 2>/dev/null || echo Linux)"
|
||||
|
||||
# CLAUDE.md
|
||||
if [ -n "$SRC_DIR" ] && [ -f "$SRC_DIR/CLAUDE.md" ]; then
|
||||
cp "$SRC_DIR/CLAUDE.md" "$CLAUDE_DIR/CLAUDE.md"
|
||||
chmod 644 "$CLAUDE_DIR/CLAUDE.md"
|
||||
ok "CLAUDE.md"
|
||||
fi
|
||||
|
||||
# servers.json — only copy if exists locally, never download (contains encrypted creds)
|
||||
if [ -n "$SRC_DIR" ] && [ -f "$SRC_DIR/servers.json" ]; then
|
||||
cp "$SRC_DIR/servers.json" "$CONN_DIR/servers.json"
|
||||
chmod 600 "$CONN_DIR/servers.json"
|
||||
ok "servers.json (зашифрованный)"
|
||||
elif [ ! -f "$CONN_DIR/servers.json" ]; then
|
||||
warn "servers.json не найден — скопируйте с основной машины:"
|
||||
echo " scp user@main:~/.server-connections/servers.json $CONN_DIR/"
|
||||
fi
|
||||
|
||||
# settings.json
|
||||
if [ -n "$SRC_DIR" ] && [ -f "$SRC_DIR/settings.json" ]; then
|
||||
cp "$SRC_DIR/settings.json" "$CONN_DIR/settings.json"
|
||||
chmod 600 "$CONN_DIR/settings.json"
|
||||
ok "settings.json"
|
||||
elif [ ! -f "$CONN_DIR/settings.json" ]; then
|
||||
# Create minimal settings
|
||||
echo '{"language":"en","check_interval":60}' > "$CONN_DIR/settings.json"
|
||||
chmod 600 "$CONN_DIR/settings.json"
|
||||
ok "settings.json (создан по умолчанию)"
|
||||
fi
|
||||
|
||||
# ── Step 5: Verify ──
|
||||
step "5/5 Проверка установки"
|
||||
|
||||
ALL_OK=true
|
||||
|
||||
if [ -f "$CONN_DIR/ssh.py" ] && [ -x "$CONN_DIR/ssh.py" ]; then
|
||||
ok "ssh.py — исполняемый"
|
||||
else
|
||||
error "ssh.py — не найден или не исполняемый"
|
||||
ALL_OK=false
|
||||
fi
|
||||
|
||||
if [ -f "$CONN_DIR/encryption.py" ]; then
|
||||
ok "encryption.py"
|
||||
else
|
||||
error "encryption.py — не найден"
|
||||
ALL_OK=false
|
||||
fi
|
||||
|
||||
if [ -f "$COMMANDS_DIR/ssh.md" ]; then
|
||||
ok "ssh.md скилл"
|
||||
else
|
||||
warn "ssh.md скилл — не найден"
|
||||
fi
|
||||
|
||||
if [ -f "$CONN_DIR/servers.json" ]; then
|
||||
ok "servers.json"
|
||||
else
|
||||
warn "servers.json — отсутствует (нужно скопировать вручную)"
|
||||
fi
|
||||
|
||||
# Test ssh.py
|
||||
info "Тест ssh.py..."
|
||||
if $PYTHON "$CONN_DIR/ssh.py" --list &>/dev/null; then
|
||||
ok "ssh.py --list работает"
|
||||
else
|
||||
if [ ! -f "$CONN_DIR/servers.json" ]; then
|
||||
warn "ssh.py не может запуститься (нет servers.json)"
|
||||
if [[ "$INSTALL_ALL_USERS" -eq 1 ]]; then
|
||||
if [[ "$uname_s" == "Darwin" ]]; then
|
||||
[[ -d /var/root ]] && homes+=("/var/root")
|
||||
if [[ -d /Users ]]; then
|
||||
while IFS= read -r -d '' d; do homes+=("$d"); done < <(find /Users -mindepth 1 -maxdepth 1 -type d -print0 2>/dev/null)
|
||||
fi
|
||||
else
|
||||
[[ -d /root ]] && homes+=("/root")
|
||||
if [[ -d /home ]]; then
|
||||
while IFS= read -r -d '' d; do homes+=("$d"); done < <(find /home -mindepth 1 -maxdepth 1 -type d -print0 2>/dev/null)
|
||||
fi
|
||||
fi
|
||||
else
|
||||
warn "ssh.py вернул ошибку — проверьте зависимости"
|
||||
homes+=("$TARGET_HOME")
|
||||
fi
|
||||
|
||||
printf '%s\n' "${homes[@]}" | awk 'NF && !seen[$0]++'
|
||||
}
|
||||
|
||||
step "3/5 Подготовка директорий"
|
||||
TARGET_HOMES=()
|
||||
while IFS= read -r home; do
|
||||
[[ -n "$home" ]] || continue
|
||||
TARGET_HOMES+=("$home")
|
||||
ok "target home: $home"
|
||||
done < <(discover_homes)
|
||||
|
||||
if [[ "${#TARGET_HOMES[@]}" -eq 0 ]]; then
|
||||
error "Не удалось определить target home"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# ── Summary ──
|
||||
step "4/5 Установка файлов"
|
||||
CODEX_SKILL_FILES=(
|
||||
"SKILL.md"
|
||||
"references/command-matrix.md"
|
||||
"references/project.md"
|
||||
"scripts/codex-ssh-wrapper.sh"
|
||||
"scripts/codex-ssh-wrapper.cmd"
|
||||
"scripts/server-manager-doctor.sh"
|
||||
"scripts/server-manager-doctor.cmd"
|
||||
)
|
||||
GEMINI_SKILL_FILES=(
|
||||
"SKILL.md"
|
||||
"references/command-matrix.md"
|
||||
"references/project.md"
|
||||
"scripts/gemini-ssh-wrapper.sh"
|
||||
"scripts/gemini-ssh-wrapper.cmd"
|
||||
"scripts/server-manager-gemini-doctor.sh"
|
||||
"scripts/server-manager-gemini-doctor.cmd"
|
||||
)
|
||||
|
||||
for HOME_DIR in "${TARGET_HOMES[@]}"; do
|
||||
CONN_DIR="$HOME_DIR/.server-connections"
|
||||
CLAUDE_DIR="$HOME_DIR/.claude"
|
||||
COMMANDS_DIR="$CLAUDE_DIR/commands"
|
||||
CODEX_DIR="$HOME_DIR/.codex/skills/server-manager"
|
||||
GEMINI_DIR="$HOME_DIR/.gemini"
|
||||
GEMINI_SKILL_DIR="$GEMINI_DIR/skills/server-manager"
|
||||
AGENTS_DIR="$HOME_DIR/.agents/skills/server-manager"
|
||||
|
||||
mkdir -p "$CONN_DIR" "$COMMANDS_DIR" "$CODEX_DIR" "$GEMINI_SKILL_DIR"
|
||||
chmod 700 "$CONN_DIR" 2>/dev/null || true
|
||||
|
||||
info "Устанавливаю в $HOME_DIR"
|
||||
|
||||
copy_or_download "tools/ssh.py" "$CONN_DIR/ssh.py" 755 "ssh.py"
|
||||
copy_or_download "core/encryption.py" "$CONN_DIR/encryption.py" 644 "encryption.py"
|
||||
copy_or_download "tools/skill-ssh.md" "$COMMANDS_DIR/ssh.md" 644 "ssh.md (скилл /ssh)"
|
||||
|
||||
if [[ -n "$SRC_DIR" && -f "$SRC_DIR/CLAUDE.md" ]]; then
|
||||
cp "$SRC_DIR/CLAUDE.md" "$CLAUDE_DIR/CLAUDE.md"
|
||||
chmod 644 "$CLAUDE_DIR/CLAUDE.md"
|
||||
ok "CLAUDE.md"
|
||||
elif [[ ! -f "$CLAUDE_DIR/CLAUDE.md" ]]; then
|
||||
copy_or_download "CLAUDE.md" "$CLAUDE_DIR/CLAUDE.md" 644 "CLAUDE.md" || true
|
||||
fi
|
||||
|
||||
if [[ -n "$SRC_DIR" && -f "$SRC_DIR/GEMINI.md" ]]; then
|
||||
cp "$SRC_DIR/GEMINI.md" "$GEMINI_DIR/GEMINI.md"
|
||||
chmod 644 "$GEMINI_DIR/GEMINI.md"
|
||||
ok "GEMINI.md"
|
||||
elif [[ ! -f "$GEMINI_DIR/GEMINI.md" ]]; then
|
||||
copy_or_download "GEMINI.md" "$GEMINI_DIR/GEMINI.md" 644 "GEMINI.md" || true
|
||||
fi
|
||||
|
||||
install_skill_tree ".codex/skills/server-manager" "$CODEX_DIR" "${CODEX_SKILL_FILES[@]}"
|
||||
install_skill_tree ".gemini/skills/server-manager" "$GEMINI_SKILL_DIR" "${GEMINI_SKILL_FILES[@]}"
|
||||
if [[ "$INSTALL_AGENTS_MIRROR" -eq 1 ]]; then
|
||||
mkdir -p "$AGENTS_DIR"
|
||||
install_skill_tree ".gemini/skills/server-manager" "$AGENTS_DIR" "${GEMINI_SKILL_FILES[@]}"
|
||||
ok "agents skill mirror"
|
||||
elif [[ -d "$AGENTS_DIR" ]]; then
|
||||
rm -rf "$AGENTS_DIR"
|
||||
ok "removed stale agents skill mirror to avoid Gemini conflict"
|
||||
fi
|
||||
|
||||
if [[ -f "$CODEX_DIR/scripts/codex-ssh-wrapper.sh" ]]; then
|
||||
cp "$CODEX_DIR/scripts/codex-ssh-wrapper.sh" "$CONN_DIR/codex-ssh"
|
||||
chmod 755 "$CONN_DIR/codex-ssh"
|
||||
ok "codex-ssh wrapper"
|
||||
else
|
||||
copy_or_download ".codex/skills/server-manager/scripts/codex-ssh-wrapper.sh" "$CONN_DIR/codex-ssh" 755 "codex-ssh wrapper" || true
|
||||
fi
|
||||
|
||||
if [[ -f "$GEMINI_SKILL_DIR/scripts/gemini-ssh-wrapper.sh" ]]; then
|
||||
cp "$GEMINI_SKILL_DIR/scripts/gemini-ssh-wrapper.sh" "$CONN_DIR/gemini-ssh"
|
||||
chmod 755 "$CONN_DIR/gemini-ssh"
|
||||
ok "gemini-ssh wrapper"
|
||||
else
|
||||
copy_or_download ".gemini/skills/server-manager/scripts/gemini-ssh-wrapper.sh" "$CONN_DIR/gemini-ssh" 755 "gemini-ssh wrapper" || true
|
||||
fi
|
||||
|
||||
if [[ "$INSTALL_ALL_USERS" -eq 0 ]]; then
|
||||
if [[ -n "$SRC_DIR" && -f "$SRC_DIR/servers.json" ]]; then
|
||||
cp "$SRC_DIR/servers.json" "$CONN_DIR/servers.json"
|
||||
chmod 600 "$CONN_DIR/servers.json"
|
||||
ok "servers.json (зашифрованный)"
|
||||
elif [[ ! -f "$CONN_DIR/servers.json" ]]; then
|
||||
warn "servers.json не найден для $HOME_DIR — скопируйте вручную"
|
||||
fi
|
||||
|
||||
if [[ -n "$SRC_DIR" && -f "$SRC_DIR/settings.json" ]]; then
|
||||
cp "$SRC_DIR/settings.json" "$CONN_DIR/settings.json"
|
||||
chmod 600 "$CONN_DIR/settings.json"
|
||||
ok "settings.json"
|
||||
elif [[ ! -f "$CONN_DIR/settings.json" ]]; then
|
||||
echo '{"language":"en","check_interval":60}' > "$CONN_DIR/settings.json"
|
||||
chmod 600 "$CONN_DIR/settings.json"
|
||||
ok "settings.json (создан по умолчанию)"
|
||||
fi
|
||||
else
|
||||
warn "all-users mode: servers.json/settings.json не копируются автоматически для $HOME_DIR"
|
||||
fi
|
||||
done
|
||||
|
||||
step "5/5 Проверка установки"
|
||||
ALL_OK=true
|
||||
for HOME_DIR in "${TARGET_HOMES[@]}"; do
|
||||
CONN_DIR="$HOME_DIR/.server-connections"
|
||||
COMMANDS_DIR="$HOME_DIR/.claude/commands"
|
||||
CODEX_DIR="$HOME_DIR/.codex/skills/server-manager"
|
||||
GEMINI_SKILL_DIR="$HOME_DIR/.gemini/skills/server-manager"
|
||||
|
||||
info "Проверка $HOME_DIR"
|
||||
|
||||
[[ -x "$CONN_DIR/ssh.py" ]] && ok "ssh.py — исполняемый" || { error "ssh.py — не найден или не исполняемый"; ALL_OK=false; }
|
||||
[[ -f "$CONN_DIR/encryption.py" ]] && ok "encryption.py" || { error "encryption.py — не найден"; ALL_OK=false; }
|
||||
[[ -f "$COMMANDS_DIR/ssh.md" ]] && ok "Claude /ssh skill" || warn "Claude /ssh skill — не найден"
|
||||
[[ -f "$CODEX_DIR/SKILL.md" ]] && ok "Codex skill" || { warn "Codex skill — не найден"; ALL_OK=false; }
|
||||
[[ -x "$CONN_DIR/codex-ssh" ]] && ok "codex-ssh wrapper" || { warn "codex-ssh wrapper — не найден"; ALL_OK=false; }
|
||||
[[ -f "$GEMINI_SKILL_DIR/SKILL.md" ]] && ok "Gemini skill" || { warn "Gemini skill — не найден"; ALL_OK=false; }
|
||||
[[ -x "$CONN_DIR/gemini-ssh" ]] && ok "gemini-ssh wrapper" || { warn "gemini-ssh wrapper — не найден"; ALL_OK=false; }
|
||||
|
||||
done
|
||||
|
||||
echo ""
|
||||
echo -e "${CYAN}━━━ Готово ━━━${NC}"
|
||||
echo ""
|
||||
@@ -260,17 +399,19 @@ if $ALL_OK; then
|
||||
else
|
||||
echo -e "${YELLOW}Установка завершена с предупреждениями.${NC}"
|
||||
fi
|
||||
|
||||
echo ""
|
||||
echo "Файлы:"
|
||||
echo " $CONN_DIR/ssh.py — CLI-утилита"
|
||||
echo " $CONN_DIR/encryption.py — модуль шифрования"
|
||||
echo " $CONN_DIR/servers.json — серверы (зашифрованные)"
|
||||
echo " $COMMANDS_DIR/ssh.md — скилл /ssh для Claude Code"
|
||||
echo "Установлено для home:"
|
||||
printf ' - %s\n' "${TARGET_HOMES[@]}"
|
||||
echo ""
|
||||
echo "Использование:"
|
||||
echo " python3 ~/.server-connections/ssh.py --list"
|
||||
echo " python3 ~/.server-connections/ssh.py --info ALIAS"
|
||||
echo " python3 ~/.server-connections/ssh.py ALIAS \"command\""
|
||||
echo ""
|
||||
echo "Claude Code скилл: /ssh"
|
||||
echo " ~/.server-connections/codex-ssh --list"
|
||||
echo " ~/.server-connections/gemini-ssh --list"
|
||||
echo ""
|
||||
echo "Claude skill: ~/.claude/commands/ssh.md"
|
||||
echo "Codex skill: ~/.codex/skills/server-manager/"
|
||||
echo "Gemini skill: ~/.gemini/skills/server-manager/"
|
||||
if [[ "$INSTALL_AGENTS_MIRROR" -eq 1 ]]; then
|
||||
echo "Mirror skill: ~/.agents/skills/server-manager/"
|
||||
fi
|
||||
|
||||
48
tools/install_ai_integrations.py
Normal file
@@ -0,0 +1,48 @@

```python
#!/usr/bin/env python3
"""Cross-platform installer for ServerManager AI integrations.

Supports Claude (/ssh), Codex (server-manager), and Gemini (server-manager)
for the current user, a target home, or all discovered user homes.
"""

from __future__ import annotations

import argparse
import os
import sys
from pathlib import Path

PROJECT_ROOT = Path(__file__).resolve().parents[1]
if str(PROJECT_ROOT) not in sys.path:
    sys.path.insert(0, str(PROJECT_ROOT))

from core.claude_setup import install_all  # noqa: E402


def parse_args() -> argparse.Namespace:
    p = argparse.ArgumentParser(description="Install ServerManager AI integrations")
    p.add_argument("--target-home", help="Install into this home directory instead of the current user")
    p.add_argument("--all-users", action="store_true", help="Install to all discovered user homes on this system")
    return p.parse_args()


def main() -> int:
    args = parse_args()

    if args.target_home and args.all_users:
        print("error: --target-home and --all-users are mutually exclusive", file=sys.stderr)
        return 2

    if args.target_home:
        os.environ["SERVER_MANAGER_TARGET_HOME"] = os.path.abspath(os.path.expanduser(args.target_home))

    if args.all_users:
        os.environ["SERVER_MANAGER_INSTALL_ALL_USERS"] = "1"

    for line in install_all():
        print(line)
    return 0


if __name__ == "__main__":
    raise SystemExit(main())
```
232
tools/patch_claude_code.js
Normal file
@@ -0,0 +1,232 @@

```javascript
#!/usr/bin/env node
/**
 * Patcher for Claude Code CLI — fixes image reading crash on Windows.
 *
 * Root cause: In some code paths, `media_type` field is undefined when
 * constructing image content blocks for the API. JSON.stringify omits
 * undefined values, so the field is absent from the request body.
 * The API returns 400 "media_type: Field required" which permanently
 * poisons the conversation context and kills the session.
 *
 * This patcher:
 * 1. Installs `sharp` into claude-code's node_modules (if missing)
 * 2. Patches the Nv8 (image reader) function to gracefully handle errors
 * 3. Patches the image mapper to guarantee media_type is always present
 *
 * Usage:
 *   node tools/patch_claude_code.js           # apply patch
 *   node tools/patch_claude_code.js --check   # check status only
 *   node tools/patch_claude_code.js --revert  # revert patch
 *
 * Safe to run multiple times — idempotent.
 */

const fs = require("fs");
const path = require("path");
const { execSync } = require("child_process");

// Find claude-code installation
function findClaudeCodeDir() {
  const npmGlobal = execSync("npm root -g", { encoding: "utf8" }).trim();
  const claudeDir = path.join(npmGlobal, "@anthropic-ai", "claude-code");
  if (fs.existsSync(path.join(claudeDir, "cli.js"))) return claudeDir;

  // Fallback: try common paths
  const fallbacks = [
    path.join(process.env.APPDATA || "", "npm", "node_modules", "@anthropic-ai", "claude-code"),
    path.join(process.env.HOME || "", ".npm-global", "lib", "node_modules", "@anthropic-ai", "claude-code"),
    "/usr/local/lib/node_modules/@anthropic-ai/claude-code",
  ];
  for (const dir of fallbacks) {
    if (fs.existsSync(path.join(dir, "cli.js"))) return dir;
  }
  return null;
}

// Check if sharp is installed
function isSharpInstalled(claudeDir) {
  try {
    const sharpDir = path.join(claudeDir, "node_modules", "sharp");
    return fs.existsSync(sharpDir);
  } catch {
    return false;
  }
}

// Install sharp
function installSharp(claudeDir) {
  console.log("[*] Installing sharp...");
  try {
    execSync("npm install sharp", { cwd: claudeDir, encoding: "utf8", stdio: "pipe" });
    console.log("[+] sharp installed successfully");
    return true;
  } catch (e) {
    console.error("[-] Failed to install sharp:", e.message);
    return false;
  }
}

const PATCH_MARKER = "/* PATCHED_NV8_SAFE_IMAGE_READ */";
const MAPPER_PATCH_MARKER = "/* PATCHED_IMAGE_MEDIA_TYPE */";

function readCliJs(claudeDir) {
  return fs.readFileSync(path.join(claudeDir, "cli.js"), "utf8");
}

function writeCliJs(claudeDir, code) {
  // Backup first
  const backupPath = path.join(claudeDir, "cli.js.bak");
  if (!fs.existsSync(backupPath)) {
    fs.copyFileSync(path.join(claudeDir, "cli.js"), backupPath);
    console.log("[+] Backup created: cli.js.bak");
  }
  fs.writeFileSync(path.join(claudeDir, "cli.js"), code, "utf8");
}

function isPatched(code) {
  return code.includes(PATCH_MARKER);
}

function isMapperPatched(code) {
  return code.includes(MAPPER_PATCH_MARKER);
}

// Patch 1: Nv8 safety wrapper (try/catch around image reader)
function applyNv8Patch(code) {
  if (code.includes(PATCH_MARKER)) {
    console.log("[=] Nv8 safety patch already applied");
    return code;
  }

  const ORIGINAL_NV8_SIGNATURE = "async function Nv8(A,q=Tv8(),K){let Y=await X1().readFileBytes(A,K)";
  const idx = code.indexOf(ORIGINAL_NV8_SIGNATURE);
  if (idx === -1) {
    console.error("[-] Could not find Nv8 function signature in cli.js");
    console.error("    Claude Code may have been updated.");
    return code;
  }

  const endMarker = "}var Ns9";
  const endIdx = code.indexOf(endMarker, idx);
  if (endIdx === -1) {
    console.error("[-] Could not find end of Nv8 function");
    return code;
  }

  const patchedNv8 = `${PATCH_MARKER}async function Nv8(A,q=Tv8(),K){try{let Y=await X1().readFileBytes(A,K),z=Y.length;if(z===0)throw Error("Image file is empty: "+A);let w=kp6(Y),_=w.split("/")[1]||"png",$;try{let H=await ig(Y,z,_);$=q01(H.buffer,H.mediaType,z,H.dimensions)}catch(H){$6(H);$=q01(Y,_,z)}if(Math.ceil($.file.base64.length*0.125)>q)try{let H=await DP1(Y,q,w);return{type:"image",file:{base64:H.base64,type:H.mediaType||"image/png",originalSize:z}}}catch(H){$6(H);try{let j=await Promise.resolve().then(()=>q6(yN8(),1)),M=await(j.default||j)(Y).resize(400,400,{fit:"inside",withoutEnlargement:!0}).jpeg({quality:20}).toBuffer();return q01(M,"jpeg",z)}catch(j){return $6(j),q01(Y,_,z)}}return $}catch(_err){return{type:"text",file:{content:"[Error reading image: "+_err.message+"] File: "+A,totalLines:1}}}}`;

  code = code.slice(0, idx) + patchedNv8 + code.slice(endIdx + 1);
  console.log("[+] Nv8 safety patch applied");
  return code;
}

// Patch 2: Image mapper — guarantee media_type is always a valid string
// This is the ROOT CAUSE fix: A.file.type can be undefined in some code paths,
// and JSON.stringify silently drops undefined fields, causing API 400 error.
function applyMapperPatch(code) {
  if (code.includes(MAPPER_PATCH_MARKER)) {
    console.log("[=] Image mapper patch already applied");
    return code;
  }

  const ORIGINAL_MAPPER = 'case"image":return{tool_use_id:q,type:"tool_result",content:[{type:"image",source:{type:"base64",data:A.file.base64,media_type:A.file.type}}]}';
  const idx = code.indexOf(ORIGINAL_MAPPER);
  if (idx === -1) {
    console.error("[-] Could not find image mapper in cli.js");
    console.error("    Claude Code may have been updated.");
    return code;
  }

  // Patched version: fallback media_type to "image/png" if undefined
  const PATCHED_MAPPER = `${MAPPER_PATCH_MARKER}case"image":return{tool_use_id:q,type:"tool_result",content:[{type:"image",source:{type:"base64",data:A.file.base64,media_type:A.file.type||"image/png"}}]}`;

  code = code.slice(0, idx) + PATCHED_MAPPER + code.slice(idx + ORIGINAL_MAPPER.length);
  console.log("[+] Image mapper patched — media_type guaranteed non-empty");
  return code;
}

function revertPatch(claudeDir) {
  const backupPath = path.join(claudeDir, "cli.js.bak");
  if (!fs.existsSync(backupPath)) {
    console.error("[-] No backup found at cli.js.bak");
    return false;
  }
  fs.copyFileSync(backupPath, path.join(claudeDir, "cli.js"));
  console.log("[+] Reverted to original cli.js from backup");
  return true;
}

// Main
function main() {
  const args = process.argv.slice(2);
  const checkOnly = args.includes("--check");
  const revert = args.includes("--revert");

  console.log("=== Claude Code Image Read Patcher v2 ===\n");

  const claudeDir = findClaudeCodeDir();
  if (!claudeDir) {
    console.error("[-] Claude Code installation not found!");
    console.error("    Install it with: npm install -g @anthropic-ai/claude-code");
    process.exit(1);
  }
  console.log("[*] Found Claude Code at:", claudeDir);

  // Read package version
  try {
    const pkg = JSON.parse(fs.readFileSync(path.join(claudeDir, "package.json"), "utf8"));
    console.log("[*] Version:", pkg.version);
  } catch {}

  // Check sharp
  const hasSharp = isSharpInstalled(claudeDir);
  console.log("[*] sharp module:", hasSharp ? "installed" : "MISSING");

  // Check patch status
  const code = readCliJs(claudeDir);
  const nv8Patched = isPatched(code);
  const mapperPatched = isMapperPatched(code);
  console.log("[*] Nv8 safety patch:", nv8Patched ? "applied" : "not applied");
  console.log("[*] Image mapper patch:", mapperPatched ? "applied" : "not applied");

  if (checkOnly) {
    const fullyProtected = hasSharp && nv8Patched && mapperPatched;
    const status = fullyProtected ? "FULLY PROTECTED" :
      (mapperPatched ? "PROTECTED (mapper fix applied)" : "VULNERABLE");
    console.log("\nStatus:", status);
    process.exit(fullyProtected ? 0 : 1);
  }

  if (revert) {
    revertPatch(claudeDir);
    process.exit(0);
  }

  console.log("");

  // Step 1: Install sharp
  if (!hasSharp) {
    if (!installSharp(claudeDir)) {
      console.error("\n[-] Could not install sharp. Applying safety patches anyway...");
    }
  } else {
    console.log("[=] sharp already installed, skipping");
  }

  // Step 2: Apply Nv8 safety patch
  let patchedCode = applyNv8Patch(readCliJs(claudeDir));

  // Step 3: Apply image mapper patch (ROOT CAUSE FIX)
  patchedCode = applyMapperPatch(patchedCode);

  writeCliJs(claudeDir, patchedCode);

  console.log("\n=== Done! Claude Code is now protected against image read crashes ===");
  console.log("Patches applied:");
  console.log("  1. Nv8 try/catch — prevents binary leak on image read failure");
  console.log("  2. Image mapper — guarantees media_type is always present (ROOT FIX)");
  console.log("\nNote: After updating Claude Code (npm update -g @anthropic-ai/claude-code),");
  console.log("      re-run this patcher to reapply the fixes.");
}

main();
```
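The root cause the patcher's header comment describes can be reproduced in a few lines of Node (the object values below are placeholders, not real request data): `JSON.stringify` silently drops keys whose value is `undefined`, which is exactly why the mapper patch forces a fallback `media_type`.

```javascript
// An image source block whose media_type was never set (the failing code path):
const source = { type: "base64", data: "AAAA", media_type: undefined };
console.log(JSON.stringify(source)); // {"type":"base64","data":"AAAA"}
// The media_type key vanished from the serialized body, so the API sees a
// request missing a required field and returns 400 "media_type: Field required".

// The patched mapper's fallback guarantees the field survives serialization:
const fixed = { ...source, media_type: source.media_type || "image/png" };
console.log(JSON.stringify(fixed)); // {"type":"base64","data":"AAAA","media_type":"image/png"}
```

This is why the `--check` mode treats the mapper patch alone as "PROTECTED": even if image decoding still fails, the serialized request can no longer be missing the field.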
@@ -1,7 +1,7 @@
|
||||
# Скилл /ssh — управление удалёнными серверами
|
||||
|
||||
Ты управляешь удалёнными серверами через универсальную CLI-утилиту.
|
||||
Поддерживаются: SSH, SQL (MariaDB/MSSQL/PostgreSQL), Redis, S3, Grafana, Prometheus, WinRM (PowerShell/CMD).
|
||||
Поддерживаются: SSH, SQL (MariaDB/MSSQL/PostgreSQL), Redis, S3/MinIO, Grafana, Prometheus, WinRM (PowerShell/CMD).
|
||||
|
||||
## ВАЖНО — Безопасность
|
||||
|
||||
@@ -19,38 +19,52 @@
|
||||
|
||||
Пользователь передаёт через `$ARGUMENTS`. Разбери и выполни.
|
||||
|
||||
## КРИТИЧНО — Команды зависят от типа сервера
|
||||
## КРИТИЧНО — СНАЧАЛА ПРОВЕРЬ ТИП СЕРВЕРА
|
||||
|
||||
`--list` возвращает колонку `Type` для каждого сервера. **Тип определяет какие команды использовать:**
|
||||
**ПЕРЕД ЛЮБОЙ операцией** с сервером — **ОБЯЗАТЕЛЬНО** выполни `--list` и посмотри колонку `Type`.
|
||||
**ЗАПРЕЩЕНО** угадывать тип сервера. MinIO/S3 — это НЕ SSH, Redis — это НЕ SSH, MariaDB — это НЕ SSH.
|
||||
|
||||
| Тип | Команды |
|
||||
|-----|---------|
|
||||
| `ssh` | `ALIAS "command"`, `--upload`, `--download`, `--ping`, `--install-key` |
|
||||
| `telnet` | `ALIAS "command"` (как ssh, но без SFTP/sudo/ключей) |
|
||||
| `mariadb` / `mssql` / `postgresql` | `--sql`, `--sql-databases`, `--sql-tables` |
|
||||
| `redis` | `--redis`, `--redis-info`, `--redis-keys` |
|
||||
| `s3` | `--s3-buckets`, `--s3-ls`, `--s3-upload`, `--s3-download`, `--s3-delete` |
|
||||
| `grafana` | `--grafana-dashboards`, `--grafana-alerts` |
|
||||
| `prometheus` | `--prom-query`, `--prom-targets`, `--prom-alerts` |
|
||||
| `winrm` | `--ps`, `--cmd` |
|
||||
| `rdp` / `vnc` | Только GUI (запуск внешнего клиента), CLI-команд нет |
|
||||
**Тип сервера определяет КАКИЕ команды использовать. Использование команд не того типа — СЛОМАЕТ операцию.**
|
||||
|
||||
**`ALIAS "command"` — ТОЛЬКО для типа `ssh`.** Для Redis — `--redis`, для SQL — `--sql`, для WinRM — `--ps`/`--cmd` и т.д.
|
||||
| Тип | Команды | НЕ использовать |
|
||||
|-----|---------|-----------------|
|
||||
| `ssh` | `ALIAS "command"`, `--upload`, `--download`, `--ping`, `--install-key` | — |
|
||||
| `telnet` | `ALIAS "command"` (без SFTP/sudo/ключей) | `--upload`, `--download` |
|
||||
| `mariadb` / `mssql` / `postgresql` | `--sql`, `--sql-databases`, `--sql-tables` | `ALIAS "command"` |
|
||||
| `redis` | `--redis`, `--redis-info`, `--redis-keys` | `ALIAS "command"` |
|
||||
| `s3` (MinIO, AWS S3, и др.) | `--s3-buckets`, `--s3-ls`, `--s3-upload`, `--s3-download`, `--s3-delete`, `--s3-url`, `--s3-create-bucket` | `ALIAS "command"`, `--upload`, `--download` |
|
||||
| `grafana` | `--grafana-dashboards`, `--grafana-alerts` | `ALIAS "command"` |
|
||||
| `prometheus` | `--prom-query`, `--prom-targets`, `--prom-alerts` | `ALIAS "command"` |
|
||||
| `winrm` | `--ps`, `--cmd` | `ALIAS "command"` |
|
||||
| `rdp` / `vnc` | Только GUI | всё |
|
||||
|
||||
**`ALIAS "command"` (shell-команды типа ls, cat, mkdir) — ТОЛЬКО для типов `ssh` и `telnet`.**
|
||||
|
||||
```bash
# Type redis → --redis-info, NOT ALIAS "INFO"
python ~/.server-connections/ssh.py --redis-info "Reddis main ovh"

# ❌ WRONG: MinIO/S3 is NOT SSH, shell commands cannot run there
python ~/.server-connections/ssh.py "minio-alias" "ls /bucket"
python ~/.server-connections/ssh.py "minio-alias" "mkdir /bucket/folder"
# ✅ CORRECT: S3 commands for type s3
python ~/.server-connections/ssh.py --s3-ls "minio-alias" bucket
python ~/.server-connections/ssh.py --s3-upload "minio-alias" "D:/file.txt" bucket/folder/file.txt

# Type mariadb → --sql-databases, NOT ALIAS "SHOW DATABASES"
python ~/.server-connections/ssh.py --sql-databases "Maria Db Connection main ovh"

# Type ssh → ALIAS "command"
python ~/.server-connections/ssh.py investor "uptime"

# ❌ WRONG: Redis is NOT SSH
python ~/.server-connections/ssh.py "redis-alias" "INFO"
# ✅ CORRECT
python ~/.server-connections/ssh.py --redis-info "redis-alias"

# ❌ WRONG: MariaDB is NOT SSH
python ~/.server-connections/ssh.py "mariadb-alias" "SHOW DATABASES"
# ✅ CORRECT
python ~/.server-connections/ssh.py --sql-databases "mariadb-alias"
```
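The type-to-command rule above can be mechanized. The validator below is purely illustrative (it is not part of `ssh.py`); the type names and flag families are taken from the table:

```python
# Hypothetical validator for the rule above: raw ALIAS "command" shell
# execution is only legal for ssh and telnet; every other type must use
# its dedicated flag family instead.
SHELL_TYPES = {"ssh", "telnet"}
FLAG_FAMILY = {
    "mariadb": "--sql", "mssql": "--sql", "postgresql": "--sql",
    "redis": "--redis", "s3": "--s3", "grafana": "--grafana",
    "prometheus": "--prom", "winrm": "--ps/--cmd",
}

def allowed(server_type: str, raw_shell: bool) -> bool:
    """Return True if the requested invocation style is legal for the type."""
    if raw_shell:
        return server_type in SHELL_TYPES
    return server_type in SHELL_TYPES or server_type in FLAG_FAMILY

print(allowed("ssh", raw_shell=True))    # → True
print(allowed("redis", raw_shell=True))  # → False: use --redis-* instead
```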
## Common commands

### Server list (safe: alias, type, group, key, notes)

```bash
python ~/.server-connections/ssh.py --list
```
```bash
@@ -159,7 +173,12 @@ python ~/.server-connections/ssh.py --redis-info ALIAS
python ~/.server-connections/ssh.py --redis-keys ALIAS "user:*"
```
## S3 commands (type: s3): MinIO, AWS S3, any S3-compatible storage

**MinIO = type `s3`.** When the user says "MinIO" or "S3", use ONLY the `--s3-*` commands.
**Do NOT** try to run shell commands (`ls`, `mkdir`, `cat`) on S3 servers: they are not SSH hosts!

**Folders do not exist in S3**: they are key prefixes. "Creating a folder" means uploading a file whose key contains the prefix (for example `bucket/folder/file.txt`).
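The prefix idea can be shown without any network access. The helper below is hypothetical (it is not part of `ssh.py`) and mirrors what an S3 listing does with a prefix filter:

```python
# "Folders" in S3 are just shared key prefixes: listing a folder means
# filtering object keys on a prefix string. There is no directory object.
def list_folder(keys, prefix):
    """Return keys that live 'inside' the virtual folder `prefix`."""
    if not prefix.endswith("/"):
        prefix += "/"
    return [k for k in keys if k.startswith(prefix)]

keys = [
    "newfolder/file.txt",   # created via --s3-upload ... mybucket/newfolder/file.txt
    "newfolder/other.txt",
    "root.txt",
]

# The folder "exists" only because these keys share the prefix.
print(list_folder(keys, "newfolder"))  # → ['newfolder/file.txt', 'newfolder/other.txt']
```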
### List buckets

```bash
@@ -187,6 +206,30 @@ python ~/.server-connections/ssh.py --s3-download ALIAS bucket/key "D:/local/fil
python ~/.server-connections/ssh.py --s3-delete ALIAS bucket/key
```
### Get a file link (presigned URL)

```bash
python ~/.server-connections/ssh.py --s3-url ALIAS bucket/key
python ~/.server-connections/ssh.py --s3-url ALIAS bucket/key 86400
```

By default the link is valid for 1 hour (3600 seconds). The second argument sets the lifetime in seconds (for example 86400 = 24 hours).
### Create a bucket

```bash
python ~/.server-connections/ssh.py --s3-create-bucket ALIAS bucket-name
```
### Typical workflow: "create a folder and upload a file"

```bash
# 1. List the buckets
python ~/.server-connections/ssh.py --s3-buckets ALIAS
# 2. "Create a folder" = upload a file with the desired path (prefix)
python ~/.server-connections/ssh.py --s3-upload ALIAS "D:/file.txt" mybucket/newfolder/file.txt
# 3. Verify
python ~/.server-connections/ssh.py --s3-ls ALIAS mybucket/newfolder/
# 4. Get a link
python ~/.server-connections/ssh.py --s3-url ALIAS mybucket/newfolder/file.txt
```
## Grafana commands (type: grafana)

### List dashboards

**tools/ssh.py** (131 changed lines)
```diff
@@ -42,6 +42,8 @@ S3 (type: s3):
     python ssh.py --s3-upload ALIAS local bucket/key    # upload file
     python ssh.py --s3-download ALIAS bucket/key local  # download file
     python ssh.py --s3-delete ALIAS bucket/key          # delete object
+    python ssh.py --s3-url ALIAS bucket/key [SEC]       # presigned URL (default 3600s)
+    python ssh.py --s3-create-bucket ALIAS name         # create bucket
 
 WinRM (type: winrm):
     python ssh.py --ps ALIAS "Get-Process"              # PowerShell via WinRM
```
```diff
@@ -100,6 +102,11 @@ def load_servers():
     return data, {s["alias"]: s for s in data.get("servers", [])}
 
 
+def _group_map(data: dict) -> dict:
+    """Map group UUID → group name."""
+    return {g["id"]: g.get("name", "") for g in data.get("groups", [])}
+
+
 def save_servers(data):
     servers_file = _get_servers_file()
     text = json.dumps(data, indent=2, ensure_ascii=False)
```
```diff
@@ -777,7 +784,8 @@ def ping_server(server: dict):
 
 
 def list_servers(full=False):
-    _, servers = load_servers()
+    data, servers = load_servers()
+    groups = _group_map(data)
     if full:
         # WARNING: full mode shows sensitive data (IP, port, user)
         # Only for local/manual use, NEVER through AI API
```
```diff
@@ -789,13 +797,14 @@
         print(f"{alias:<20} {s['ip']:<20} {s.get('port', 22):<8} {s.get('user', 'root'):<10} {has_key:<6}")
     else:
         # Safe mode: only aliases (no IPs, ports, users)
-        print(f"{'Alias':<20} {'Type':<10} {'Key':<6} {'Notes'}")
-        print("-" * 70)
+        print(f"{'Alias':<20} {'Type':<10} {'Group':<14} {'Key':<6} {'Notes'}")
+        print("-" * 80)
         for alias, s in servers.items():
             has_key = "yes" if os.path.exists(SSH_KEY_PATH) else "no"
             stype = s.get("type", "ssh")
+            group_name = groups.get(s.get("group", ""), "-")
             notes = s.get("notes", "")
-            print(f"{alias:<20} {stype:<10} {has_key:<6} {notes}")
+            print(f"{alias:<20} {stype:<10} {group_name:<14} {has_key:<6} {notes}")
 
 
 def _resolve_alias(alias: str, servers: dict) -> str:
```
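The new safe-mode listing with its `Group` column can be sketched with inline sample data. The f-string layout is taken from the diff above; the server entries and group names are invented for illustration:

```python
# Sketch of the safe-mode listing above, with inline sample data instead
# of the real servers.json (so it runs without any configuration).
servers = {
    "investor":   {"type": "ssh", "group": "a1b2", "notes": "main app host"},
    "minio-main": {"type": "s3",  "group": "",     "notes": ""},
}
groups = {"a1b2": "production"}

print(f"{'Alias':<20} {'Type':<10} {'Group':<14} {'Key':<6} {'Notes'}")
print("-" * 80)
for alias, s in servers.items():
    stype = s.get("type", "ssh")
    group_name = groups.get(s.get("group", ""), "-")   # "-" when no group is set
    print(f"{alias:<20} {stype:<10} {group_name:<14} {'yes':<6} {s.get('notes', '')}")
```

Fixed-width format specifiers (`:<20`, `:<14`) keep the columns aligned without pulling in a table library.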
```diff
@@ -829,12 +838,16 @@ def _resolve_alias(alias: str, servers: dict) -> str:
 
 def server_info(alias: str):
     """Show server info safe for AI context — NO ip, user, password, port, totp_secret."""
-    _, servers = load_servers()
+    data, servers = load_servers()
+    groups = _group_map(data)
     alias = _resolve_alias(alias, servers)
     s = servers[alias]
     has_key = "yes" if os.path.exists(SSH_KEY_PATH) else "no"
     print(f"Alias: {s['alias']}")
     print(f"Type: {s.get('type', 'ssh')}")
+    group_name = groups.get(s.get("group", ""), "")
+    if group_name:
+        print(f"Group: {group_name}")
     print(f"Key: {has_key}")
     print(f"Auth: {s.get('auth', 'password')}")
     print(f"2FA: {'yes' if s.get('totp_secret') else 'no'}")
```
```diff
@@ -1459,6 +1472,38 @@ def s3_delete(server: dict, remote_path: str):
         sys.exit(1)
 
 
+def s3_url(server: dict, remote_path: str, expires: int = 3600):
+    """Generate a presigned URL for an S3 object."""
+    client = _get_s3_client(server)
+    parts = remote_path.split("/", 1)
+    bucket = parts[0] if parts else server.get("bucket", "")
+    key = parts[1] if len(parts) > 1 else ""
+    if not bucket or not key:
+        print("ERROR: Usage: --s3-url ALIAS bucket/key [seconds]", file=sys.stderr)
+        sys.exit(1)
+    try:
+        url = client.generate_presigned_url(
+            "get_object",
+            Params={"Bucket": bucket, "Key": key},
+            ExpiresIn=expires,
+        )
+        print(url)
+    except Exception as e:
+        print(f"ERROR: {e}", file=sys.stderr)
+        sys.exit(1)
+
+
+def s3_create_bucket(server: dict, bucket_name: str):
+    """Create a new S3 bucket."""
+    client = _get_s3_client(server)
+    try:
+        client.create_bucket(Bucket=bucket_name)
+        print(f"Bucket created: {bucket_name}")
+    except Exception as e:
+        print(f"ERROR: {e}", file=sys.stderr)
+        sys.exit(1)
+
+
 # ── Grafana commands ──────────────────────────────────
 
 def _grafana_request(server: dict, endpoint: str) -> dict:
```
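`s3_url` splits `bucket/key` on the first slash only, so the key part may itself contain slashes. The standalone helper below is illustrative (not part of `ssh.py`) and isolates just that parsing step:

```python
def split_bucket_key(remote_path: str):
    """Mirror of the split used by s3_url above: the bucket is everything
    before the first '/', the key is the rest (and may contain '/')."""
    parts = remote_path.split("/", 1)
    bucket = parts[0] if parts else ""
    key = parts[1] if len(parts) > 1 else ""
    return bucket, key

print(split_bucket_key("mybucket/newfolder/file.txt"))  # → ('mybucket', 'newfolder/file.txt')
print(split_bucket_key("mybucket"))                     # → ('mybucket', ''): s3_url rejects this
```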
```diff
@@ -1466,16 +1511,19 @@ def _grafana_request(server: dict, endpoint: str) -> dict:
     import requests
     host = server["ip"]
     port = server.get("port", 3000)
-    protocol = "https" if server.get("ssl", False) else "http"
+    protocol = "https" if server.get("use_ssl", server.get("ssl", False)) else "http"
     base_url = server.get("base_url", f"{protocol}://{host}:{port}")
-    api_key = server.get("api_key", server.get("password", ""))
+    api_token = server.get("api_token", server.get("api_key", ""))
 
     headers = {}
-    if api_key:
-        headers["Authorization"] = f"Bearer {api_key}"
+    auth = None
+    if api_token:
+        headers["Authorization"] = f"Bearer {api_token}"
+    elif server.get("user") and server.get("password"):
+        auth = (server["user"], server["password"])
 
     url = f"{base_url.rstrip('/')}/api/{endpoint.lstrip('/')}"
-    resp = requests.get(url, headers=headers, timeout=15, verify=server.get("ssl_verify", True))
+    resp = requests.get(url, headers=headers, auth=auth, timeout=15, verify=server.get("ssl_verify", True))
     resp.raise_for_status()
     return resp.json()
 
```
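The token-or-basic-auth fallback in `_grafana_request` can be isolated. The key names come from the diff above; the wrapper function itself is illustrative:

```python
def grafana_auth(server: dict):
    """Pick auth the way _grafana_request above does: an API token wins
    (sent as a Bearer header); otherwise fall back to HTTP basic auth
    built from user/password; otherwise send nothing."""
    headers = {}
    auth = None
    api_token = server.get("api_token", server.get("api_key", ""))
    if api_token:
        headers["Authorization"] = f"Bearer {api_token}"
    elif server.get("user") and server.get("password"):
        auth = (server["user"], server["password"])
    return headers, auth

print(grafana_auth({"api_token": "glsa_abc"}))           # → ({'Authorization': 'Bearer glsa_abc'}, None)
print(grafana_auth({"user": "admin", "password": "x"}))  # → ({}, ('admin', 'x'))
```

The `auth` tuple is exactly what `requests.get(..., auth=...)` expects for basic auth, which is why the diff threads it through unchanged.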
```diff
@@ -1521,6 +1569,25 @@ def grafana_alerts(server: dict):
     print(f"\n({len(rows)} alert{'s' if len(rows) != 1 else ''})")
 
 
+def grafana_datasources(server: dict):
+    """List Grafana datasources."""
+    data = _grafana_request(server, "datasources")
+    if not data:
+        print("(no datasources found)")
+        return
+    headers = ["Name", "Type", "URL", "Default"]
+    rows = []
+    for ds in data:
+        rows.append([
+            ds.get("name", ""),
+            ds.get("type", ""),
+            ds.get("url", ""),
+            "yes" if ds.get("isDefault", False) else "",
+        ])
+    _print_table(headers, rows)
+    print(f"\n({len(rows)} datasource{'s' if len(rows) != 1 else ''})")
+
+
 # ── Prometheus commands ───────────────────────────────
 
 def _prom_request(server: dict, endpoint: str, params: dict = None) -> dict:
```
```diff
@@ -1633,6 +1700,29 @@ def prom_alerts(server: dict):
     print(f"\n({len(rows)} alert{'s' if len(rows) != 1 else ''})")
 
 
+def prom_rules(server: dict):
+    """List Prometheus rules (recording + alerting)."""
+    data = _prom_request(server, "rules")
+    groups = data.get("data", {}).get("groups", [])
+    if not groups:
+        print("(no rules)")
+        return
+    headers = ["Type", "Name", "Group", "Health", "Query/Expr"]
+    rows = []
+    for group in groups:
+        gname = group.get("name", "")
+        for rule in group.get("rules", []):
+            rows.append([
+                rule.get("type", ""),
+                rule.get("name", ""),
+                gname,
+                rule.get("health", ""),
+                (rule.get("query", rule.get("expr", "")))[:60],
+            ])
+    _print_table(headers, rows)
+    print(f"\n({len(rows)} rule{'s' if len(rows) != 1 else ''} in {len(groups)} group{'s' if len(groups) != 1 else ''})")
+
+
 # ── WinRM commands ────────────────────────────────────
 
 def _get_winrm_session(server: dict):
```
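The flattening loop in `prom_rules` can be run against a canned payload shaped like Prometheus's `/api/v1/rules` response. The loop body is taken from the diff above; the sample rule names and queries are invented:

```python
# Canned response shaped like Prometheus's /api/v1/rules payload:
# groups → rules, each rule carrying type/name/health and a query or expr.
data = {
    "status": "success",
    "data": {
        "groups": [
            {
                "name": "node.rules",
                "rules": [
                    {"type": "alerting", "name": "HostDown", "health": "ok",
                     "query": "up == 0"},
                    {"type": "recording", "name": "job:up:avg", "health": "ok",
                     "query": "avg by (job) (up)"},
                ],
            }
        ]
    },
}

# Same flattening as prom_rules: one row per rule, group name repeated.
rows = []
for group in data.get("data", {}).get("groups", []):
    gname = group.get("name", "")
    for rule in group.get("rules", []):
        rows.append([rule.get("type", ""), rule.get("name", ""), gname,
                     rule.get("health", ""),
                     (rule.get("query", rule.get("expr", "")))[:60]])

print(rows[0])  # → ['alerting', 'HostDown', 'node.rules', 'ok', 'up == 0']
```

The `[:60]` slice keeps long PromQL expressions from blowing out the table width.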
```diff
@@ -1763,6 +1853,17 @@ def main():
         alias = _resolve_alias(sys.argv[2], servers)
         s3_delete(servers[alias], sys.argv[3])
         sys.exit(0)
+    if cmd == "--s3-url" and len(sys.argv) >= 4:
+        _, servers = load_servers()
+        alias = _resolve_alias(sys.argv[2], servers)
+        expires = int(sys.argv[4]) if len(sys.argv) >= 5 else 3600
+        s3_url(servers[alias], sys.argv[3], expires)
+        sys.exit(0)
+    if cmd == "--s3-create-bucket" and len(sys.argv) >= 4:
+        _, servers = load_servers()
+        alias = _resolve_alias(sys.argv[2], servers)
+        s3_create_bucket(servers[alias], sys.argv[3])
+        sys.exit(0)
 
     # ── Grafana commands ──
     if cmd == "--grafana-dashboards" and len(sys.argv) >= 3:
```
```diff
@@ -1775,6 +1876,11 @@ def main():
         alias = _resolve_alias(sys.argv[2], servers)
         grafana_alerts(servers[alias])
         sys.exit(0)
+    if cmd == "--grafana-datasources" and len(sys.argv) >= 3:
+        _, servers = load_servers()
+        alias = _resolve_alias(sys.argv[2], servers)
+        grafana_datasources(servers[alias])
+        sys.exit(0)
 
     # ── Prometheus commands ──
     if cmd == "--prom-query" and len(sys.argv) >= 4:
```
```diff
@@ -1792,6 +1898,11 @@ def main():
         alias = _resolve_alias(sys.argv[2], servers)
         prom_alerts(servers[alias])
         sys.exit(0)
+    if cmd == "--prom-rules" and len(sys.argv) >= 3:
+        _, servers = load_servers()
+        alias = _resolve_alias(sys.argv[2], servers)
+        prom_rules(servers[alias])
+        sys.exit(0)
 
     # ── WinRM commands ──
     if cmd == "--ps" and len(sys.argv) >= 4:
```
```diff
@@ -1,6 +1,6 @@
 """Version info for ServerManager."""
 
-__version__ = "1.9.13"
+__version__ = "1.9.44"
 __app_name__ = "ServerManager"
 __author__ = "aibot777"
 __description__ = "Desktop GUI for managing remote servers"
```