Logic stabilization regression hardening: strengthen tests for the compact/vLLM paths
Some checks failed
Release Gate / gate (push) Has been cancelled

- Added ContextCondenserTests: verifies no change when proactive is disabled, and truncation of large tool_result payloads

- Extended LlmRuntimeOverrideTests: verifies vLLM API key decryption and the SSL bypass composition rule

- Synchronized document history (2026-04-04 14:47 KST) across README, DEVELOPMENT, and NEXT_ROADMAP
2026-04-04 14:52:50 +09:00
parent 310e75832c
commit 508392f0d9
5 changed files with 246 additions and 70 deletions


@@ -222,7 +222,7 @@ public class MyHandler : IActionHandler
### v0.7.3 — AX Agent permission core restructuring + input layer cleanup
-Updated: 2026-04-04 14:35 (KST)
+Updated: 2026-04-04 14:47 (KST)
| Category | Description |
|------|------|
@@ -285,6 +285,7 @@ public class MyHandler : IActionHandler
| Slash popup density compression, phase 2 | Shrinks the `/` popup width/height/padding and the item font/row height for Codex-style compact density and faster scanning |
| Permission popup density compression, phase 2 | Compresses the permission popup's section/summary/exception/deny cards and permission-row typography into the same compact rhythm as the slash popup |
| Input footer / model selector compact alignment | Shrinks the input box, model selector button, inline settings panel, and send button to Codex-style composer density |
| Logic stabilization regression hardening | Extends `ContextCondenserTests` and `LlmRuntimeOverrideTests` to regression-test real compact behavior and the vLLM encrypted API key / SSL bypass resolution rules |
| Slash palette state separation started | Moves the slash state piled up in `ChatWindow` into `SlashPaletteState`, laying the groundwork for the upcoming Codex/Claude-style composer overhaul |
| Launcher image preview added | From a `#` clipboard image item, `Shift+Enter` opens a dedicated preview window with zoom, original-resolution viewing, PNG/JPEG/BMP saving, and clipboard copy |
| Verification | `dotnet build` 0 warnings / 0 errors, `dotnet test` 436 passed / 0 failed |


@@ -3492,3 +3492,24 @@ else:
### 4) Quality gate
- `dotnet build src/AxCopilot/AxCopilot.csproj -c Debug -p:UseSharedCompilation=false -nodeReuse:false` passed (0 warnings, 0 errors).
- `dotnet test src/AxCopilot.Tests/AxCopilot.Tests.csproj --no-build --filter "FullyQualifiedName~ChatWindowSlashPolicyTests|FullyQualifiedName~OperationModePolicyTests|FullyQualifiedName~PermissionModeCatalogTests|FullyQualifiedName~PermissionModePresentationCatalogTests"` passed (82 passed, 0 failed).
## 2026-04-04 additional progress log (continuous run 35: logic stabilization regression hardening)
Updated: 2026-04-04 14:47 (KST)
### 1) ContextCondenser behavioral tests added
- New `ContextCondenserTests`:
  - verifies messages stay unchanged when proactive compact is disabled
  - verifies that a large `tool_result` sitting in the older segment is truncated (marked `[축약됨]`, "truncated")
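The behavior these two tests pin down can be sketched roughly as follows. This is a minimal illustration, not the actual `ContextCondenser` internals: the helper name, size threshold, and protected-tail length are assumptions for the sketch.

```csharp
using System;
using System.Collections.Generic;

public sealed class Msg
{
    public string Role { get; init; } = "";
    public string Content { get; set; } = "";
}

public static class CondenserSketch
{
    private const int MaxToolResultChars = 4_000; // assumed threshold
    private const int RecentWindow = 6;           // assumed protected tail

    // Hypothetical truncation pass: messages in the recent tail are never
    // touched; only oversized tool_result payloads in the older segment
    // are cut down to a prefix plus the truncation marker.
    public static bool TruncateOldToolResults(IList<Msg> messages)
    {
        var changed = false;
        var cutoff = Math.Max(0, messages.Count - RecentWindow);
        for (var i = 0; i < cutoff; i++)
        {
            var m = messages[i];
            if (m.Content.Contains("\"type\":\"tool_result\"", StringComparison.Ordinal) &&
                m.Content.Length > MaxToolResultChars)
            {
                m.Content = m.Content[..MaxToolResultChars] + " …[축약됨]";
                changed = true;
            }
        }
        return changed;
    }
}
```

When proactive compact is disabled, the condenser is expected to return before any such pass runs, which is exactly what the unchanged-content assertion verifies.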
### 2) vLLM connection resolution regression hardening
- `LlmRuntimeOverrideTests` additions:
  - verifies the encrypted `VllmApiKey` is decrypted at runtime
  - verifies the final value stays true when a registered model's `AllowInsecureTls=false` is combined with the global `VllmAllowInsecureTls=true`
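One rule consistent with the cases these tests pin down can be stated compactly. A minimal sketch under assumed names (`Resolve`, `decrypt`); the real `LlmService.ResolveServerInfo` resolves the endpoint, key, and TLS flag together:

```csharp
using System;

public static class VllmResolutionSketch
{
    // Hypothetical condensed form of the resolution rule:
    // - the stored key is decrypted only when encryption is enabled;
    // - the insecure-TLS flags are OR-composed, so a global opt-in is
    //   not silently downgraded by a registered model's false value.
    public static (string ApiKey, bool AllowInsecureTls) Resolve(
        string storedKey,
        bool encryptionEnabled,
        bool globalInsecureTls,
        bool? modelInsecureTls,
        Func<string, string> decrypt)
    {
        var apiKey = encryptionEnabled ? decrypt(storedKey) : storedKey;
        var insecure = globalInsecureTls || (modelInsecureTls ?? false);
        return (apiKey, insecure);
    }
}
```

Under OR composition, registered-model `AllowInsecureTls=false` plus global `VllmAllowInsecureTls=true` yields true, matching the second bullet above; the tests only fix these specific combinations, so the exact composition operator is an inference.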
### 3) Mid-term execution frame documented
- Added the three-axis fixed operation (logic stabilization → feature parity → UX alignment) and completion criteria (build/regression/docs) to `docs/NEXT_ROADMAP.md`.
### 4) Quality gate
- `dotnet build src/AxCopilot/AxCopilot.csproj -c Debug -p:UseSharedCompilation=false -nodeReuse:false` passed (0 warnings, 0 errors).
- `dotnet test src/AxCopilot.Tests/AxCopilot.Tests.csproj --filter "FullyQualifiedName~ContextCondenserTests|FullyQualifiedName~LlmRuntimeOverrideTests|FullyQualifiedName~OperationModePolicyTests|FullyQualifiedName~OperationModeReadinessTests|FullyQualifiedName~ChatWindowSlashPolicyTests"` passed (65 passed, 0 failed).


@@ -109,3 +109,25 @@
- Split ChatWindow's large slash-command dictionary into SlashCommandCatalog.
- ChatWindow now looks commands up through the catalog API (MatchBuiltinCommands, TryGetEntry).
- Result: lower input-layer coupling and a smaller change surface for future slash extension/cleanup.
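The catalog boundary described above can be sketched like this. The two method names come from the note itself; the entry shape and the sample commands are illustrative assumptions:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public sealed record SlashCommandEntry(string Name, string Description);

public static class SlashCommandCatalogSketch
{
    // Illustrative entries only; the real catalog holds the full
    // dictionary that previously lived inside ChatWindow.
    private static readonly Dictionary<string, SlashCommandEntry> Entries =
        new(StringComparer.OrdinalIgnoreCase)
        {
            ["/compact"] = new("/compact", "Condense older conversation context"),
            ["/permissions"] = new("/permissions", "Open the permission settings"),
            ["/mcp"] = new("/mcp", "Manage MCP servers"),
        };

    // Prefix lookup used while the user is typing in the composer.
    public static IReadOnlyList<SlashCommandEntry> MatchBuiltinCommands(string prefix) =>
        Entries.Values
            .Where(e => e.Name.StartsWith(prefix, StringComparison.OrdinalIgnoreCase))
            .OrderBy(e => e.Name, StringComparer.Ordinal)
            .ToList();

    // Exact lookup used when a command is executed.
    public static bool TryGetEntry(string name, out SlashCommandEntry? entry) =>
        Entries.TryGetValue(name, out entry);
}
```

Keeping the dictionary behind two narrow lookup methods is what shrinks the change surface: new commands touch only the catalog, not ChatWindow.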
## 2026-04-04 execution frame fixed (mid-term plan)
Updated: 2026-04-04 14:47 (KST)
### Three-axis fixed operation
1. Logic stabilization: harden the permission / operation-mode / compact / model-connection paths test-first.
2. Feature parity: verify the major claw-code commands (/compact, /permissions, /mcp, /chrome) through execution scenarios.
3. UX alignment: apply top-bar/composer/popup density changes only within the scope where logic stabilization is complete.
### Completion criteria for this cycle
- Build: `dotnet build` with 0 warnings / 0 errors.
- Core regressions: the filtered tests for operation mode, permissions, slash, model connection, and compact all pass.
- Docs: synchronize timestamped history into `README.md` and `docs/DEVELOPMENT.md`, then commit/push.
### Items actually hardened this cycle
- `ContextCondenserTests` added:
  - confirms no change when proactive is disabled
  - confirms truncation of large tool_result payloads
- `LlmRuntimeOverrideTests` extended:
  - verifies runtime decryption of the encrypted vLLM API key
  - verifies the composition rule for the registered-model/global TLS bypass flags


@@ -1,5 +1,5 @@
-using System.Reflection;
 using AxCopilot.Models;
 using AxCopilot.Services;
+using AxCopilot.Services.Agent;
 using FluentAssertions;
 using Xunit;
@@ -9,87 +9,71 @@ namespace AxCopilot.Tests.Services;
 public class ContextCondenserTests
 {
     [Fact]
-    public void TruncateToolResults_PreservesMessageMetadataOnCompression()
+    public async Task CondenseIfNeededAsync_WhenProactiveDisabled_ShouldNotChangeMessages()
     {
-        var messages = new List<ChatMessage>
-        {
-            new()
-            {
-                Role = "assistant",
-                Content = "{\"type\":\"tool_result\",\"output\":\"" + new string('a', 4200) + "\"}",
-                Timestamp = new DateTime(2026, 4, 3, 1, 0, 0),
-                MetaKind = "tool_result",
-                MetaRunId = "run-1",
-                Feedback = "like",
-                AttachedFiles = [@"E:\sample\a.txt"],
-                Images =
-                [
-                    new ImageAttachment
-                    {
-                        FileName = "image.png",
-                        MimeType = "image/png",
-                        Base64 = "AAA"
-                    }
-                ]
-            },
-            new() { Role = "user", Content = "recent-1" },
-            new() { Role = "assistant", Content = "recent-2" },
-            new() { Role = "user", Content = "recent-3" },
-            new() { Role = "assistant", Content = "recent-4" },
-            new() { Role = "user", Content = "recent-5" },
-            new() { Role = "assistant", Content = "recent-6" },
-        };
-        var changed = InvokePrivateStatic<bool>("TruncateToolResults", messages);
+        var settings = new SettingsService();
+        settings.Settings.Llm.Service = "ollama";
+        settings.Settings.Llm.Model = "test-model";
+        using var llm = new LlmService(settings);
+        var messages = BuildLargeConversation();
+        var before = messages.Select(m => m.Content).ToList();
-        changed.Should().BeTrue();
-        messages[0].MetaKind.Should().Be("tool_result");
-        messages[0].MetaRunId.Should().Be("run-1");
-        messages[0].Feedback.Should().Be("like");
-        messages[0].AttachedFiles.Should().ContainSingle().Which.Should().Be(@"E:\sample\a.txt");
-        messages[0].Images.Should().ContainSingle();
-        messages[0].Images![0].FileName.Should().Be("image.png");
-        messages[0].Content.Length.Should().BeLessThan(4200);
+        var changed = await ContextCondenser.CondenseIfNeededAsync(
+            messages,
+            llm,
+            maxOutputTokens: 2_000,
+            proactiveEnabled: false,
+            triggerPercent: 80,
+            force: false,
+            CancellationToken.None);
+        changed.Should().BeFalse();
+        messages.Select(m => m.Content).Should().Equal(before);
     }

     [Fact]
-    public void TruncateToolResults_PreservesMetadataForLongAssistantMessage()
+    public async Task CondenseIfNeededAsync_ShouldTruncateLargeToolResult_WithoutSummarizeCall()
     {
-        var messages = new List<ChatMessage>
-        {
-            new()
-            {
-                Role = "assistant",
-                Content = new string('b', 5000),
-                Timestamp = new DateTime(2026, 4, 3, 1, 5, 0),
-                MetaKind = "analysis",
-                MetaRunId = "run-2",
-                AttachedFiles = [@"E:\sample\b.txt"],
-            },
-            new() { Role = "user", Content = "recent-1" },
-            new() { Role = "assistant", Content = "recent-2" },
-            new() { Role = "user", Content = "recent-3" },
-            new() { Role = "assistant", Content = "recent-4" },
-            new() { Role = "user", Content = "recent-5" },
-            new() { Role = "assistant", Content = "recent-6" },
-        };
-        var changed = InvokePrivateStatic<bool>("TruncateToolResults", messages);
+        var settings = new SettingsService();
+        settings.Settings.Llm.Service = "ollama";
+        settings.Settings.Llm.Model = "test-model";
+        using var llm = new LlmService(settings);
+        var messages = BuildLargeConversation();
+        var changed = await ContextCondenser.CondenseIfNeededAsync(
+            messages,
+            llm,
+            maxOutputTokens: 2_000,
+            proactiveEnabled: true,
+            triggerPercent: 80,
+            force: false,
+            CancellationToken.None);
         changed.Should().BeTrue();
-        messages[0].MetaKind.Should().Be("analysis");
-        messages[0].MetaRunId.Should().Be("run-2");
-        messages[0].AttachedFiles.Should().ContainSingle().Which.Should().Be(@"E:\sample\b.txt");
-        messages[0].Content.Length.Should().BeLessThan(5000);
+        messages.Any(m => (m.Content ?? "").Contains("[축약됨", StringComparison.Ordinal)).Should().BeTrue();
     }

-    private static T InvokePrivateStatic<T>(string methodName, params object?[] arguments)
+    private static List<ChatMessage> BuildLargeConversation()
     {
-        var method = typeof(ContextCondenser).GetMethod(methodName, BindingFlags.NonPublic | BindingFlags.Static);
-        method.Should().NotBeNull();
-        var result = method!.Invoke(null, arguments);
-        result.Should().NotBeNull();
-        return (T)result!;
+        var largeOutput = new string('A', 9_000);
+        var toolJson = "{\"type\":\"tool_result\",\"output\":\"" + largeOutput + "\",\"success\":true}";
+        return
+        [
+            new ChatMessage { Role = "system", Content = "system prompt" },
+            new ChatMessage { Role = "user", Content = "첫 질문" },
+            new ChatMessage { Role = "assistant", Content = toolJson }, // placed in the older segment
+            new ChatMessage { Role = "assistant", Content = "첫 답변" },
+            new ChatMessage { Role = "user", Content = "둘째 질문" },
+            new ChatMessage { Role = "assistant", Content = "둘째 답변" },
+            new ChatMessage { Role = "user", Content = "셋째 질문" },
+            new ChatMessage { Role = "assistant", Content = "셋째 답변" },
+            new ChatMessage { Role = "user", Content = "넷째 질문" },
+            new ChatMessage { Role = "assistant", Content = "넷째 답변" },
+            new ChatMessage { Role = "user", Content = "다섯째 질문" },
+            new ChatMessage { Role = "assistant", Content = "다섯째 답변" },
+        ];
     }
 }


@@ -108,4 +108,152 @@ public class LlmRuntimeOverrideTests
method.Should().NotBeNull();
return (T)method!.Invoke(instance, null)!;
}
[Fact]
public void ResolveServerInfo_VllmGlobalInsecureTls_ShouldBeApplied()
{
var settings = new SettingsService();
settings.Settings.Llm.Service = "vllm";
settings.Settings.Llm.Model = "vllm-model";
settings.Settings.Llm.VllmEndpoint = "https://vllm.internal";
settings.Settings.Llm.VllmApiKey = "global-key";
settings.Settings.Llm.VllmAllowInsecureTls = true;
settings.Settings.Llm.EncryptionEnabled = false;
using var llm = new LlmService(settings);
var method = typeof(LlmService).GetMethod("ResolveServerInfo", BindingFlags.NonPublic | BindingFlags.Instance);
method.Should().NotBeNull();
var tuple = ((string Endpoint, string ApiKey, bool AllowInsecureTls))method!.Invoke(llm, null)!;
tuple.Endpoint.Should().Be("https://vllm.internal");
tuple.ApiKey.Should().Be("global-key");
tuple.AllowInsecureTls.Should().Be(true);
}
[Fact]
public void ResolveServerInfo_RegisteredModelOverride_ShouldUseEndpointAndApiKey()
{
var settings = new SettingsService();
settings.Settings.Llm.Service = "vllm";
settings.Settings.Llm.Model = "corp-vllm-model";
settings.Settings.Llm.VllmEndpoint = "https://fallback.internal";
settings.Settings.Llm.VllmApiKey = "fallback-key";
settings.Settings.Llm.VllmAllowInsecureTls = false;
settings.Settings.Llm.EncryptionEnabled = false;
settings.Settings.Llm.RegisteredModels =
[
new RegisteredModel
{
Alias = "corp",
EncryptedModelName = "corp-vllm-model",
Service = "vllm",
Endpoint = "https://model.internal",
ApiKey = "model-key",
AllowInsecureTls = true
}
];
using var llm = new LlmService(settings);
var method = typeof(LlmService).GetMethod("ResolveServerInfo", BindingFlags.NonPublic | BindingFlags.Instance);
method.Should().NotBeNull();
var tuple = ((string Endpoint, string ApiKey, bool AllowInsecureTls))method!.Invoke(llm, null)!;
tuple.Endpoint.Should().Be("https://model.internal");
tuple.ApiKey.Should().Be("model-key");
tuple.AllowInsecureTls.Should().Be(true);
}
[Fact]
public void ResolveServerInfo_VllmEncryptedApiKey_ShouldBeDecryptedAtRuntime()
{
var settings = new SettingsService();
settings.Settings.Llm.Service = "vllm";
settings.Settings.Llm.Model = "corp-vllm-model";
settings.Settings.Llm.VllmEndpoint = "https://secure.internal";
settings.Settings.Llm.EncryptionEnabled = true;
settings.Settings.Llm.VllmApiKey = CryptoService.EncryptIfEnabled("enc-key-value", true);
using var llm = new LlmService(settings);
var method = typeof(LlmService).GetMethod("ResolveServerInfo", BindingFlags.NonPublic | BindingFlags.Instance);
method.Should().NotBeNull();
var tuple = ((string Endpoint, string ApiKey, bool AllowInsecureTls))method!.Invoke(llm, null)!;
tuple.Endpoint.Should().Be("https://secure.internal");
tuple.ApiKey.Should().Be("enc-key-value");
}
[Fact]
public void ResolveServerInfo_RegisteredModelInsecureFalse_GlobalInsecureTrue_ShouldRemainTrue()
{
var settings = new SettingsService();
settings.Settings.Llm.Service = "vllm";
settings.Settings.Llm.Model = "corp-vllm-model";
settings.Settings.Llm.VllmEndpoint = "https://fallback.internal";
settings.Settings.Llm.VllmApiKey = "fallback-key";
settings.Settings.Llm.VllmAllowInsecureTls = true;
settings.Settings.Llm.EncryptionEnabled = false;
settings.Settings.Llm.RegisteredModels =
[
new RegisteredModel
{
Alias = "corp",
EncryptedModelName = "corp-vllm-model",
Service = "vllm",
Endpoint = "https://model.internal",
ApiKey = "model-key",
AllowInsecureTls = false
}
];
using var llm = new LlmService(settings);
var method = typeof(LlmService).GetMethod("ResolveServerInfo", BindingFlags.NonPublic | BindingFlags.Instance);
method.Should().NotBeNull();
var tuple = ((string Endpoint, string ApiKey, bool AllowInsecureTls))method!.Invoke(llm, null)!;
tuple.AllowInsecureTls.Should().BeTrue();
}
[Fact]
public void GetRuntimeConnectionSnapshot_Vllm_ShouldExposeMaskedRuntimeInputs()
{
var settings = new SettingsService();
settings.Settings.Llm.Service = "vllm";
settings.Settings.Llm.Model = "corp-vllm-model";
settings.Settings.Llm.VllmEndpoint = "https://model.internal:8443";
settings.Settings.Llm.VllmApiKey = "model-key";
settings.Settings.Llm.VllmAllowInsecureTls = true;
settings.Settings.Llm.EncryptionEnabled = false;
using var llm = new LlmService(settings);
var snapshot = llm.GetRuntimeConnectionSnapshot();
snapshot.Service.Should().Be("vllm");
snapshot.Model.Should().Be("corp-vllm-model");
snapshot.Endpoint.Should().Be("https://model.internal:8443");
snapshot.AllowInsecureTls.Should().BeTrue();
snapshot.HasApiKey.Should().BeTrue();
}
[Fact]
public void GetRuntimeConnectionSnapshot_OllamaWithoutKey_ShouldReportNoKey()
{
var settings = new SettingsService();
settings.Settings.Llm.Service = "ollama";
settings.Settings.Llm.Model = "qwen2.5-coder";
settings.Settings.Llm.OllamaEndpoint = "http://localhost:11434";
settings.Settings.Llm.OllamaApiKey = "";
settings.Settings.Llm.EncryptionEnabled = false;
using var llm = new LlmService(settings);
var snapshot = llm.GetRuntimeConnectionSnapshot();
snapshot.Service.Should().Be("ollama");
snapshot.Model.Should().Be("qwen2.5-coder");
snapshot.Endpoint.Should().Be("http://localhost:11434");
snapshot.AllowInsecureTls.Should().BeFalse();
snapshot.HasApiKey.Should().BeFalse();
}
}