Merged
6 changes: 3 additions & 3 deletions AGENTS.md
@@ -5,7 +5,7 @@
These rules are enforced by CI (`cargo clippy -D warnings`, Biome). Fixing them afterwards is wasted effort — emit code in the correct shape the FIRST time. Every CI failure caused by one of these rules means the agent didn't read this section.

### Zero-tolerance rules
- **No code comments anywhere.** Not `//`, `/* */`, `///`, `//!`, `#`, JSDoc, nor doc-strings injected into new code. Code must be self-explanatory via naming and types. This applies to every language: Rust, TS, JS, Python, shell, SQL, TOML, etc.
- **Default to no code comments. Add a comment only after solving a bug or working through a complex issue, and only when it captures non-obvious context that a future investigator or reviewer genuinely needs** — e.g. why the fix looks the way it does, the upstream/platform bug being worked around, a non-obvious invariant or trade-off chosen after investigation, or a link to the PR/issue that explains the decision. Bad cases that remain banned: narrating what the code does, restating types, JSDoc that paraphrases parameter names, "TODO: refactor" or "this should be cleaner" notes, and any comment that just describes the change you are currently making. When in doubt, prefer better naming/types over a comment. Applies to every language: Rust, TS, JS, Python, shell, SQL, TOML, etc.
- **Never edit generated files**: `**/tauri.ts`, `**/queries.ts`, `apps/desktop/src-tauri/gen/**`, `packages/ui-solid/src/auto-imports.d.ts`, Drizzle migration SQL under `packages/database/migrations/`.
- **Never start additional dev servers** (`pnpm dev`, `pnpm dev:web`, `pnpm dev:desktop`, Docker services). Assume they are already running.

@@ -69,7 +69,7 @@ Additionally, `unused_must_use = "deny"` applies to all Rust code: every `Result`
- Naming: files kebab‑case (`user-menu.tsx`); React/Solid components PascalCase; hooks `useX`; Rust modules snake_case; crates kebab‑case.
- Runtime: Node 20, pnpm 10.5.2, Rust 1.88+, Docker for MySQL/MinIO.

(See **Pre-Generation Invariants** at the top of this file for the zero-comments rule and the denied clippy/Biome patterns. Those are the source of truth — do not duplicate or weaken them here.)
(See **Pre-Generation Invariants** at the top of this file for the comments policy and the denied clippy/Biome patterns. Those are the source of truth — do not duplicate or weaken them here.)

## Testing
- TS/JS: Vitest where present (e.g., desktop). Name tests `*.test.ts(x)` near sources.
@@ -86,7 +86,7 @@ Additionally, `unused_must_use = "deny"` applies to all Rust code: every `Result`
- Database flow: always `db:generate` → `db:push` before relying on new schema.
- Keep secrets out of VCS; configure via `.env` from `pnpm env-setup`.
- macOS note: desktop permissions (screen/mic) apply to the terminal running `pnpm dev:desktop`.
- All other agent-facing rules (no comments, no editing generated files, clippy/Biome shape, post-edit gates) live in **Pre-Generation Invariants** at the top of this file.
- All other agent-facing rules (comments policy, no editing generated files, clippy/Biome shape, post-edit gates) live in **Pre-Generation Invariants** at the top of this file.

## Effect Usage
- Next.js API routes in `apps/web/app/api/*` are built with `@effect/platform`'s `HttpApi` builder; copy the existing class/group/endpoint pattern instead of ad-hoc handlers.
4 changes: 2 additions & 2 deletions CLAUDE.md
@@ -7,7 +7,7 @@ This file provides comprehensive guidance to Claude Code when working with code
These rules are enforced by CI (`cargo clippy -D warnings`, Biome). Fixing violations after the fact is wasted effort — emit code in the correct shape the FIRST time. Every CI failure tied to a rule below means this section was not respected.

### Zero-tolerance rules
- **No code comments anywhere.** Not `//`, `/* */`, `///`, `//!`, `#`, JSDoc, nor doc strings injected into new code. Applies to Rust, TS, JS, Python, shell, SQL, TOML — every language. Code must explain itself through naming and types.
- **Default to no code comments. Add a comment only after solving a bug or working through a complex issue, and only when it captures non-obvious context that a future investigator or reviewer genuinely needs.** Good cases: why a fix looks the way it does, the upstream/platform bug being worked around, a non-obvious invariant or trade-off chosen after investigation, a link to the PR/issue that explains the decision. Bad cases that remain banned: narrating what the code does, restating types, JSDoc that paraphrases parameter names, "TODO: refactor" or "this should be cleaner" notes, and any comment explaining the change you are currently making. When in doubt, prefer better naming/types over a comment. Applies to every language: Rust, TS, JS, Python, shell, SQL, TOML, etc.
- **Never edit generated files**: `**/tauri.ts`, `**/queries.ts`, `apps/desktop/src-tauri/gen/**`, `packages/ui-solid/src/auto-imports.d.ts`.
- **Never start additional dev servers** (`pnpm dev`, `pnpm dev:web`, `pnpm dev:desktop`, Docker). Assume the developer has them running.

@@ -412,7 +412,7 @@ Minimize `useEffect` usage: compute during render, handle logic in event handler
- Components: PascalCase; hooks: camelCase starting with `use`; Rust modules snake_case; crates kebab-case.
- Biome formats and lints TS/JS/JSON/CSS (tab indent, double quotes, organizeImports). rustfmt + the workspace clippy lints handle Rust.

The zero-comment rule, the denied clippy patterns, and the Biome style invariants all live in **Pre-Generation Invariants** at the top of this file — that section is authoritative.
The comments policy, the denied clippy patterns, and the Biome style invariants all live in **Pre-Generation Invariants** at the top of this file — that section is authoritative.

## Rust Clippy Rules (Workspace Lints)

107 changes: 104 additions & 3 deletions apps/desktop/src-tauri/src/export.rs
@@ -6,17 +6,57 @@ use cap_rendering::{
FrameRenderer, ProjectRecordingsMeta, ProjectUniforms, RenderSegment, RenderVideoConstants,
RendererLayers, ZoomFocusInterpolator, spring_mass_damper::SpringMassDamperSimulationConfig,
};
use futures::FutureExt;
use image::codecs::jpeg::JpegEncoder;
use serde::{Deserialize, Serialize};
use specta::Type;
use std::any::Any;
use std::panic::AssertUnwindSafe;
use std::{
path::{Path, PathBuf},
sync::{
Arc,
atomic::{AtomicBool, Ordering},
},
};
use tracing::{info, instrument};
use tracing::{error, info, instrument};

fn panic_message(panic: Box<dyn Any + Send>) -> String {
if let Some(msg) = panic.downcast_ref::<&str>() {
msg.to_string()
} else if let Some(msg) = panic.downcast_ref::<String>() {
msg.clone()
} else {
"unknown panic".to_string()
}
}
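
The `&str`/`String` double downcast in `panic_message` is needed because `panic!` with a bare string literal stores a `&'static str` payload, while any formatted `panic!` message stores a `String`. A std-only sketch of the same recovery (helper names hypothetical):

```rust
use std::panic::{self, AssertUnwindSafe};

// Mirrors panic_message: a panic payload is a Box<dyn Any + Send>,
// holding &'static str for literal messages and String for formatted ones.
fn payload_text(payload: Box<dyn std::any::Any + Send>) -> String {
    if let Some(msg) = payload.downcast_ref::<&str>() {
        (*msg).to_string()
    } else if let Some(msg) = payload.downcast_ref::<String>() {
        msg.clone()
    } else {
        "unknown panic".to_string()
    }
}

// Runs a closure, converting any panic into its message.
fn capture_panic(f: impl FnOnce()) -> Option<String> {
    panic::catch_unwind(AssertUnwindSafe(f))
        .err()
        .map(payload_text)
}
```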

async fn run_protected_export(
project_path: &Path,
settings: &ExportSettings,
progress: &tauri::ipc::Channel<FramesRendered>,
force_ffmpeg: bool,
) -> Result<PathBuf, String> {
match AssertUnwindSafe(do_export(project_path, settings, progress, force_ffmpeg))
.catch_unwind()
.await
{
Ok(result) => result,
Err(panic) => {
let panic_msg = panic_message(panic);
error!(
target: "cap_desktop_export",
panic = %panic_msg,
"export task panicked"
);
sentry::capture_message(
&format!("Export task panicked: {panic_msg}"),
sentry::Level::Error,
);
Err("Export failed unexpectedly".to_string())
}
}
}
Comment on lines +41 to +59 (Contributor):

**P2: `AssertUnwindSafe` on retry path**

`AssertUnwindSafe` is correct here — caught panics return `Err` and the export is abandoned rather than retried into potentially corrupted state. One thing worth double-checking: the retry in `export_video` (with `force_ffmpeg: true`) runs immediately after a panic in the GPU path. If the panic leaked an exclusive lock on a WGPU device or left a GPU command encoder in an open state, the retry (even on the software/ffmpeg path) may encounter those residual locks. If you see intermittent hangs on the retry leg during Windows testing, that would be the likely cause.


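The retry flow the comment describes reduces to a small synchronous sketch (all names hypothetical): the panic guard turns a first-attempt panic into an ordinary `Err`, which then feeds the same fallback leg as any other failure:

```rust
use std::panic::{self, AssertUnwindSafe};

// Stand-in for do_export: the GPU path panics, the ffmpeg path succeeds.
fn attempt(force_ffmpeg: bool) -> Result<&'static str, String> {
    if force_ffmpeg {
        Ok("exported via ffmpeg")
    } else {
        panic!("gpu renderer crashed");
    }
}

// Stand-in for run_protected_export: panics become plain Err values.
fn run_protected(force_ffmpeg: bool) -> Result<&'static str, String> {
    panic::catch_unwind(AssertUnwindSafe(|| attempt(force_ffmpeg)))
        .unwrap_or_else(|_| Err("export failed unexpectedly".to_string()))
}

// Stand-in for export_video's retry leg: fall back to ffmpeg on any failure.
fn export_with_retry() -> Result<&'static str, String> {
    run_protected(false).or_else(|_| run_protected(true))
}
```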
struct ExportActiveGuard<'a>(&'a AtomicBool);

@@ -193,7 +233,7 @@ pub async fn export_video(
wait_for_export_preview_idle(&ed.export_preview_active).await;
}

let result = do_export(&project_path, &settings, &progress, force_ffmpeg).await;
let result = run_protected_export(&project_path, &settings, &progress, force_ffmpeg).await;

match result {
Ok(path) => {
@@ -206,7 +246,8 @@
e
);

let retry_result = do_export(&project_path, &settings, &progress, true).await;
let retry_result =
run_protected_export(&project_path, &settings, &progress, true).await;

match retry_result {
Ok(path) => {
@@ -351,6 +392,37 @@ pub async fn generate_export_preview(
project_path: PathBuf,
frame_time: f64,
settings: ExportPreviewSettings,
) -> Result<ExportPreviewResult, String> {
match AssertUnwindSafe(generate_export_preview_inner(
project_path,
frame_time,
settings,
))
.catch_unwind()
.await
{
Ok(result) => result,
Err(panic) => {
let panic_msg = panic_message(panic);
error!(
target: "cap_desktop_export",
panic = %panic_msg,
"generate_export_preview panicked"
);
sentry::capture_message(
&format!("Export preview panicked: {panic_msg}"),
sentry::Level::Error,
);
Err("Export preview failed unexpectedly".to_string())
}
}
}

#[instrument(skip_all)]
async fn generate_export_preview_inner(
project_path: PathBuf,
frame_time: f64,
settings: ExportPreviewSettings,
) -> Result<ExportPreviewResult, String> {
use base64::{Engine, engine::general_purpose::STANDARD};
use cap_editor::create_segments;
@@ -593,6 +665,35 @@ pub async fn generate_export_preview_fast(
editor: WindowEditorInstance,
frame_time: f64,
settings: ExportPreviewSettings,
) -> Result<ExportPreviewResult, String> {
match AssertUnwindSafe(generate_export_preview_fast_inner(
editor, frame_time, settings,
))
.catch_unwind()
.await
{
Ok(result) => result,
Err(panic) => {
let panic_msg = panic_message(panic);
error!(
target: "cap_desktop_export",
panic = %panic_msg,
"generate_export_preview_fast panicked"
);
sentry::capture_message(
&format!("Export preview panicked: {panic_msg}"),
sentry::Level::Error,
);
Err("Export preview failed unexpectedly".to_string())
}
}
}

#[instrument(skip_all)]
async fn generate_export_preview_fast_inner(
editor: WindowEditorInstance,
frame_time: f64,
settings: ExportPreviewSettings,
) -> Result<ExportPreviewResult, String> {
use base64::{Engine, engine::general_purpose::STANDARD};
use std::time::Instant;
89 changes: 85 additions & 4 deletions apps/desktop/src-tauri/src/main.rs
@@ -4,7 +4,7 @@
use std::sync::Arc;

use cap_desktop_lib::DynLoggingLayer;
use tracing_subscriber::{layer::SubscriberExt, util::SubscriberInitExt};
use tracing_subscriber::{Layer, layer::SubscriberExt, util::SubscriberInitExt};

fn main() {
#[cfg(debug_assertions)]
@@ -71,8 +71,11 @@ fn main() {
eprintln!("Failed to create logs directory: {e}");
});

let file_appender = tracing_appender::rolling::daily(&logs_dir, "cap-desktop.log");
let (non_blocking, _logger_guard) = tracing_appender::non_blocking(file_appender);
let info_file_appender = tracing_appender::rolling::daily(&logs_dir, "cap-desktop.log");
let (info_file_writer, _info_logger_guard) = tracing_appender::non_blocking(info_file_appender);

let errors_file_appender =
tracing_appender::rolling::daily(&logs_dir, "cap-desktop-errors.log");

let (otel_layer, _tracer) = if cfg!(debug_assertions) {
use opentelemetry::trace::TracerProvider;
@@ -125,10 +128,19 @@
tracing_subscriber::fmt::layer()
.with_ansi(false)
.with_target(true)
.with_writer(non_blocking),
.with_writer(info_file_writer),
Review comment:

Right now WARN/ERROR events will still go to `cap-desktop.log` as well as `cap-desktop-errors.log` (since the info-file fmt layer has no level filter).

If the goal is truly "split logs" (and to avoid double I/O on hot paths), you can filter the info-file layer so it keeps only INFO-and-more-verbose events (in `tracing`, WARN and ERROR compare *less than* INFO):

Suggested change
.with_writer(info_file_writer),
.with_writer(info_file_writer)
    .with_filter(tracing_subscriber::filter::filter_fn(|m| *m.level() >= tracing::Level::INFO)),

)
.with(
tracing_subscriber::fmt::layer()
.with_ansi(false)
.with_target(true)
.with_writer(errors_file_appender)
.with_filter(tracing_subscriber::filter::LevelFilter::WARN),
Review comment:

Since `Layer` is now imported at module scope (needed for `.with_filter(...)` here), the `use tracing_subscriber::Layer;` inside the debug OTEL block becomes redundant and may trip `unused_imports` under `-D warnings`. Worth dropping the inner `use`.

)
.init();

install_panic_hook(logs_dir.clone());

#[cfg(debug_assertions)]
sentry::configure_scope(|scope| {
scope.set_user(Some(sentry::User {
@@ -143,3 +155,72 @@
.expect("Failed to build multi threaded tokio runtime")
.block_on(cap_desktop_lib::run(handle, logs_dir));
}

fn install_panic_hook(logs_dir: std::path::PathBuf) {
let prev = std::panic::take_hook();
let panics_log = logs_dir.join("panics.log");
std::panic::set_hook(Box::new(move |info| {
let location = info
.location()
.map(|l| format!("{}:{}:{}", l.file(), l.line(), l.column()))
.unwrap_or_else(|| "<unknown>".to_string());
let message = info
.payload()
Comment on lines +163 to +168 (Contributor):

**P2: `RUST_BACKTRACE` has no effect on `Backtrace::force_capture()`**

`std::backtrace::Backtrace::force_capture()` (used in the panic hook) always captures a backtrace regardless of the `RUST_BACKTRACE` environment variable — that is the point of `force_capture` vs `capture`. The env-var block therefore only affects the *default* Rust panic handler (the `prev(info)` call at the end of the hook). This works as intended for the default handler, but the block is misleading and could be removed without changing what ends up in `panics.log`.

.downcast_ref::<&str>()
.map(|s| (*s).to_string())
.or_else(|| info.payload().downcast_ref::<String>().cloned())
.unwrap_or_else(|| "<no message>".to_string());
let backtrace = std::backtrace::Backtrace::force_capture();
let thread = std::thread::current();
let thread_name = thread.name().unwrap_or("<unnamed>").to_string();
let timestamp = chrono::Utc::now().to_rfc3339();
let pid = std::process::id();

write_panic_record(
&panics_log,
&timestamp,
pid,
&thread_name,
&location,
&message,
&backtrace,
Contributor comment:

**P2: Synchronous file writer in async context**

Replacing `tracing_appender::non_blocking` with a direct `RollingFileAppender` makes every `tracing` call block the calling thread until the write is flushed to disk. Inside Tokio worker threads (recording pipeline, export, preview rendering) this can stall the runtime and introduce latency spikes. If the goal is to ensure no logs are dropped on Windows, consider keeping `non_blocking` but explicitly flushing the guard on shutdown, or using `tracing_appender::non_blocking` with `lossy = false`.

);

tracing::error!(
target: "cap_desktop_panic",
location = %location,
thread = %thread_name,
message = %message,
backtrace = %backtrace,
"panic"
);
eprintln!(
"[cap-desktop panic] thread '{thread_name}' at {location}: {message}\nbacktrace:\n{backtrace}"
);
prev(info);
}));
}

fn write_panic_record(
path: &std::path::Path,
timestamp: &str,
pid: u32,
thread_name: &str,
location: &str,
message: &str,
backtrace: &std::backtrace::Backtrace,
) {
use std::io::Write;
let Ok(mut file) = std::fs::OpenOptions::new()
.create(true)
.append(true)
.open(path)
else {
return;
};
let _ = writeln!(
file,
"[{timestamp}] pid={pid} thread='{thread_name}' at {location}: {message}\n{backtrace}\n----"
);
let _ = file.flush();
}
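
The non-blocking-versus-lossless trade-off from the review comment can be sketched with a plain `mpsc` channel standing in for `tracing_appender::non_blocking` (all names hypothetical): callers hand lines to a background worker without touching disk, and dropping the last sender plays the role of the `WorkerGuard` flush on shutdown:

```rust
use std::sync::mpsc;
use std::thread;

// Hypothetical stand-in for tracing_appender's non-blocking writer:
// log calls only enqueue; a worker thread owns the (simulated) file.
struct NonBlockingWriter {
    tx: mpsc::Sender<String>,
}

impl NonBlockingWriter {
    fn log(&self, line: &str) {
        // Never blocks on I/O; an unbounded channel means no lines are dropped.
        let _ = self.tx.send(line.to_string());
    }
}

// Plays the role of WorkerGuard: joining it drains everything still queued.
struct FlushGuard {
    handle: thread::JoinHandle<Vec<String>>,
}

impl FlushGuard {
    fn flush(self) -> Vec<String> {
        self.handle.join().expect("writer thread panicked")
    }
}

fn non_blocking_sink() -> (NonBlockingWriter, FlushGuard) {
    let (tx, rx) = mpsc::channel();
    // The worker ends once every sender is dropped, returning all lines it
    // received; a real implementation would write them to the log file.
    let handle = thread::spawn(move || rx.into_iter().collect());
    (NonBlockingWriter { tx }, FlushGuard { handle })
}
```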
4 changes: 2 additions & 2 deletions apps/desktop/src/components/ModeSelect.tsx
@@ -15,7 +15,7 @@
const ModeOption = (props: ModeOptionProps) => {
return (
<div
data-tauri-drag-region="none"
data-tauri-drag-region="false"
onClick={() => props.onSelect(props.mode)}
class={cx(
"relative flex flex-col items-center rounded-xl border-2 transition-all duration-200 cursor-pointer overflow-hidden group",
@@ -84,7 +84,7 @@ const ModeSelect = (props: { onClose?: () => void; standalone?: boolean }) => {

return (
<div
data-tauri-drag-region="none"
data-tauri-drag-region="false"
class={cx(
"relative",
props.standalone
2 changes: 1 addition & 1 deletion apps/desktop/src/routes/(window-chrome).tsx
@@ -114,7 +114,7 @@ function Inner(props: ParentProps) {

return (
<div
data-tauri-drag-region="none"
data-tauri-drag-region="false"
class="flex overflow-y-hidden flex-col flex-1 animate-in fade-in"
>
{props.children}