Merge pull request #22 from SteelDynamite/audit/fixes

audit/fixes
SteelDynamite 2026-04-02 17:39:00 +01:00 committed by GitHub
commit 45cfa8e5ec
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
31 changed files with 322 additions and 1348 deletions

View file

@@ -32,6 +32,7 @@ Two-crate workspace (`resolver = "2"`, edition 2021) plus a Tauri app:
 - **onyx-core** — Pure Rust library. Storage trait with `FileSystemStorage` implementation, `TaskRepository` (main API), data models, config, error types. No CLI/UI dependencies. `keyring` feature-gated behind `keyring-storage` (default on) for Android compatibility.
 - **onyx-cli** — CLI frontend using clap. Commands are in `src/commands/` (init, workspace, list, task, group). Output formatting in `src/output.rs`.
 - **apps/tauri/** — Tauri v2 GUI. Svelte 5 frontend in `src/`, Rust backend in `src-tauri/` with Tauri commands that call into `onyx-core`. `notify` crate feature-gated for Android.

 ### Key patterns

 - **Storage trait** (`storage.rs`): Strategy pattern for task persistence. `FileSystemStorage` reads/writes markdown files with YAML frontmatter and JSON metadata files.

Cargo.lock (generated)

@@ -1000,6 +1000,7 @@ dependencies = [
  "tokio",
  "uuid",
  "wiremock",
+ "zeroize",
 ]

 [[package]]

PLAN.md

@@ -768,7 +768,11 @@ All Android work can be done locally on Linux. iOS must go through CI or a Mac.
 ---

-### Known Blockers
+### Tauri GUI
+
+Tauri v2 has mobile support but it's newer and less mature.
+
+#### Known Blockers

 **`notify` crate doesn't compile for mobile.** The file-watcher subsystem (`notify` + `notify-debouncer-mini` in `Cargo.toml`) does not support Android or iOS targets. The entire file-watcher initialization path must be gated behind `#[cfg(not(mobile))]` before cross-compilation will succeed.
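The gating described above can be sketched with built-in target cfgs. Tauri sets a custom `mobile` cfg via its build script; the `target_os` form below is the hand-rolled equivalent, and the function name is illustrative, not from the codebase:

```rust
// Desktop targets compile the real watcher-init path.
#[cfg(not(any(target_os = "android", target_os = "ios")))]
fn init_file_watcher() -> bool {
    // Real code would construct the notify debouncer here.
    true
}

// Mobile targets get a no-op: `notify` does not build for them.
#[cfg(any(target_os = "android", target_os = "ios"))]
fn init_file_watcher() -> bool {
    false
}

fn main() {
    // On a desktop target the watcher path is compiled in.
    assert!(init_file_watcher());
    println!("ok");
}
```

Because the `#[cfg]` attributes select one definition at compile time, the mobile build never even type-checks the `notify` code paths.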
@@ -852,17 +856,17 @@ npm run tauri ios build
 #### Desktop & Mobile

 - [x] Multiple task lists (folders)
 - [x] Switch between lists
-- [ ] Subtasks support
-- [ ] Due dates with date picker
+- [x] Subtasks support
+- [x] Due dates with date picker
 - [ ] Rich markdown editor for task notes
-- [ ] Move tasks between lists
+- [x] Move tasks between lists
 - [ ] Change storage folder location in settings
 - [ ] Search functionality
 - [x] Theme selection (light/dark mode)

 #### Desktop-Specific

 - [x] Drag & drop reordering
-- [ ] Keyboard shortcuts
+- [x] Keyboard shortcuts
 - [ ] Multiple windows (optional)

 #### Mobile-Specific


@@ -198,7 +198,7 @@ cargo test -- --nocapture
 ## What's Next?

-- **Phase 4**: Mobile support (iOS & Android via Tauri v2)
+- **Phase 4**: Mobile support (iOS & Android via Tauri v2 mobile)
 - **Phase 5**: GUI advanced features (subtasks, search, date picker)
 - **Phase 6**: Mobile polish and platform-specific integrations
 - **Phase 7**: Google Tasks importer and unique features


@@ -2394,6 +2394,7 @@ dependencies = [
  "sha2",
  "tokio",
  "uuid",
+ "zeroize",
 ]

 [[package]]


@@ -34,6 +34,11 @@ struct AppState {
     repo: Option<TaskRepository>,
 }

+/// Lock the AppState mutex, converting poisoned locks into an error string.
+fn lock_state(state: &Mutex<AppState>) -> Result<std::sync::MutexGuard<'_, AppState>, String> {
+    state.lock().map_err(|e| format!("State lock poisoned: {}", e))
+}
+
 /// Serializable sync result for the frontend.
 #[derive(Debug, Serialize, Deserialize, Clone)]
 struct SyncResult {
@@ -61,7 +66,9 @@ impl From<CoreSyncResult> for SyncResult {
 /// Suppress file watcher events for the next second (call before writes).
 #[cfg(not(target_os = "android"))]
 fn mute_watcher(_state: &mut AppState) {
-    *LAST_WRITE.lock().unwrap() = Some(Instant::now());
+    if let Ok(mut t) = LAST_WRITE.lock() {
+        *t = Some(Instant::now());
+    }
 }

 #[cfg(target_os = "android")]
@@ -81,17 +88,27 @@ fn ensure_repo(state: &mut AppState) -> Result<(), String> {
     Ok(())
 }

+/// Get an immutable reference to the repo, returning an error if not initialized.
+fn repo_ref(state: &AppState) -> Result<&TaskRepository, String> {
+    state.repo.as_ref().ok_or_else(|| "Repository not initialized".to_string())
+}
+
+/// Get a mutable reference to the repo, returning an error if not initialized.
+fn repo_mut(state: &mut AppState) -> Result<&mut TaskRepository, String> {
+    state.repo.as_mut().ok_or_else(|| "Repository not initialized".to_string())
+}
+
 // ── Config commands ──────────────────────────────────────────────────

 #[tauri::command]
 fn get_config(state: State<'_, Mutex<AppState>>) -> Result<AppConfig, String> {
-    let s = state.lock().unwrap();
+    let s = lock_state(&state)?;
     Ok(s.config.clone())
 }

 #[tauri::command]
 fn save_config(state: State<'_, Mutex<AppState>>) -> Result<(), String> {
-    let s = state.lock().unwrap();
+    let s = lock_state(&state)?;
     s.config.save_to_file(&s.config_path).map_err(|e| e.to_string())
 }
@@ -101,7 +118,7 @@ fn add_workspace(
     path: String,
     state: State<'_, Mutex<AppState>>,
 ) -> Result<(), String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     let ws = WorkspaceConfig::new(PathBuf::from(&path));
     s.config.add_workspace(name.clone(), ws);
     s.config
@@ -119,7 +136,7 @@ fn set_current_workspace(
     name: String,
     state: State<'_, Mutex<AppState>>,
 ) -> Result<(), String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     s.config
         .set_current_workspace(name)
         .map_err(|e| e.to_string())?;
@@ -134,7 +151,7 @@ fn remove_workspace(
     name: String,
     state: State<'_, Mutex<AppState>>,
 ) -> Result<(), String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     s.config.remove_workspace(&name);
     s.repo = None;
     s.config
@@ -155,11 +172,9 @@ fn init_workspace(path: String) -> Result<(), String> {
 #[tauri::command]
 fn get_lists(state: State<'_, Mutex<AppState>>) -> Result<Vec<TaskList>, String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     ensure_repo(&mut s)?;
-    s.repo
-        .as_ref()
-        .unwrap()
+    repo_ref(&s)?
         .get_lists()
         .map_err(|e| e.to_string())
 }
@@ -169,12 +184,10 @@ fn create_list(
     name: String,
     state: State<'_, Mutex<AppState>>,
 ) -> Result<TaskList, String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     ensure_repo(&mut s)?;
     mute_watcher(&mut s);
-    s.repo
-        .as_mut()
-        .unwrap()
+    repo_mut(&mut s)?
         .create_list(name)
         .map_err(|e| e.to_string())
 }
@@ -184,13 +197,11 @@ fn delete_list(
     list_id: String,
     state: State<'_, Mutex<AppState>>,
 ) -> Result<(), String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     ensure_repo(&mut s)?;
     mute_watcher(&mut s);
     let id = Uuid::parse_str(&list_id).map_err(|e| e.to_string())?;
-    s.repo
-        .as_mut()
-        .unwrap()
+    repo_mut(&mut s)?
         .delete_list(id)
         .map_err(|e| e.to_string())
 }
@@ -202,12 +213,10 @@ fn list_tasks(
     list_id: String,
     state: State<'_, Mutex<AppState>>,
 ) -> Result<Vec<Task>, String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     ensure_repo(&mut s)?;
     let id = Uuid::parse_str(&list_id).map_err(|e| e.to_string())?;
-    s.repo
-        .as_ref()
-        .unwrap()
+    repo_ref(&s)?
         .list_tasks(id)
         .map_err(|e| e.to_string())
 }
@@ -220,7 +229,7 @@ fn create_task(
     parent_id: Option<String>,
     state: State<'_, Mutex<AppState>>,
 ) -> Result<Task, String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     ensure_repo(&mut s)?;
     mute_watcher(&mut s);
     let id = Uuid::parse_str(&list_id).map_err(|e| e.to_string())?;
@@ -232,9 +241,7 @@ fn create_task(
         let parent_uuid = Uuid::parse_str(&pid).map_err(|e| e.to_string())?;
         task.parent_id = Some(parent_uuid);
     }
-    s.repo
-        .as_mut()
-        .unwrap()
+    repo_mut(&mut s)?
         .create_task(id, task)
         .map_err(|e| e.to_string())
 }
@@ -245,13 +252,11 @@ fn update_task(
     task: Task,
     state: State<'_, Mutex<AppState>>,
 ) -> Result<(), String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     ensure_repo(&mut s)?;
     mute_watcher(&mut s);
     let id = Uuid::parse_str(&list_id).map_err(|e| e.to_string())?;
-    s.repo
-        .as_mut()
-        .unwrap()
+    repo_mut(&mut s)?
         .update_task(id, task)
         .map_err(|e| e.to_string())
 }
@@ -262,12 +267,12 @@ fn delete_task(
     task_id: String,
     state: State<'_, Mutex<AppState>>,
 ) -> Result<(), String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     ensure_repo(&mut s)?;
     mute_watcher(&mut s);
     let lid = Uuid::parse_str(&list_id).map_err(|e| e.to_string())?;
     let tid = Uuid::parse_str(&task_id).map_err(|e| e.to_string())?;
-    let repo = s.repo.as_mut().unwrap();
+    let repo = repo_mut(&mut s)?;
     // Cascade-delete subtasks first
     let all_tasks = repo.list_tasks(lid).map_err(|e| e.to_string())?;
     let child_ids: Vec<Uuid> = all_tasks
@@ -276,7 +281,7 @@ fn delete_task(
         .map(|t| t.id)
         .collect();
     for child_id in child_ids {
-        let _ = repo.delete_task(lid, child_id);
+        repo.delete_task(lid, child_id).map_err(|e| format!("Failed to delete subtask {}: {}", child_id, e))?;
     }
     repo.delete_task(lid, tid)
         .map_err(|e| e.to_string())
@@ -288,12 +293,12 @@ fn toggle_task(
     task_id: String,
     state: State<'_, Mutex<AppState>>,
 ) -> Result<Task, String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     ensure_repo(&mut s)?;
     mute_watcher(&mut s);
     let lid = Uuid::parse_str(&list_id).map_err(|e| e.to_string())?;
     let tid = Uuid::parse_str(&task_id).map_err(|e| e.to_string())?;
-    let repo = s.repo.as_mut().unwrap();
+    let repo = repo_mut(&mut s)?;
     let mut task = repo.get_task(lid, tid).map_err(|e| e.to_string())?;
     match task.status {
         TaskStatus::Backlog => task.complete(),
@@ -322,14 +327,12 @@ fn reorder_task(
     new_position: usize,
     state: State<'_, Mutex<AppState>>,
 ) -> Result<(), String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     ensure_repo(&mut s)?;
     mute_watcher(&mut s);
     let lid = Uuid::parse_str(&list_id).map_err(|e| e.to_string())?;
     let tid = Uuid::parse_str(&task_id).map_err(|e| e.to_string())?;
-    s.repo
-        .as_mut()
-        .unwrap()
+    repo_mut(&mut s)?
         .reorder_task(lid, tid, new_position)
         .map_err(|e| e.to_string())
 }
@@ -343,15 +346,13 @@ fn move_task(
     task_id: String,
     state: State<'_, Mutex<AppState>>,
 ) -> Result<(), String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     ensure_repo(&mut s)?;
     mute_watcher(&mut s);
     let from = Uuid::parse_str(&from_list_id).map_err(|e| e.to_string())?;
     let to = Uuid::parse_str(&to_list_id).map_err(|e| e.to_string())?;
     let tid = Uuid::parse_str(&task_id).map_err(|e| e.to_string())?;
-    s.repo
-        .as_mut()
-        .unwrap()
+    repo_mut(&mut s)?
         .move_task(from, to, tid)
         .map_err(|e| e.to_string())
 }
@@ -362,13 +363,11 @@ fn rename_list(
     new_name: String,
     state: State<'_, Mutex<AppState>>,
 ) -> Result<(), String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     ensure_repo(&mut s)?;
     mute_watcher(&mut s);
     let id = Uuid::parse_str(&list_id).map_err(|e| e.to_string())?;
-    s.repo
-        .as_mut()
-        .unwrap()
+    repo_mut(&mut s)?
         .rename_list(id, new_name)
         .map_err(|e| e.to_string())
 }
@@ -379,13 +378,11 @@ fn set_group_by_due_date(
     enabled: bool,
     state: State<'_, Mutex<AppState>>,
 ) -> Result<(), String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     ensure_repo(&mut s)?;
     mute_watcher(&mut s);
     let id = Uuid::parse_str(&list_id).map_err(|e| e.to_string())?;
-    s.repo
-        .as_mut()
-        .unwrap()
+    repo_mut(&mut s)?
         .set_group_by_due_date(id, enabled)
         .map_err(|e| e.to_string())
 }
@@ -395,12 +392,10 @@ fn get_group_by_due_date(
     list_id: String,
     state: State<'_, Mutex<AppState>>,
 ) -> Result<bool, String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     ensure_repo(&mut s)?;
     let id = Uuid::parse_str(&list_id).map_err(|e| e.to_string())?;
-    s.repo
-        .as_ref()
-        .unwrap()
+    repo_ref(&s)?
         .get_group_by_due_date(id)
         .map_err(|e| e.to_string())
 }
@@ -413,7 +408,7 @@ fn set_webdav_config(
     webdav_url: String,
     state: State<'_, Mutex<AppState>>,
 ) -> Result<(), String> {
-    let mut s = state.lock().unwrap();
+    let mut s = lock_state(&state)?;
     if let Some(ws) = s.config.workspaces.get_mut(&workspace_name) {
         ws.webdav_url = Some(webdav_url);
     }
@@ -477,7 +472,7 @@ async fn sync_workspace(
     // Persist last_sync timestamp to config
     {
-        let mut s = state.lock().unwrap();
+        let mut s = lock_state(&state)?;
         if let Some(ws) = s.config.workspaces.get_mut(&workspace_name) {
             ws.last_sync = Some(Utc::now());
         }
@@ -491,6 +486,10 @@ async fn sync_workspace(
 #[cfg(not(target_os = "android"))]
 fn start_watcher(handle: tauri::AppHandle, path: PathBuf) {
+    // Stop any existing watcher before starting a new one
+    if let Ok(mut w) = WATCHER.lock() {
+        *w = None;
+    }
     let handle = handle.clone();
     let debouncer = new_debouncer(
         std::time::Duration::from_millis(500),
@@ -504,16 +503,22 @@
             });
             if !has_data_change { return; }
             // Skip if we wrote recently (self-change suppression)
-            if let Some(t) = *LAST_WRITE.lock().unwrap() {
-                if t.elapsed() < std::time::Duration::from_secs(1) { return; }
+            if let Ok(guard) = LAST_WRITE.lock() {
+                if let Some(t) = *guard {
+                    if t.elapsed() < std::time::Duration::from_secs(1) { return; }
+                }
             }
             let _ = handle.emit("fs-changed", ());
         },
     );
     match debouncer {
         Ok(mut d) => {
-            let _ = d.watcher().watch(&path, notify::RecursiveMode::Recursive);
-            *WATCHER.lock().unwrap() = Some(d);
+            if let Err(e) = d.watcher().watch(&path, notify::RecursiveMode::Recursive) {
+                eprintln!("Failed to watch path {}: {e}", path.display());
+            }
+            if let Ok(mut w) = WATCHER.lock() {
+                *w = Some(d);
+            }
         }
         Err(e) => eprintln!("Failed to start file watcher: {e}"),
     }
@@ -545,7 +550,9 @@ pub fn run() {
     #[cfg(target_os = "android")]
     {
         use tauri::Manager;
-        app.path().app_data_dir().expect("Failed to get app data dir").join("config.json")
+        app.path().app_data_dir()
+            .map_err(|e| format!("Failed to get app data dir: {}", e))?
+            .join("config.json")
     }
     #[cfg(not(target_os = "android"))]
     {
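The `LAST_WRITE` check in the watcher callback implements self-change suppression: the app's own writes stamp a clock, and watcher events arriving inside the window are dropped so the UI does not reload in response to itself. A minimal sketch of that pattern (type and method names are illustrative, not from the codebase):

```rust
use std::time::{Duration, Instant};

/// Self-change suppression: our own writes record a timestamp, and
/// file-watcher events arriving within `window` of it are ignored.
struct WriteMuter {
    last_write: Option<Instant>,
    window: Duration,
}

impl WriteMuter {
    fn new(window: Duration) -> Self {
        Self { last_write: None, window }
    }

    /// Call just before writing to disk (the role of `mute_watcher` above).
    fn mute(&mut self) {
        self.last_write = Some(Instant::now());
    }

    /// Call from the watcher callback; true means "this is our own write".
    fn should_skip(&self) -> bool {
        self.last_write
            .map_or(false, |t| t.elapsed() < self.window)
    }
}

fn main() {
    let mut muter = WriteMuter::new(Duration::from_secs(1));
    assert!(!muter.should_skip()); // no write yet: external events pass through
    muter.mute();
    assert!(muter.should_skip()); // events right after our write are dropped
    println!("ok");
}
```

A time window is a heuristic: an external edit landing within a second of an app write is also suppressed. That trade-off is usually acceptable for a debounced watcher.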


@@ -23,7 +23,7 @@
       }
     ],
     "security": {
-      "csp": null
+      "csp": "default-src 'self'; style-src 'self' 'unsafe-inline'; font-src 'self' https://fonts.gstatic.com; connect-src ipc: http://ipc.localhost; object-src 'none'; base-uri 'self'"
     }
   },
   "bundle": {


@@ -45,6 +45,13 @@ body {
 }

 /* Scrollbar styling */
+* {
+  scrollbar-width: thin;
+  scrollbar-color: #d1d5db transparent;
+}
+
+.dark * {
+  scrollbar-color: #4b5563 transparent;
+}
 ::-webkit-scrollbar {
   width: 4px;
 }


@@ -34,9 +34,8 @@
 function debouncedSave(fields: Partial<Task>) {
   clearTimeout(saveTimer);
-  var snapshot = { ...task };
   saveTimer = setTimeout(() => {
-    app.updateTask({ ...snapshot, ...fields, updated_at: new Date().toISOString() });
+    app.updateTask({ ...task, ...fields, updated_at: new Date().toISOString() });
   }, 400);
 }


@@ -19,6 +19,7 @@
 let isCompleted = $derived(task.status === "completed");
 let justChecked = $state(false);
+let toggling = $state(false);

 $effect(() => {
   const _ = task.status;
@@ -35,6 +36,8 @@
 async function handleToggle(e: MouseEvent) {
   e.stopPropagation();
+  if (toggling) return;
+  toggling = true;
   justChecked = true;
   await new Promise((r) => setTimeout(r, 300));
   transitioning = true;
@@ -42,6 +45,7 @@
   await new Promise((r) => setTimeout(r, 200));
   justChecked = false;
   await app.toggleTask(task.id);
+  toggling = false;
 }

 function handleTouchStart(e: TouchEvent) {
@@ -57,14 +61,15 @@
 }

 function handleTouchEnd() {
-  if (Math.abs(swipeX) > 100) {
+  if (Math.abs(swipeX) > 100 && !toggling) {
     swipeX = 0;
     swiping = false;
+    toggling = true;
     justChecked = true;
     setTimeout(() => {
       transitioning = true;
       animateInIds.add(task.id);
-      setTimeout(() => { justChecked = false; app.toggleTask(task.id); }, 200);
+      setTimeout(() => { justChecked = false; app.toggleTask(task.id).finally(() => { toggling = false; }); }, 200);
     }, 300);
     return;
   }
@@ -106,15 +111,16 @@
 {/if}

 <!-- Task content -->
-<button
-  class="group flex w-full items-start gap-3 bg-surface-light py-3 pr-4 text-left hover:bg-black/5 dark:bg-surface-dark dark:hover:bg-white/5"
+<!-- svelte-ignore a11y_no_static_element_interactions -->
+<div
+  class="group flex w-full cursor-pointer items-start gap-3 bg-surface-light py-3 pr-4 text-left hover:bg-black/5 dark:bg-surface-dark dark:hover:bg-white/5"
   style="padding-left: {1 + depth * 1.5}rem; transform: translateX({swipeX}px); transition: {swiping ? 'none' : 'transform 0.2s ease-out'}"
   onclick={() => onopen?.(task)}
 >
   <!-- Checkbox -->
-  <!-- svelte-ignore a11y_no_static_element_interactions -->
-  <div
+  <button
     onclick={handleToggle}
+    aria-label={isCompleted ? "Restore task" : "Complete task"}
     class="-m-2 flex shrink-0 items-center justify-center p-2"
   >
     <div
@@ -131,7 +137,7 @@
       </svg>
     {/if}
     </div>
-  </div>
+  </button>

   <!-- Content -->
   <div class="min-w-0 flex-1">
@@ -160,7 +166,7 @@
   <svg class="mt-1 h-4 w-4 shrink-0 opacity-0 transition-opacity group-hover:opacity-30" viewBox="0 0 20 20" fill="currentColor">
     <path fill-rule="evenodd" d="M7.21 14.77a.75.75 0 01.02-1.06L11.168 10 7.23 6.29a.75.75 0 111.04-1.08l4.5 4.25a.75.75 0 010 1.08l-4.5 4.25a.75.75 0 01-1.06-.02z" />
   </svg>
-</button>
+</div>
 </div>
 </div>
 </div>


@@ -44,7 +44,7 @@
   showWorkspacePicker = false;
   if (showListMenu && listMenuEl && !listMenuEl.contains(e.target as Node))
     showListMenu = false;
-  var target = e.target as HTMLElement;
+  const target = e.target as HTMLElement;
   if (wsMenuName && !target.closest("[data-ws-menu]")) wsMenuName = null;
 }
@@ -58,6 +58,7 @@
 let listMenuEl = $state<HTMLDivElement | null>(null);
 let confirmDeleteList = $state(false);
 let confirmDeleteCompleted = $state(false);
+let confirmRemoveWorkspace = $state<string | null>(null);
 let dragId = $state<string | null>(null);
 let dragOverId = $state<string | null>(null);
 let resizing = $state(false);
@@ -270,7 +271,7 @@
 {#if wsMenuName === name}
   <div class="absolute right-0 top-full z-40 mt-1 min-w-[140px] rounded-lg border border-border-light bg-surface-light py-1 shadow-lg dark:border-border-dark dark:bg-surface-dark">
     <button
-      onclick={() => { wsMenuName = null; if (confirm(`Remove workspace "${name}"? (Files remain on disk)`)) app.removeWorkspace(name); }}
+      onclick={() => { wsMenuName = null; confirmRemoveWorkspace = name; }}
       class="flex w-full items-center gap-2 px-3 py-2 text-left text-sm text-danger hover:bg-black/5 dark:hover:bg-white/10"
     >
       <svg class="h-4 w-4" viewBox="0 0 20 20" fill="currentColor">
@@ -651,6 +652,18 @@
   />
 {/if}

+<!-- Remove workspace confirmation -->
+{#if confirmRemoveWorkspace}
+  <ConfirmDialog
+    message='Remove workspace "{confirmRemoveWorkspace}"?'
+    detail="Files remain on disk."
+    confirmText="Remove"
+    danger
+    onconfirm={() => { const name = confirmRemoveWorkspace; confirmRemoveWorkspace = null; if (name) app.removeWorkspace(name); }}
+    oncancel={() => (confirmRemoveWorkspace = null)}
+  />
+{/if}
+
 <!-- Delete completed tasks confirmation -->
 {#if confirmDeleteCompleted}
   <ConfirmDialog


@@ -8,7 +8,7 @@ import type {
   SyncResult,
 } from "../types";

-// Listen for file system changes from the backend watcher
+// Listen for file system changes from the backend watcher.
 listen("fs-changed", () => {
   loadLists();
 });
@@ -79,7 +79,7 @@ async function addWorkspace(name: string, path: string) {
   await invoke("add_workspace", { name, path });
   config = await invoke<AppConfig>("get_config");
   await loadLists();
-  invoke("watch_workspace", { path }).catch(() => {});
+  invoke("watch_workspace", { path }).catch((e) => console.warn("File watcher failed:", e));
   screen = "tasks";
   error = null;
 } catch (e) {
@@ -94,7 +94,7 @@ async function switchWorkspace(name: string) {
   activeListId = null;
   await loadLists();
   const ws = config?.workspaces[name];
-  if (ws) invoke("watch_workspace", { path: ws.path }).catch(() => {});
+  if (ws) invoke("watch_workspace", { path: ws.path }).catch((e) => console.warn("File watcher failed:", e));
   error = null;
 } catch (e) {
   error = String(e);
@@ -196,7 +196,7 @@ async function toggleTask(taskId: string) {
   // Move to top of list locally, then persist order in background
   if (updated.status === "backlog") {
     tasks = [updated, ...tasks.filter((t) => t.id !== taskId)];
-    invoke("reorder_task", { listId: activeListId, taskId, newPosition: 0 }).catch(() => {});
+    invoke("reorder_task", { listId: activeListId, taskId, newPosition: 0 }).catch((e) => { error = String(e); });
   } else {
     tasks = tasks.map((t) => (t.id === taskId ? updated : t));
   }


@ -1,22 +0,0 @@
[package]
name = "bevy-tasks-cli"
version = "0.1.0"
edition = "2021"
description = "CLI frontend for Bevy Tasks, a local-first task management app"
license = "GPL-3.0-or-later"
repository = "https://github.com/SteelDynamite/bevy-tasks"
[[bin]]
name = "bevy-tasks"
path = "src/main.rs"
[dependencies]
bevy-tasks-core = { path = "../bevy-tasks-core" }
clap = { version = "4.5", features = ["derive", "env"] }
colored = "2.0"
anyhow = { workspace = true }
chrono = { workspace = true }
uuid = { workspace = true }
fs_extra = "1.3"
tokio = { workspace = true }
rpassword = "5.0"


@@ -1,39 +0,0 @@
use anyhow::{Context, Result};
use crate::output;
use crate::commands::get_repository;
pub fn enable(list_name: String, workspace: Option<String>) -> Result<()> {
let (mut repo, _workspace_name) = get_repository(workspace)?;
let lists = repo.get_lists()
.context("Failed to get lists")?;
let list = lists.iter()
.find(|l| l.title == list_name)
.ok_or_else(|| anyhow::anyhow!("List '{}' not found", list_name))?;
repo.set_group_by_due_date(list.id, true)
.context("Failed to enable grouping")?;
output::success(&format!("Enabled group-by-due-date for list \"{}\"", list_name));
Ok(())
}
pub fn disable(list_name: String, workspace: Option<String>) -> Result<()> {
let (mut repo, _workspace_name) = get_repository(workspace)?;
let lists = repo.get_lists()
.context("Failed to get lists")?;
let list = lists.iter()
.find(|l| l.title == list_name)
.ok_or_else(|| anyhow::anyhow!("List '{}' not found", list_name))?;
repo.set_group_by_due_date(list.id, false)
.context("Failed to disable grouping")?;
output::success(&format!("Disabled group-by-due-date for list \"{}\"", list_name));
Ok(())
}

View file

@ -1,43 +0,0 @@
use anyhow::{Context, Result};
use bevy_tasks_core::{AppConfig, TaskRepository, WorkspaceConfig};
use std::path::PathBuf;
use crate::output;
pub fn execute(path: String, name: String) -> Result<()> {
let path_buf = PathBuf::from(path);
let path_buf = if path_buf.is_relative() {
std::env::current_dir()?.join(path_buf)
} else {
path_buf
};
// Initialize the repository
let mut repo = TaskRepository::init(path_buf.clone())
.context("Failed to initialize tasks folder")?;
// Create default list if it doesn't exist
let lists = repo.get_lists().context("Failed to get lists")?;
if !lists.iter().any(|l| l.title == "My Tasks") {
repo.create_list("My Tasks".to_string())
.context("Failed to create default list")?;
}
// Load or create config
let config_path = AppConfig::get_config_path();
let mut config = AppConfig::load_from_file(&config_path)
.unwrap_or_else(|_| AppConfig::new());
// Add workspace
config.add_workspace(name.clone(), WorkspaceConfig::new(path_buf.clone()));
config.set_current_workspace(name.clone())?;
// Save config
config.save_to_file(&config_path)
.context("Failed to save config")?;
output::success(&format!("Initialized workspace \"{}\" at {}", name, path_buf.display()));
output::success("Created default list \"My Tasks\"");
output::success(&format!("Set \"{}\" as current workspace", name));
Ok(())
}

View file

@ -1,91 +0,0 @@
use anyhow::{Context, Result};
use colored::*;
use bevy_tasks_core::{Task, TaskStatus};
use crate::output;
use crate::commands::get_repository;
fn print_tasks(tasks: &[Task]) {
if tasks.is_empty() {
output::item("No tasks");
return;
}
for task in tasks {
let checkbox = if task.status == TaskStatus::Completed { "[✓]".green() } else { "[ ]".normal() };
let due_str = task.due_date.map(|d| format!(" (due: {})", d.format("%Y-%m-%d")).yellow().to_string()).unwrap_or_default();
output::item(&format!("{} {}{} {}", checkbox, task.title, due_str, task.id.to_string().dimmed()));
}
}
pub fn create(name: String, workspace: Option<String>) -> Result<()> {
let (mut repo, _workspace_name) = get_repository(workspace)?;
repo.create_list(name.clone())
.context("Failed to create list")?;
output::success(&format!("Created list \"{}\"", name));
Ok(())
}
pub fn show(list_name: Option<String>, workspace: Option<String>) -> Result<()> {
let (repo, _workspace_name) = get_repository(workspace)?;
let lists = repo.get_lists()
.context("Failed to get lists")?;
if lists.is_empty() {
output::info("No lists found. Create one with 'bevy-tasks list create <name>'");
return Ok(());
}
// If a specific list is requested, show only that one
if let Some(name) = list_name {
let list = lists.iter()
.find(|l| l.title == name)
.ok_or_else(|| anyhow::anyhow!("List '{}' not found", name))?;
output::header(&format!("{} ({})", list.title, format!("{} tasks", list.tasks.len()).dimmed()));
print_tasks(&list.tasks);
} else {
// Show all lists
for list in &lists {
output::header(&format!("{} ({})", list.title, format!("{} tasks", list.tasks.len()).dimmed()));
print_tasks(&list.tasks);
output::blank();
}
}
Ok(())
}
pub fn delete(name: String, workspace: Option<String>) -> Result<()> {
let (mut repo, _workspace_name) = get_repository(workspace)?;
let lists = repo.get_lists()
.context("Failed to get lists")?;
let list = lists.iter()
.find(|l| l.title == name)
.ok_or_else(|| anyhow::anyhow!("List '{}' not found", name))?;
// Confirm
output::warning(&format!("This will delete list \"{}\" and all its tasks", name));
print!("Continue? (y/n): ");
use std::io::{self, Write};
io::stdout().flush()?;
let mut input = String::new();
io::stdin().read_line(&mut input)?;
if input.trim().to_lowercase() != "y" {
output::info("Cancelled");
return Ok(());
}
repo.delete_list(list.id)
.context("Failed to delete list")?;
output::success(&format!("Deleted list \"{}\"", name));
Ok(())
}

View file

@ -1,43 +0,0 @@
pub mod init;
pub mod workspace;
pub mod list;
pub mod task;
pub mod group;
pub mod sync;
use bevy_tasks_core::{AppConfig, TaskRepository};
use anyhow::{Context, Result};
use std::path::PathBuf;
pub fn get_config_path() -> PathBuf {
AppConfig::get_config_path()
}
pub fn load_config() -> Result<AppConfig> {
let path = get_config_path();
AppConfig::load_from_file(&path).context("Failed to load config")
}
pub fn save_config(config: &AppConfig) -> Result<()> {
let path = get_config_path();
config.save_to_file(&path).context("Failed to save config")
}
pub fn get_repository(workspace_name: Option<String>) -> Result<(TaskRepository, String)> {
let config = load_config()?;
let (name, workspace_config) = if let Some(name) = workspace_name {
let workspace_config = config.get_workspace(&name)
.ok_or_else(|| anyhow::anyhow!("Workspace '{}' not found", name))?;
(name, workspace_config.clone())
} else {
let (name, workspace_config) = config.get_current_workspace()
.context("No workspace set. Use 'bevy-tasks init' to create one.")?;
(name.clone(), workspace_config.clone())
};
let repo = TaskRepository::new(workspace_config.path.clone())
.context(format!("Failed to open workspace '{}'", name))?;
Ok((repo, name))
}

View file

@ -1,229 +0,0 @@
use anyhow::{Context, Result};
use colored::Colorize;
use bevy_tasks_core::sync::{SyncMode, sync_workspace, get_sync_status};
use bevy_tasks_core::webdav::{WebDavClient, store_credentials, load_credentials};
use crate::output;
use super::{load_config, save_config};
/// Run sync setup: prompt for URL, username, password, test connection, store credentials.
pub fn setup(workspace_name: Option<String>) -> Result<()> {
let mut config = load_config()?;
let (name, workspace) = if let Some(name) = workspace_name {
let ws = config.get_workspace(&name)
.ok_or_else(|| anyhow::anyhow!("Workspace '{}' not found", name))?
.clone();
(name, ws)
} else {
let (n, ws) = config.get_current_workspace()
.context("No workspace set. Use 'bevy-tasks init' to create one.")?;
(n.clone(), ws.clone())
};
// Prompt for WebDAV URL
output::header(&format!("WebDAV sync setup for workspace \"{}\"", name.green()));
output::blank();
let url = prompt("WebDAV URL: ")?;
if url.is_empty() {
output::error("URL cannot be empty");
return Ok(());
}
let username = prompt("Username: ")?;
let password = rpassword::read_password_from_tty(Some("Password: "))
.context("Failed to read password")?;
// Test connection
output::blank();
output::info("Testing connection...");
let rt = tokio::runtime::Runtime::new().context("Failed to create async runtime")?;
let client = WebDavClient::new(&url, &username, &password);
match rt.block_on(client.test_connection()) {
Ok(()) => {
output::success("Connection successful!");
}
Err(e) => {
output::error(&format!("Connection failed: {}", e));
return Ok(());
}
}
// Store credentials in keychain
let domain = extract_domain(&url);
match store_credentials(&domain, &username, &password) {
Ok(()) => output::info("Credentials stored in system keychain"),
Err(e) => {
output::warning(&format!(
"Could not store in keychain ({}). Set BEVY_TASKS_WEBDAV_USER and BEVY_TASKS_WEBDAV_PASS env vars instead.",
e
));
}
}
// Update workspace config with WebDAV URL
let mut ws = workspace;
ws.webdav_url = Some(url);
config.add_workspace(name, ws);
save_config(&config)?;
output::success("Sync setup complete. Run 'bevy-tasks sync' to sync.");
Ok(())
}
/// Execute a sync operation.
pub fn execute(mode: SyncMode, workspace_name: Option<String>) -> Result<()> {
let config = load_config()?;
let (name, workspace) = if let Some(name) = workspace_name {
let ws = config.get_workspace(&name)
.ok_or_else(|| anyhow::anyhow!("Workspace '{}' not found", name))?
.clone();
(name, ws)
} else {
let (n, ws) = config.get_current_workspace()
.context("No workspace set. Use 'bevy-tasks init' to create one.")?;
(n.clone(), ws.clone())
};
let url = workspace.webdav_url.as_ref()
.ok_or_else(|| anyhow::anyhow!(
"No WebDAV URL configured for workspace '{}'. Run 'bevy-tasks sync --setup' first.", name
))?;
let domain = extract_domain(url);
let (username, password) = load_credentials(&domain)
.context("Failed to load credentials")?;
let mode_str = match mode {
SyncMode::Full => "Syncing",
SyncMode::Push => "Pushing",
SyncMode::Pull => "Pulling",
};
output::info(&format!("{} workspace \"{}\"...", mode_str, name.green()));
let rt = tokio::runtime::Runtime::new().context("Failed to create async runtime")?;
let result = rt.block_on(sync_workspace(
&workspace.path,
url,
&username,
&password,
mode,
Some(Box::new(|msg: &str| { println!("{}", msg); })),
)).context("Sync failed")?;
// Print summary
let mut parts = Vec::new();
if result.uploaded > 0 { parts.push(format!("{} uploaded", result.uploaded)); }
if result.downloaded > 0 { parts.push(format!("{} downloaded", result.downloaded)); }
if result.deleted_local > 0 { parts.push(format!("{} deleted locally", result.deleted_local)); }
if result.deleted_remote > 0 { parts.push(format!("{} deleted remotely", result.deleted_remote)); }
if result.conflicts > 0 { parts.push(format!("{} conflicts", result.conflicts)); }
if parts.is_empty() {
output::success("Already in sync, nothing to do.");
} else {
let summary = parts.join(", ");
if result.errors.is_empty() {
output::success(&format!("Sync complete: {}", summary));
} else {
output::warning(&format!("Sync complete with errors: {}", summary));
for err in &result.errors {
output::error(err);
}
}
}
Ok(())
}
/// Show sync status for a workspace.
pub fn status(workspace_name: Option<String>, all: bool) -> Result<()> {
let config = load_config()?;
if all {
// Show status for all workspaces that have sync configured
let mut found_any = false;
let mut names: Vec<_> = config.workspaces.keys().cloned().collect();
names.sort();
for name in names {
let ws = config.get_workspace(&name).unwrap();
if ws.webdav_url.is_some() {
found_any = true;
print_workspace_status(&name, &ws.path, ws.webdav_url.as_deref())?;
output::blank();
}
}
if !found_any {
output::info("No workspaces have sync configured. Run 'bevy-tasks sync --setup' to set up.");
}
return Ok(());
}
let (name, workspace) = if let Some(name) = workspace_name {
let ws = config.get_workspace(&name)
.ok_or_else(|| anyhow::anyhow!("Workspace '{}' not found", name))?
.clone();
(name, ws)
} else {
let (n, ws) = config.get_current_workspace()
.context("No workspace set.")?;
(n.clone(), ws.clone())
};
print_workspace_status(&name, &workspace.path, workspace.webdav_url.as_deref())?;
Ok(())
}
fn print_workspace_status(name: &str, path: &std::path::Path, webdav_url: Option<&str>) -> Result<()> {
output::header(&format!("Workspace: {}", name.green()));
if let Some(url) = webdav_url {
output::detail("WebDAV URL", url);
} else {
output::detail("WebDAV", &"not configured".dimmed().to_string());
return Ok(());
}
let info = get_sync_status(path)?;
if let Some(last) = info.last_sync {
output::detail("Last sync", &last.format("%Y-%m-%d %H:%M:%S UTC").to_string());
} else {
output::detail("Last sync", &"never".dimmed().to_string());
}
output::detail("Tracked files", &info.tracked_files.to_string());
output::detail("Pending changes", &info.pending_changes.to_string());
if info.queued_operations > 0 {
output::detail("Queued operations", &format!("{}", info.queued_operations).yellow().to_string());
}
Ok(())
}
/// Extract domain from a URL for credential storage.
fn extract_domain(url: &str) -> String {
url.split("://")
.nth(1)
.unwrap_or(url)
.split('/')
.next()
.unwrap_or(url)
.split(':')
.next()
.unwrap_or(url)
.to_string()
}
/// Prompt the user for text input.
fn prompt(message: &str) -> Result<String> {
use std::io::Write;
print!("{}", message);
std::io::stdout().flush()?;
let mut input = String::new();
std::io::stdin().read_line(&mut input)?;
Ok(input.trim().to_string())
}

View file

@ -1,222 +0,0 @@
use anyhow::{Context, Result};
use bevy_tasks_core::Task;
use chrono::{DateTime, Utc};
use uuid::Uuid;
use crate::output;
use crate::commands::get_repository;
pub fn add(title: String, list_name: Option<String>, due_str: Option<String>, workspace: Option<String>) -> Result<()> {
let (mut repo, _workspace_name) = get_repository(workspace)?;
// Get lists
let lists = repo.get_lists()
.context("Failed to get lists")?;
if lists.is_empty() {
anyhow::bail!("No lists found. Create one with 'bevy-tasks list create <name>'");
}
// Find the target list
let list = if let Some(name) = list_name {
lists.iter()
.find(|l| l.title == name)
.ok_or_else(|| anyhow::anyhow!("List '{}' not found", name))?
} else {
// Use the first list
&lists[0]
};
// Create task
let mut task = Task::new(title.clone());
// Parse due date if provided
if let Some(due_str) = due_str {
let due_date = parse_due_date(&due_str)?;
task.due_date = Some(due_date);
}
// Save task
repo.create_task(list.id, task.clone())
.context("Failed to create task")?;
let due_info = if let Some(due) = task.due_date {
format!("\n Due: {}", due.format("%Y-%m-%d"))
} else {
String::new()
};
output::success(&format!("Created task \"{}\" ({}){}", title, task.id, due_info));
Ok(())
}
pub fn complete(task_id_str: String, workspace: Option<String>) -> Result<()> {
let (mut repo, _workspace_name) = get_repository(workspace)?;
let task_id = Uuid::parse_str(&task_id_str)
.context("Invalid task ID")?;
// Find the task across all lists
let lists = repo.get_lists()?;
let mut found = false;
for list in lists {
if let Some(mut task) = list.tasks.iter().find(|t| t.id == task_id).cloned() {
task.complete();
repo.update_task(list.id, task.clone())
.context("Failed to update task")?;
output::success(&format!("Completed task \"{}\"", task.title));
found = true;
break;
}
}
if !found {
anyhow::bail!("Task not found: {}", task_id_str);
}
Ok(())
}
pub fn delete(task_id_str: String, workspace: Option<String>) -> Result<()> {
let (mut repo, _workspace_name) = get_repository(workspace)?;
let task_id = Uuid::parse_str(&task_id_str)
.context("Invalid task ID")?;
// Find the task across all lists
let lists = repo.get_lists()?;
let mut found = false;
for list in lists {
if let Some(task) = list.tasks.iter().find(|t| t.id == task_id) {
let title = task.title.clone();
output::warning(&format!("This will delete task \"{}\"", title));
print!("Continue? (y/n): ");
use std::io::{self, Write};
io::stdout().flush()?;
let mut input = String::new();
io::stdin().read_line(&mut input)?;
if input.trim().to_lowercase() != "y" {
output::info("Cancelled");
return Ok(());
}
repo.delete_task(list.id, task_id)
.context("Failed to delete task")?;
output::success(&format!("Deleted task \"{}\"", title));
found = true;
break;
}
}
if !found {
anyhow::bail!("Task not found: {}", task_id_str);
}
Ok(())
}
pub fn edit(task_id_str: String, workspace: Option<String>) -> Result<()> {
let (mut repo, _workspace_name) = get_repository(workspace)?;
let task_id = Uuid::parse_str(&task_id_str)
.context("Invalid task ID")?;
// Find the task across all lists
let lists = repo.get_lists()?;
let mut task_list_id = None;
let mut task_to_edit = None;
for list in lists {
if let Some(task) = list.tasks.iter().find(|t| t.id == task_id).cloned() {
task_list_id = Some(list.id);
task_to_edit = Some(task);
break;
}
}
let (list_id, task) = match (task_list_id, task_to_edit) {
(Some(lid), Some(t)) => (lid, t),
_ => anyhow::bail!("Task not found: {}", task_id_str),
};
// Create temporary file with task content
let temp_dir = std::env::temp_dir();
let temp_file = temp_dir.join(format!("bevy-tasks-{}.md", task.id));
// Write current task content to temp file
let content = format!("# {}\n\n{}", task.title, task.description);
std::fs::write(&temp_file, content)?;
// Get editor from environment
let editor = std::env::var("EDITOR").unwrap_or_else(|_| {
if cfg!(windows) {
"notepad".to_string()
} else {
"nano".to_string()
}
});
// Open editor
let status = std::process::Command::new(&editor)
.arg(&temp_file)
.status()
.context(format!("Failed to open editor: {}", editor))?;
if !status.success() {
anyhow::bail!("Editor exited with non-zero status");
}
// Read updated content
let updated_content = std::fs::read_to_string(&temp_file)?;
// Parse the content
let lines: Vec<&str> = updated_content.lines().collect();
let (title, description) = if !lines.is_empty() && lines[0].starts_with("# ") {
let title = lines[0].trim_start_matches("# ").trim().to_string();
let description = if lines.len() > 2 {
lines[2..].join("\n").trim().to_string()
} else {
String::new()
};
(title, description)
} else {
(task.title.clone(), updated_content.trim().to_string())
};
// Update task
let mut updated_task = task.clone();
updated_task.title = title;
updated_task.description = description;
updated_task.updated_at = Utc::now();
repo.update_task(list_id, updated_task.clone())
.context("Failed to update task")?;
// Clean up temp file
std::fs::remove_file(&temp_file).ok();
output::success(&format!("Updated task \"{}\"", updated_task.title));
Ok(())
}
fn parse_due_date(s: &str) -> Result<DateTime<Utc>> {
// Try parsing as date only (YYYY-MM-DD)
if let Ok(naive_date) = chrono::NaiveDate::parse_from_str(s, "%Y-%m-%d") {
let naive_datetime = naive_date.and_hms_opt(0, 0, 0)
.ok_or_else(|| anyhow::anyhow!("Invalid date"))?;
return Ok(DateTime::from_naive_utc_and_offset(naive_datetime, Utc));
}
// Try parsing as full datetime (ISO 8601)
if let Ok(dt) = DateTime::parse_from_rfc3339(s) {
return Ok(dt.with_timezone(&Utc));
}
anyhow::bail!("Invalid date format. Use YYYY-MM-DD or ISO 8601 format (YYYY-MM-DDTHH:MM:SS)")
}

View file

@ -1,209 +0,0 @@
use anyhow::{Context, Result};
use bevy_tasks_core::{TaskRepository, WorkspaceConfig};
use std::path::PathBuf;
use colored::*;
use crate::output;
use crate::commands::{load_config, save_config};
pub fn add(name: String, path: String) -> Result<()> {
let path_buf = PathBuf::from(path);
let path_buf = if path_buf.is_relative() {
std::env::current_dir()?.join(path_buf)
} else {
path_buf
};
// Initialize the repository
let mut repo = TaskRepository::init(path_buf.clone())
.context("Failed to initialize tasks folder")?;
// Create default list if it doesn't exist
let lists = repo.get_lists().context("Failed to get lists")?;
if !lists.iter().any(|l| l.title == "My Tasks") {
repo.create_list("My Tasks".to_string())
.context("Failed to create default list")?;
}
// Load config
let mut config = load_config()?;
// Check if workspace already exists
if config.get_workspace(&name).is_some() {
anyhow::bail!("Workspace '{}' already exists", name);
}
// Add workspace
config.add_workspace(name.clone(), WorkspaceConfig::new(path_buf.clone()));
// Save config
save_config(&config)?;
output::success(&format!("Added workspace \"{}\" at {}", name, path_buf.display()));
output::success("Created default list \"My Tasks\"");
Ok(())
}
pub fn list() -> Result<()> {
let config = load_config()?;
if config.workspaces.is_empty() {
output::info("No workspaces configured. Use 'bevy-tasks init' to create one.");
return Ok(());
}
let current = config.current_workspace.as_deref();
let mut workspaces: Vec<_> = config.workspaces.iter().collect();
workspaces.sort_by(|a, b| a.0.cmp(b.0));
for (name, workspace_config) in workspaces {
let marker = if Some(name.as_str()) == current {
" (current)".green()
} else {
"".normal()
};
output::item(&format!("{}: {}{}", name, workspace_config.path.display(), marker));
}
Ok(())
}
pub fn switch(name: String) -> Result<()> {
let mut config = load_config()?;
// Verify workspace exists
if config.get_workspace(&name).is_none() {
anyhow::bail!("Workspace '{}' not found", name);
}
config.set_current_workspace(name.clone())?;
save_config(&config)?;
output::success(&format!("Switched to workspace \"{}\"", name));
Ok(())
}
pub fn remove(name: String) -> Result<()> {
let mut config = load_config()?;
// Verify workspace exists
if config.get_workspace(&name).is_none() {
anyhow::bail!("Workspace '{}' not found", name);
}
// Confirm
output::warning("This will delete workspace config (files remain on disk)");
print!("Continue? (y/n): ");
use std::io::{self, Write};
io::stdout().flush()?;
let mut input = String::new();
io::stdin().read_line(&mut input)?;
if input.trim().to_lowercase() != "y" {
output::info("Cancelled");
return Ok(());
}
config.remove_workspace(&name);
save_config(&config)?;
output::success(&format!("Removed workspace \"{}\"", name));
Ok(())
}
pub fn retarget(name: String, path: String) -> Result<()> {
let path_buf = PathBuf::from(path);
let path_buf = if path_buf.is_relative() {
std::env::current_dir()?.join(path_buf)
} else {
path_buf
};
let mut config = load_config()?;
// Verify workspace exists
if config.get_workspace(&name).is_none() {
anyhow::bail!("Workspace '{}' not found", name);
}
// Update path
config.add_workspace(name.clone(), WorkspaceConfig::new(path_buf.clone()));
save_config(&config)?;
output::success(&format!("Workspace \"{}\" now points to {}", name, path_buf.display()));
Ok(())
}
pub fn migrate(name: String, new_path: String) -> Result<()> {
let new_path_buf = PathBuf::from(new_path);
let new_path_buf = if new_path_buf.is_relative() {
std::env::current_dir()?.join(new_path_buf)
} else {
new_path_buf
};
let mut config = load_config()?;
// Get current workspace config
let old_path = config.get_workspace(&name)
.ok_or_else(|| anyhow::anyhow!("Workspace '{}' not found", name))?
.path.clone();
// Confirm
output::warning(&format!("This will move all files from {} to {}", old_path.display(), new_path_buf.display()));
print!("Continue? (y/n): ");
use std::io::{self, Write};
io::stdout().flush()?;
let mut input = String::new();
io::stdin().read_line(&mut input)?;
if input.trim().to_lowercase() != "y" {
output::info("Cancelled");
return Ok(());
}
// Create destination directory
std::fs::create_dir_all(&new_path_buf)?;
// Move files
output::info("Moving files...");
let entries = std::fs::read_dir(&old_path)?;
let mut count = 0;
for entry in entries {
let entry = entry?;
let file_name = entry.file_name();
let dest = new_path_buf.join(&file_name);
if entry.path().is_dir() {
let mut options = fs_extra::dir::CopyOptions::new();
options.copy_inside = true;
fs_extra::dir::move_dir(entry.path(), &new_path_buf, &options)?;
output::item(&format!("Moved {}/", file_name.to_string_lossy()));
} else {
std::fs::rename(entry.path(), dest)?;
output::item(&format!("Moved {}", file_name.to_string_lossy()));
}
count += 1;
}
// Remove old directory if empty
if old_path.read_dir()?.next().is_none() {
std::fs::remove_dir(&old_path)?;
}
// Update config
config.add_workspace(name.clone(), WorkspaceConfig::new(new_path_buf.clone()));
save_config(&config)?;
output::success(&format!("Migrated {} items to {}", count, new_path_buf.display()));
output::success(&format!("Workspace \"{}\" now points to {}", name, new_path_buf.display()));
Ok(())
}

View file

@ -1,277 +0,0 @@
mod commands;
mod output;
use anyhow::Result;
use clap::{Parser, Subcommand};
use commands::*;
#[derive(Parser)]
#[command(name = "bevy-tasks")]
#[command(about = "A local-first, cross-platform tasks application", long_about = None)]
struct Cli {
#[command(subcommand)]
command: Commands,
}
#[derive(Subcommand)]
enum Commands {
/// Initialize a new workspace
Init {
/// Path to store tasks
path: String,
/// Name of the workspace
#[arg(short, long)]
name: String,
},
/// Manage workspaces
#[command(subcommand)]
Workspace(WorkspaceCommands),
/// Manage task lists
#[command(subcommand)]
List(ListCommands),
/// Add a new task
Add {
/// Task title
title: String,
/// List to add task to
#[arg(short, long)]
list: Option<String>,
/// Due date (ISO 8601 format: YYYY-MM-DD or YYYY-MM-DDTHH:MM:SS)
#[arg(short, long)]
due: Option<String>,
/// Workspace to use
#[arg(short, long)]
workspace: Option<String>,
},
/// Mark a task as complete
Complete {
/// Task ID
task_id: String,
/// Workspace to use
#[arg(short, long)]
workspace: Option<String>,
},
/// Delete a task
Delete {
/// Task ID
task_id: String,
/// Workspace to use
#[arg(short, long)]
workspace: Option<String>,
},
/// Edit a task
Edit {
/// Task ID
task_id: String,
/// Workspace to use
#[arg(short, long)]
workspace: Option<String>,
},
/// Toggle group-by-due-date for a list
#[command(subcommand)]
Group(GroupCommands),
/// Sync workspace with WebDAV server
Sync {
/// Run initial setup (URL, credentials)
#[arg(long)]
setup: bool,
/// Push-only sync (upload local changes)
#[arg(long, conflicts_with_all = ["pull", "setup", "status"])]
push: bool,
/// Pull-only sync (download remote changes)
#[arg(long, conflicts_with_all = ["push", "setup", "status"])]
pull: bool,
/// Show sync status
#[arg(long, conflicts_with_all = ["push", "pull", "setup"])]
status: bool,
/// Show status for all workspaces (with --status)
#[arg(long, requires = "status")]
all: bool,
/// Workspace to use
#[arg(short, long)]
workspace: Option<String>,
},
}
#[derive(Subcommand)]
enum WorkspaceCommands {
/// Add a new workspace
Add {
/// Name of the workspace
name: String,
/// Path to store tasks
path: String,
},
/// List all workspaces
List,
/// Switch to a different workspace
Switch {
/// Name of the workspace
name: String,
},
/// Remove a workspace
Remove {
/// Name of the workspace
name: String,
},
/// Update workspace path without moving files
Retarget {
/// Name of the workspace
name: String,
/// New path
path: String,
},
/// Move workspace files to a new location
Migrate {
/// Name of the workspace
name: String,
/// New path
path: String,
},
}
#[derive(Subcommand)]
enum ListCommands {
/// Create a new task list
Create {
/// Name of the list
name: String,
/// Workspace to use
#[arg(short, long)]
workspace: Option<String>,
},
/// Show all tasks (or tasks in a specific list)
Show {
/// Name of the list to show
#[arg(short, long)]
list: Option<String>,
/// Workspace to use
#[arg(short, long)]
workspace: Option<String>,
},
/// Delete a task list
Delete {
/// Name of the list to delete
name: String,
/// Workspace to use
#[arg(short, long)]
workspace: Option<String>,
},
}
#[derive(Subcommand)]
enum GroupCommands {
/// Enable group-by-due-date for a list
Enable {
/// Name of the list
#[arg(short, long)]
list: String,
/// Workspace to use
#[arg(short, long)]
workspace: Option<String>,
},
/// Disable group-by-due-date for a list
Disable {
/// Name of the list
#[arg(short, long)]
list: String,
/// Workspace to use
#[arg(short, long)]
workspace: Option<String>,
},
}
fn main() -> Result<()> {
let cli = Cli::parse();
match cli.command {
Commands::Init { path, name } => {
init::execute(path, name)?;
}
Commands::Workspace(cmd) => match cmd {
WorkspaceCommands::Add { name, path } => {
workspace::add(name, path)?;
}
WorkspaceCommands::List => {
workspace::list()?;
}
WorkspaceCommands::Switch { name } => {
workspace::switch(name)?;
}
WorkspaceCommands::Remove { name } => {
workspace::remove(name)?;
}
WorkspaceCommands::Retarget { name, path } => {
workspace::retarget(name, path)?;
}
WorkspaceCommands::Migrate { name, path } => {
workspace::migrate(name, path)?;
}
},
Commands::List(cmd) => match cmd {
ListCommands::Create { name, workspace } => {
list::create(name, workspace)?;
}
ListCommands::Show { list, workspace } => {
list::show(list, workspace)?;
}
ListCommands::Delete { name, workspace } => {
list::delete(name, workspace)?;
}
},
Commands::Add { title, list, due, workspace } => {
task::add(title, list, due, workspace)?;
}
Commands::Complete { task_id, workspace } => {
task::complete(task_id, workspace)?;
}
Commands::Delete { task_id, workspace } => {
task::delete(task_id, workspace)?;
}
Commands::Edit { task_id, workspace } => {
task::edit(task_id, workspace)?;
}
Commands::Group(cmd) => match cmd {
GroupCommands::Enable { list, workspace } => {
group::enable(list, workspace)?;
}
GroupCommands::Disable { list, workspace } => {
group::disable(list, workspace)?;
}
},
Commands::Sync { setup, push, pull, status, all, workspace } => {
if setup {
sync::setup(workspace)?;
} else if status {
sync::status(workspace, all)?;
} else {
let mode = if push {
bevy_tasks_core::sync::SyncMode::Push
} else if pull {
bevy_tasks_core::sync::SyncMode::Pull
} else {
bevy_tasks_core::sync::SyncMode::Full
};
sync::execute(mode, workspace)?;
}
},
}
Ok(())
}

View file

@ -1,33 +0,0 @@
use colored::*;
pub fn success(message: &str) {
println!("{} {}", "".green(), message);
}
pub fn error(message: &str) {
eprintln!("{} {}", "".red(), message);
}
pub fn warning(message: &str) {
println!("{} {}", "".yellow(), message);
}
pub fn info(message: &str) {
println!("{} {}", "".blue(), message);
}
pub fn header(message: &str) {
println!("{}", message.bold());
}
pub fn detail(label: &str, value: &str) {
println!(" {}: {}", label, value);
}
pub fn item(message: &str) {
println!(" {}", message);
}
pub fn blank() {
println!();
}

View file

@ -204,18 +204,20 @@ fn print_workspace_status(name: &str, path: &std::path::Path, webdav_url: Option
     Ok(())
 }

-/// Extract domain from a URL for credential storage.
+/// Extract host from a URL for credential storage.
 fn extract_domain(url: &str) -> String {
-    url.split("://")
-        .nth(1)
-        .unwrap_or(url)
-        .split('/')
-        .next()
-        .unwrap_or(url)
-        .split(':')
-        .next()
-        .unwrap_or(url)
-        .to_string()
+    // Strip scheme
+    let after_scheme = url.split("://").nth(1).unwrap_or(url);
+    // Strip path
+    let authority = after_scheme.split('/').next().unwrap_or(after_scheme);
+    // Strip userinfo (user:pass@host)
+    let host_port = if let Some(at_pos) = authority.rfind('@') {
+        &authority[at_pos + 1..]
+    } else {
+        authority
+    };
+    // Strip port
+    host_port.split(':').next().unwrap_or(host_port).to_string()
 }

 /// Prompt the user for text input.
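The hunk above replaces a chained-split one-liner that mishandled userinfo: on a URL like `https://user:pass@host:port/path`, the old helper split on `:` before stripping `user:pass@`, so it returned `user` instead of the host. A standalone sketch of the new logic (function body copied from this hunk; the `main` and inputs are illustrative):

```rust
// Standalone copy of the rewritten extract_domain from this hunk,
// for illustration outside the onyx-cli crate.
fn extract_domain(url: &str) -> String {
    // Strip scheme
    let after_scheme = url.split("://").nth(1).unwrap_or(url);
    // Strip path
    let authority = after_scheme.split('/').next().unwrap_or(after_scheme);
    // Strip userinfo (user:pass@host)
    let host_port = if let Some(at_pos) = authority.rfind('@') {
        &authority[at_pos + 1..]
    } else {
        authority
    };
    // Strip port
    host_port.split(':').next().unwrap_or(host_port).to_string()
}

fn main() {
    // The pre-patch version returned "user" for this input.
    assert_eq!(
        extract_domain("https://user:pass@dav.example.com:8443/remote.php/dav"),
        "dav.example.com"
    );
    assert_eq!(extract_domain("http://example.com/path"), "example.com");
    // Scheme-less input falls through each unwrap_or unchanged.
    assert_eq!(extract_domain("example.com"), "example.com");
}
```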

View file

@ -168,33 +168,59 @@ pub fn migrate(name: String, new_path: String) -> Result<()> {
return Ok(()); return Ok(());
} }
// Validate destination
if old_path == new_path_buf {
anyhow::bail!("Source and destination paths are the same");
}
if new_path_buf.exists() && new_path_buf.read_dir()?.next().is_some() {
anyhow::bail!("Destination directory '{}' already contains files", new_path_buf.display());
}
// Create destination directory // Create destination directory
std::fs::create_dir_all(&new_path_buf)?; std::fs::create_dir_all(&new_path_buf)?;
// Move files // Move files, tracking what was moved for rollback
    output::info("Moving files...");
    let entries: Vec<_> = std::fs::read_dir(&old_path)?
        .collect::<std::result::Result<Vec<_>, _>>()?;
    let mut moved: Vec<(std::path::PathBuf, std::path::PathBuf)> = Vec::new();

    let move_result: Result<()> = (|| {
        for entry in &entries {
            let file_name = entry.file_name();
            let dest = new_path_buf.join(&file_name);
            if entry.path().is_dir() {
                let mut options = fs_extra::dir::CopyOptions::new();
                options.copy_inside = true;
                fs_extra::dir::move_dir(entry.path(), &new_path_buf, &options)?;
            } else {
                std::fs::rename(entry.path(), &dest)?;
            }
            moved.push((entry.path(), dest));
            output::item(&format!("Moved {}", file_name.to_string_lossy()));
        }
        Ok(())
    })();

    if let Err(e) = move_result {
        output::error(&format!("Migration failed: {}. Rolling back...", e));
        for (src, dest) in moved.into_iter().rev() {
            if dest.exists() {
                if dest.is_dir() {
                    let mut options = fs_extra::dir::CopyOptions::new();
                    options.copy_inside = true;
                    let _ = fs_extra::dir::move_dir(&dest, &old_path, &options);
                } else {
                    let _ = std::fs::rename(&dest, &src);
                }
            }
        }
        anyhow::bail!("Migration failed and was rolled back: {}", e);
    }

    // Remove old directory if empty
    if old_path.exists() && old_path.read_dir()?.next().is_none() {
        std::fs::remove_dir(&old_path)?;
    }

@@ -202,7 +228,7 @@ pub fn migrate(name: String, new_path: String) -> Result<()> {
    config.add_workspace(name.clone(), WorkspaceConfig::new(new_path_buf.clone()));
    save_config(&config)?;

    output::success(&format!("Migrated {} items to {}", moved.len(), new_path_buf.display()));
    output::success(&format!("Workspace \"{}\" now points to {}", name, new_path_buf.display()));
    Ok(())
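The rollback in the migrate hunk above follows a general compensation pattern: perform the work inside an immediately invoked closure, record each completed step, and on failure undo the recorded steps in reverse order. A minimal stdlib sketch of the same shape (the in-memory `BTreeMap` store and the `fail_on` fault-injection parameter are illustrative stand-ins, not onyx code):

```rust
use std::collections::BTreeMap;

/// Move every key from `src` to `dst`, recording each move so a mid-way
/// failure can be compensated in reverse order.
fn migrate_all(
    src: &mut BTreeMap<String, String>,
    dst: &mut BTreeMap<String, String>,
    fail_on: Option<&str>, // injected fault for demonstration
) -> Result<usize, String> {
    let keys: Vec<String> = src.keys().cloned().collect();
    let mut moved: Vec<String> = Vec::new();
    let result: Result<(), String> = (|| {
        for key in &keys {
            if Some(key.as_str()) == fail_on {
                return Err(format!("simulated failure on {}", key));
            }
            let value = src.remove(key).expect("key listed above");
            dst.insert(key.clone(), value);
            moved.push(key.clone());
        }
        Ok(())
    })();
    if let Err(e) = result {
        // Roll back in reverse order, restoring already-moved entries.
        for key in moved.into_iter().rev() {
            if let Some(value) = dst.remove(&key) {
                src.insert(key, value);
            }
        }
        return Err(e);
    }
    Ok(keys.len())
}

fn main() {
    let mut src = BTreeMap::from([
        ("a".to_string(), "1".to_string()),
        ("b".to_string(), "2".to_string()),
    ]);
    let mut dst = BTreeMap::new();

    // Failure on "b" rolls "a" back into the source store.
    assert!(migrate_all(&mut src, &mut dst, Some("b")).is_err());
    assert_eq!(src.len(), 2);
    assert!(dst.is_empty());

    // Without the injected fault, everything moves.
    assert_eq!(migrate_all(&mut src, &mut dst, None), Ok(2));
    assert!(src.is_empty());
}
```

The rollback itself uses `let _ =` on each undo, mirroring the real code: a failure during rollback should not mask the original error.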


@@ -22,6 +22,7 @@ sha2 = { workspace = true }
quick-xml = { workspace = true }
tokio = { workspace = true }
keyring = { version = "3", features = ["apple-native", "windows-native", "sync-secret-service"], optional = true }
zeroize = "1"

[dev-dependencies]
tempfile = "3.0"


@@ -82,9 +82,9 @@ impl AppConfig {
    }

    pub fn get_config_path() -> PathBuf {
        directories::ProjectDirs::from("", "", "onyx")
            .map(|dirs| dirs.config_dir().join("config.json"))
            .unwrap_or_else(|| PathBuf::from("onyx-config.json"))
    }
}
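The change above swaps a panicking `.expect(...)` for a graceful fallback. The shape is easy to isolate with stdlib types alone (the `config_path` function below is a hypothetical stand-in for the `directories::ProjectDirs` lookup, which returns `None` on platforms without a known config directory):

```rust
use std::path::PathBuf;

/// Resolve a config path from an optional platform directory, falling back
/// to a file in the current directory instead of panicking.
fn config_path(config_dir: Option<PathBuf>) -> PathBuf {
    config_dir
        .map(|dir| dir.join("config.json"))
        .unwrap_or_else(|| PathBuf::from("onyx-config.json"))
}

fn main() {
    assert_eq!(
        config_path(Some(PathBuf::from("/home/user/.config/onyx"))),
        PathBuf::from("/home/user/.config/onyx/config.json")
    );
    // No platform directory available: fall back rather than panic.
    assert_eq!(config_path(None), PathBuf::from("onyx-config.json"));
}
```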


@@ -75,7 +75,11 @@ impl TaskRepository {
    pub fn move_task(&mut self, from_list_id: Uuid, to_list_id: Uuid, task_id: Uuid) -> Result<()> {
        let task = self.storage.read_task(from_list_id, task_id)?;
        self.storage.write_task(to_list_id, &task)?;
        // If delete from source fails, roll back by removing the copy from destination
        if let Err(e) = self.storage.delete_task(from_list_id, task_id) {
            let _ = self.storage.delete_task(to_list_id, task_id);
            return Err(e);
        }
        Ok(())
    }
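The fix above turns move into copy-then-delete with compensation: if the source delete fails, the destination copy is removed so no duplicate task survives. A self-contained sketch using in-memory maps (the `Store` alias and `delete_fails` fault flag are illustrative, not part of the repository API):

```rust
use std::collections::HashMap;

type Store = HashMap<u32, String>; // hypothetical stand-in for list storage

/// Copy-then-delete with compensation: if the source delete fails, the
/// destination copy is removed before the error is surfaced.
fn move_item(
    from: &mut Store,
    to: &mut Store,
    id: u32,
    delete_fails: bool, // injected fault for demonstration
) -> Result<(), String> {
    let task = from.get(&id).cloned().ok_or("task not found")?;
    to.insert(id, task);
    if delete_fails {
        // Roll back the write before returning the error.
        to.remove(&id);
        return Err("delete from source failed".into());
    }
    from.remove(&id);
    Ok(())
}

fn main() {
    let mut a = Store::from([(7, "Buy groceries".to_string())]);
    let mut b = Store::new();

    // Failed move leaves both stores exactly as they were.
    assert!(move_item(&mut a, &mut b, 7, true).is_err());
    assert!(a.contains_key(&7) && b.is_empty());

    // Successful move ends with the task in exactly one store.
    assert!(move_item(&mut a, &mut b, 7, false).is_ok());
    assert!(!a.contains_key(&7) && b.contains_key(&7));
}
```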


@@ -149,8 +149,25 @@ impl FileSystemStorage {
        Err(Error::ListNotFound(list_id.to_string()))
    }

    fn list_dir_path_by_name(&self, name: &str) -> Result<PathBuf> {
        // Reject names containing path separators or traversal components
        if name.contains('/') || name.contains('\\') || name == ".." || name.starts_with("../") || name.starts_with("..\\") {
            return Err(Error::InvalidData("Invalid list name: path traversal not allowed".to_string()));
        }
        let path = self.root_path.join(name);
        // Verify resolved path stays within root
        let canonical_root = self.root_path.canonicalize()
            .map_err(Error::Io)?;
        let canonical_path = if path.exists() {
            path.canonicalize().map_err(Error::Io)?
        } else {
            // Parent must exist and be canonicalizable (it's root_path)
            canonical_root.join(path.file_name().unwrap_or_default())
        };
        if !canonical_path.starts_with(&canonical_root) {
            return Err(Error::InvalidData("Invalid list name: path escapes workspace".to_string()));
        }
        Ok(path)
    }

    fn sanitize_filename(name: &str) -> String {

@@ -371,7 +388,7 @@ impl Storage for FileSystemStorage {
    }

    fn create_list(&mut self, name: String) -> Result<TaskList> {
        let list_dir = self.list_dir_path_by_name(&name)?;
        if list_dir.exists() {
            return Err(Error::InvalidData(format!("List '{}' already exists", name)));

@@ -473,7 +490,7 @@ impl Storage for FileSystemStorage {
    fn rename_list(&mut self, list_id: Uuid, new_name: String) -> Result<()> {
        let old_dir = self.list_dir_path(list_id)?;
        let new_dir = self.list_dir_path_by_name(&new_name)?;
        if new_dir.exists() {
            return Err(Error::InvalidData(format!("A list named '{}' already exists", new_name)));
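The traversal guard above combines two checks: a syntactic reject of separator and `..` components, then a canonicalized containment check. The syntactic half can be expressed with `std::path::Component`, which is a conservative stdlib alternative to hand-written `contains`/`starts_with` checks (this standalone `is_safe_list_name` is a sketch, not the onyx function, and it deliberately skips the symlink-resolving canonicalize step):

```rust
use std::path::{Component, Path};

/// Accept only names that parse as exactly one normal path component,
/// rejecting separators, "..", ".", and empty names.
fn is_safe_list_name(name: &str) -> bool {
    if name.is_empty() || name.contains('/') || name.contains('\\') {
        return false;
    }
    let mut components = Path::new(name).components();
    matches!(
        (components.next(), components.next()),
        (Some(Component::Normal(_)), None)
    )
}

fn main() {
    assert!(is_safe_list_name("My Tasks"));
    assert!(!is_safe_list_name("../etc"));   // separator plus traversal
    assert!(!is_safe_list_name(".."));       // ParentDir component
    assert!(!is_safe_list_name("."));        // CurDir component
    assert!(!is_safe_list_name("a/b"));
    assert!(!is_safe_list_name("a\\b"));
    assert!(!is_safe_list_name(""));
}
```

Component-based parsing avoids per-platform string matching, but it does not replace the canonicalize check: only resolving the real path catches symlinks that point outside the workspace.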


@@ -129,7 +129,8 @@ pub fn compute_sync_actions(
            // Both present, base known: check for changes
            (Some(l), Some(r), Some(b)) => {
                let local_changed = l.checksum != b.checksum;
                // Compare remote vs base using parsed timestamps to avoid format mismatches
                let remote_changed = r.size != b.size || !timestamps_equal(r.last_modified.as_deref(), b.modified_at.as_deref());
                match (local_changed, remote_changed) {
                    (false, false) => {} // Skip, unchanged
@@ -173,7 +174,7 @@ pub fn compute_sync_actions(
            // Remote present, local gone, base known: local was deleted
            (None, Some(_), Some(b)) => {
                let remote_changed = remote.is_some_and(|r| r.size != b.size || !timestamps_equal(r.last_modified.as_deref(), b.modified_at.as_deref()));
                if remote_changed {
                    // deleted locally + modified remotely -> download (remote wins)
                    actions.push(SyncAction::Download { path: path.to_string() });
@@ -197,6 +198,23 @@ pub fn compute_sync_actions(
    actions
}

/// Compare two timestamps for equality by parsing both, tolerating format differences.
fn timestamps_equal(a: Option<&str>, b: Option<&str>) -> bool {
    match (a, b) {
        (None, None) => true,
        (Some(a), Some(b)) => {
            // Try string equality first (fast path)
            if a == b { return true; }
            // Parse both and compare as DateTime
            match (parse_timestamp(a), parse_timestamp(b)) {
                (Some(ta), Some(tb)) => ta == tb,
                _ => false,
            }
        }
        _ => false,
    }
}

/// Determine if local wins based on timestamps. True means local wins.
fn local_wins(local_modified: Option<&str>, remote_modified: Option<&str>) -> bool {
    // Try parsing both; if we can't parse, local wins by default
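The motivation for `timestamps_equal` is that the same instant can be serialized in more than one valid RFC 3339 form, e.g. a trailing `Z` versus an explicit `+00:00` offset, so raw string comparison produces false "remote changed" signals. The real code delegates to a `parse_timestamp` helper; this stdlib sketch only normalizes the UTC suffix, which is one of the mismatches a full parsed comparison tolerates:

```rust
/// Normalize the UTC designator so "…Z" and "…+00:00" compare equal.
/// This covers only the UTC suffix, not other offsets or fractional
/// seconds — a full datetime parser handles those cases too.
fn normalize_utc(ts: &str) -> String {
    match ts.strip_suffix('Z') {
        Some(head) => format!("{}+00:00", head),
        None => ts.to_string(),
    }
}

fn timestamps_equal(a: Option<&str>, b: Option<&str>) -> bool {
    match (a, b) {
        (None, None) => true,
        (Some(a), Some(b)) => a == b || normalize_utc(a) == normalize_utc(b),
        _ => false,
    }
}

fn main() {
    // Same instant, two serializations: string equality fails, this passes.
    assert!(timestamps_equal(
        Some("2026-04-02T17:39:00Z"),
        Some("2026-04-02T17:39:00+00:00")
    ));
    assert!(timestamps_equal(None, None));
    assert!(!timestamps_equal(Some("2026-04-02T17:39:00Z"), None));
    assert!(!timestamps_equal(
        Some("2026-04-02T17:39:00Z"),
        Some("2026-04-02T17:39:01Z")
    ));
}
```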
@@ -258,8 +276,19 @@ impl OfflineQueue {
            return Self::default();
        }
        match std::fs::read_to_string(&queue_path) {
            Ok(content) => match serde_json::from_str(&content) {
                Ok(queue) => queue,
                Err(e) => {
                    eprintln!("Warning: corrupt sync queue, backing up and resetting: {}", e);
                    let backup = workspace_path.join(".syncqueue.json.bak");
                    let _ = std::fs::copy(&queue_path, &backup);
                    Self::default()
                }
            },
            Err(e) => {
                eprintln!("Warning: failed to read sync queue: {}", e);
                Self::default()
            }
        }
    }
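The loading change above replaces a silent `unwrap_or_default()` with a warn-backup-reset sequence, so corrupt state is preserved for inspection instead of being discarded. The same pattern, reduced to stdlib parsing of an integer file (the `load_counter` function and the temp-dir layout are illustrative, not onyx code):

```rust
use std::fs;
use std::path::Path;

/// Load a counter from disk; if the file is corrupt, back it up and
/// reset to the default instead of silently discarding the bytes.
fn load_counter(path: &Path) -> u64 {
    let Ok(content) = fs::read_to_string(path) else { return 0 };
    match content.trim().parse() {
        Ok(n) => n,
        Err(_) => {
            // Preserve evidence for debugging before resetting.
            let _ = fs::copy(path, path.with_extension("bak"));
            0
        }
    }
}

fn main() {
    let dir = std::env::temp_dir().join(format!("onyx-demo-{}", std::process::id()));
    fs::create_dir_all(&dir).unwrap();
    let path = dir.join("counter.txt");

    fs::write(&path, "42").unwrap();
    assert_eq!(load_counter(&path), 42);

    fs::write(&path, "not a number").unwrap();
    assert_eq!(load_counter(&path), 0);
    assert!(path.with_extension("bak").exists()); // corrupt copy kept

    let _ = fs::remove_dir_all(&dir);
}
```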
@@ -338,12 +367,23 @@ pub fn compute_checksum(data: &[u8]) -> String {
    format!("{:x}", hasher.finalize())
}

/// Check if a file is syncable: *.md files and metadata files at expected depths.
fn is_syncable(path: &str) -> bool {
    let parts: Vec<&str> = path.split('/').collect();
    let filename = parts.last().copied().unwrap_or(path);
    // .metadata.json only at workspace root (depth 1)
    if filename == ".metadata.json" {
        return parts.len() == 1;
    }
    // .listdata.json only inside a list directory (depth 2)
    if filename == ".listdata.json" {
        return parts.len() == 2;
    }
    // .md files inside a list directory (depth 2)
    if filename.ends_with(".md") {
        return parts.len() == 2;
    }
    false
}

/// Scan local workspace files and compute checksums.
@@ -560,12 +600,11 @@ async fn execute_action(
    report: &(dyn Fn(&str) + Send + Sync),
) -> Result<()> {
    match action {
        SyncAction::Upload { path } => {
            let local_path = workspace_path.join(path.replace('/', std::path::MAIN_SEPARATOR_STR));
            let data = std::fs::read(&local_path)?;
            let checksum = compute_checksum(&data);
            if let Some(parent) = path_parent(path) {
                client.ensure_dir(parent).await?;
            }
@@ -580,7 +619,25 @@ async fn execute_action(
            sync_state.record_file(path, &checksum, modified.as_deref(), data.len() as u64);
        }

        SyncAction::ConflictLocalWins { path } => {
            let local_path = workspace_path.join(path.replace('/', std::path::MAIN_SEPARATOR_STR));
            let data = std::fs::read(&local_path)?;
            let checksum = compute_checksum(&data);
            if let Some(parent) = path_parent(path) {
                client.ensure_dir(parent).await?;
            }
            report(&format!("  ^ Conflict: uploading local version of {}", path));
            client.put_file(path, data.clone()).await?;
            let modified = std::fs::metadata(&local_path).ok()
                .and_then(|m| m.modified().ok())
                .map(|t| { let dt: DateTime<Utc> = t.into(); dt.to_rfc3339() });
            sync_state.record_file(path, &checksum, modified.as_deref(), data.len() as u64);
        }

        SyncAction::Download { path } => {
            report(&format!("  v Downloading {}", path));
            let data = client.get_file(path).await?;
            let checksum = compute_checksum(&data);
@@ -598,6 +655,29 @@ async fn execute_action(
            sync_state.record_file(path, &checksum, modified.as_deref(), data.len() as u64);
        }

        SyncAction::ConflictRemoteWins { path } => {
            let local_path = workspace_path.join(path.replace('/', std::path::MAIN_SEPARATOR_STR));
            // Back up local version before overwriting with remote
            if local_path.exists() {
                let backup_path = local_path.with_extension("conflict-backup");
                let _ = std::fs::copy(&local_path, &backup_path);
                report(&format!("  ! Backed up local version to {}", backup_path.display()));
            }
            report(&format!("  v Conflict: downloading remote version of {}", path));
            let data = client.get_file(path).await?;
            let checksum = compute_checksum(&data);
            if let Some(parent) = local_path.parent() {
                std::fs::create_dir_all(parent)?;
            }
            std::fs::write(&local_path, &data)?;
            let modified = std::fs::metadata(&local_path).ok()
                .and_then(|m| m.modified().ok())
                .map(|t| { let dt: DateTime<Utc> = t.into(); dt.to_rfc3339() });
            sync_state.record_file(path, &checksum, modified.as_deref(), data.len() as u64);
        }

        SyncAction::DeleteLocal { path } => {
            report(&format!("  x Deleting local {}", path));
            let local_path = workspace_path.join(path.replace('/', std::path::MAIN_SEPARATOR_STR));
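One subtlety in the backup step above: `Path::with_extension` replaces an existing extension rather than appending to it, so `task.md` becomes `task.conflict-backup`, not `task.md.conflict-backup`. Two conflicting files that differ only by extension would therefore share a backup name. A quick stdlib demonstration of both behaviors:

```rust
use std::path::PathBuf;

fn main() {
    let local = PathBuf::from("My Tasks/Buy groceries.md");

    // with_extension REPLACES the ".md" extension.
    let backup = local.with_extension("conflict-backup");
    assert_eq!(backup, PathBuf::from("My Tasks/Buy groceries.conflict-backup"));

    // Appending via OsString keeps backup names distinct per source file.
    let mut appended = local.clone().into_os_string();
    appended.push(".conflict-backup");
    assert_eq!(
        PathBuf::from(appended),
        PathBuf::from("My Tasks/Buy groceries.md.conflict-backup")
    );
}
```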
@@ -1009,14 +1089,20 @@ mod tests {
    #[test]
    fn test_is_syncable() {
        // .md files must be inside a list dir (depth 2)
        assert!(is_syncable("My Tasks/Buy groceries.md"));
        assert!(!is_syncable("file.md")); // root-level md not valid
        // .listdata.json inside a list dir (depth 2)
        assert!(is_syncable("My Tasks/.listdata.json"));
        assert!(!is_syncable(".listdata.json")); // root-level not valid
        // .metadata.json only at root (depth 1)
        assert!(is_syncable(".metadata.json"));
        assert!(!is_syncable("My Tasks/.metadata.json")); // nested not valid
        // Non-syncable
        assert!(!is_syncable(".syncstate.json"));
        assert!(!is_syncable("random.txt"));
        assert!(!is_syncable("image.png"));
        assert!(!is_syncable("a/b/c/deep.md")); // too deep
    }

    #[test]


@@ -1,4 +1,5 @@
use reqwest::Client;
use zeroize::Zeroize;

use crate::error::{Error, Result};

/// Information about a file on the remote WebDAV server.

@@ -18,6 +19,13 @@ pub struct WebDavClient {
    _password: String,
}

impl Drop for WebDavClient {
    fn drop(&mut self) {
        self._password.zeroize();
        self._username.zeroize();
    }
}

impl WebDavClient {
    pub fn new(base_url: &str, username: &str, password: &str) -> Self {
        let base_url = base_url.trim_end_matches('/').to_string();
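The `Drop` impl above uses the `zeroize` crate, whose writes are guaranteed not to be elided by the optimizer. The underlying pattern can be sketched with plain stdlib code (the `Secret` type here is hypothetical, and a plain zeroing loop like this one may be optimized away, which is precisely why `zeroize` uses volatile writes in real code):

```rust
struct Secret(String);

impl Secret {
    /// Overwrite the backing bytes with zeros. Writing NUL bytes keeps the
    /// String valid UTF-8, satisfying `as_bytes_mut`'s safety contract.
    /// Illustrative only: nothing stops the compiler eliding these writes.
    fn wipe(&mut self) {
        unsafe {
            for b in self.0.as_bytes_mut() {
                *b = 0;
            }
        }
    }
}

impl Drop for Secret {
    fn drop(&mut self) {
        self.wipe(); // best-effort wipe when the secret goes out of scope
    }
}

fn main() {
    let mut s = Secret("hunter2".to_string());
    s.wipe();
    // Length is unchanged; the contents are zeroed in place.
    assert_eq!(s.0.len(), 7);
    assert!(s.0.bytes().all(|b| b == 0));
}
```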


@@ -441,10 +441,9 @@ Key test areas:

## Thread Safety

The `Storage` trait requires `Send + Sync`, and `TaskRepository` wraps `Box<dyn Storage + Send + Sync>`, so repository instances can be shared across threads behind a `Mutex`. The Tauri GUI uses `Mutex<AppState>` for this purpose.

For concurrent access:

1. Wrap `TaskRepository` in `Mutex` or `RwLock` (the Tauri app does this)
2. Or create separate repository instances per thread (file system handles locking)
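The first option can be sketched with stdlib primitives alone. This is a minimal illustration of the wrapping, assuming a stand-in `Repository` type rather than the real `TaskRepository` API:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Hypothetical stand-in for TaskRepository; the point is the Arc<Mutex<..>>
// wrapping, not the repository's methods.
struct Repository {
    tasks: Vec<String>,
}

impl Repository {
    fn add_task(&mut self, title: &str) {
        self.tasks.push(title.to_string());
    }
}

fn main() {
    // Shared across threads behind Arc<Mutex<..>>, as the Tauri app does
    // with its AppState.
    let repo = Arc::new(Mutex::new(Repository { tasks: Vec::new() }));

    let handles: Vec<_> = (0..4)
        .map(|i| {
            let repo = Arc::clone(&repo);
            thread::spawn(move || {
                // Each thread takes the lock for the duration of one call.
                repo.lock().unwrap().add_task(&format!("task {}", i));
            })
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }

    assert_eq!(repo.lock().unwrap().tasks.len(), 4);
}
```

`RwLock` is the drop-in alternative when reads dominate: many readers proceed concurrently and only writers take the lock exclusively.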