Project Overview

NoteGen is a cross-platform Markdown note-taking app that combines AI-powered writing assistance, Git-based version control, and real-time synchronization. Capture, organize and revisit your ideas—from text and screenshots to audio recordings—across all your devices.

Purpose

Enable seamless, intelligent note creation and management for developers, researchers, and knowledge workers.

Key Features

  • AI-assisted composition, summarization and translation
  • Full Markdown editor with live preview and custom themes
  • Git integration for versioned note history and diffs
  • Multimedia support: images, screen captures, audio recordings, file attachments
  • Tagging, full-text search and customizable folder structures
  • Sync via GitHub, WebDAV or self-hosted servers

Supported Platforms

  • Desktop: macOS, Windows, Linux
  • Web: Chrome, Firefox, Safari, Edge
  • Mobile: iOS, Android

Value Proposition

NoteGen delivers AI-enhanced note workflows with built-in version control and cross-device sync. Maintain a single source of truth for your ideas—anytime, anywhere.

Getting Started

Get NoteGen up and running in minutes. Choose a pre-built installer or build from source.

1. Install Pre-built Release

Visit the Releases page and download the package for your platform:
https://github.com/codexu/note-gen/releases

Supported targets:

  • macOS (dmg: Intel / Apple Silicon)
  • Windows (installer: x64 / arm64)
  • Linux (AppImage, .deb, .rpm)

After download:

  • macOS: open the .dmg and drag NoteGen to Applications.
  • Windows: run the .exe installer.
  • Linux: grant execute on the AppImage or install the .deb/.rpm via your package manager.

2. Run from Source

Prerequisites

  • Node >=18 (npm or Yarn)
  • Rust >=1.60
  • Tauri CLI
    # via Cargo
    cargo install tauri-cli
    # or via npm
    npm install -g @tauri-apps/cli
    

Clone, Install, and Develop

git clone https://github.com/codexu/note-gen.git
cd note-gen
npm install
npm run dev           # starts Next.js frontend on http://localhost:3456
npm run tauri dev     # opens NoteGen desktop window

Tauri watches src-tauri/src; Rust changes recompile automatically.

Production Build

npm run build         # builds and exports Next.js static files
npm run tauri build   # bundles native installers for macOS/Windows/Linux

Find installers in
src-tauri/target/release/bundle

3. Folder Structure at a Glance

.
├── package.json            # project scripts & dependencies
├── next.config.ts          # static export & i18n config
├── tsconfig.json           # TypeScript compiler options
├── public/                 # static images & fonts
├── src/                    # Next.js pages & React components
├── src-tauri/              # Tauri & Rust application
│   ├── Cargo.toml          # Rust crate & plugin dependencies
│   ├── tauri.conf.json     # window, bundle & updater settings
│   └── src/                # Rust source (main.rs, plugins)
└── .github/
    └── workflows/
        └── release.yml     # CI: multi-platform build & release

With this setup, you can explore the frontend in src/, extend Rust APIs in src-tauri/src/, and customize bundling via tauri.conf.json.

Architecture Overview

This section outlines the core layers of the application—desktop (Rust/Tauri), frontend (Next.js), and data (SQLite)—and shows how they interact at runtime.

1. Rust Desktop Layer (src-tauri/src/main.rs)

Sets up the Tauri runtime, plugins, system tray, single-instance guard and command handlers.

Initialization Snippet

fn main() {
  tauri::Builder::default()
    // Ensure only one instance runs
    .plugin(tauri_plugin_single_instance::init(|app, _, _| {
      if let Some(win) = app.get_window("main") {
        win.show().ok();
        win.set_focus().ok();
      }
    }))
    // System tray with click handlers
    .plugin(
      tauri_plugin_tray::Builder::new()
        .with_menu(tray_menu())
        .with_icon(tray_icon())
        .on_event(tray_event_handler)
        .build()
    )
    // Auto-update via GitHub or custom endpoint
    .plugin(tauri_plugin_updater::Builder::new().build())
    // WebDAV backup/sync
    .plugin(tauri_plugin_webdav::Builder::new().build())
    // Fuzzy search & keyword ranking
    .plugin(tauri_plugin_fuzzy::init())
    .plugin(tauri_plugin_rank::init())
    // Register Tauri commands for chats, notes, tags, vector, etc.
    .invoke_handler(tauri::generate_handler![
      init_databases,
      chat_list,
      chat_create,
      mark_list,
      note_save,
      tag_list,
      vector_search,
      webdav_sync,
      fuzzy_search,
      rank_keywords,
    ])
    .run(tauri::generate_context!())
    .expect("error while running tauri application");
}

Key points

  • Plugin order: single_instance first, then tray, updater, external integrations, custom commands.
  • All business logic lives in commands (Rust functions marked with #[tauri::command]).
  • Frontend invokes commands via @tauri-apps/api/tauri.
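
A minimal sketch of that bridge from the frontend side, assuming the note_save command registered above takes a path and content (the actual argument names live in src-tauri/src):

import { invoke } from '@tauri-apps/api/tauri';

// Command names match the identifiers listed in generate_handler![...]
async function saveNote(path: string, content: string): Promise<void> {
  await invoke('note_save', { path, content });
}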

2. Frontend Layout (src/app/layout.tsx)

Defines the root HTML structure, global styles, mobile meta tags, i18n context and toast notifications.

RootLayout Component

import React from "react";
import "./globals.css";
import { I18nProvider } from "./i18n";
import { Toaster } from "react-hot-toast";

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html lang="en" suppressHydrationWarning>
      <head>
        <meta name="viewport" content="width=device-width, initial-scale=1" />
        <meta name="theme-color" content="#fff" />
        <script src="/fix-markdown-it.js"></script>
      </head>
      <body>
        <I18nProvider>
          {children}
          <Toaster position="bottom-right" />
        </I18nProvider>
      </body>
    </html>
  );
}

Highlights

  • suppressHydrationWarning for SSR/CSR consistency.
  • Injects a small script to patch markdown-it rendering.
  • Wraps pages in I18nProvider for locale switching.
  • Renders a global <Toaster> for notifications.

3. Data Layer (src/db/index.ts)

Manages a singleton SQLite connection, initializes separate databases for chats, marks, notes, tags and vector indexing.

Database Initialization

import { open } from "sqlite";
import sqlite3 from "sqlite3";
import { initChatsDb } from "./chats";
import { initMarksDb } from "./marks";
import { initNotesDb } from "./notes";
import { initTagsDb } from "./tags";
import { initVectorDb } from "./vector";

type DbName = "chats" | "marks" | "notes" | "tags" | "vector";
const dbMap: Partial<Record<DbName, sqlite3.Database>> = {};

export async function initDatabases(dataDir: string) {
  dbMap.chats = await open({ filename: `${dataDir}/chats.db`, driver: sqlite3.Database });
  await initChatsDb(dbMap.chats!);
  dbMap.marks = await open({ filename: `${dataDir}/marks.db`, driver: sqlite3.Database });
  await initMarksDb(dbMap.marks!);
  dbMap.notes = await open({ filename: `${dataDir}/notes.db`, driver: sqlite3.Database });
  await initNotesDb(dbMap.notes!);
  dbMap.tags = await open({ filename: `${dataDir}/tags.db`, driver: sqlite3.Database });
  await initTagsDb(dbMap.tags!);
  dbMap.vector = await open({ filename: `${dataDir}/vector.db`, driver: sqlite3.Database });
  await initVectorDb(dbMap.vector!);
}

export function getDb(name: DbName) {
  const db = dbMap[name];
  if (!db) throw new Error(`Database ${name} not initialized`);
  return db;
}

Usage in commands:

#[tauri::command]
async fn chat_list() -> Vec<Chat> {
  // calls into JS-bridge which then uses getDb("chats") under the hood
}

Call Flow

  1. Startup
    • main.rs launches Tauri, invokes init_databases(data_dir).
    • Each sub-DB creates tables and indices.
  2. UI Rendering
    • Next.js serves pages; layout.tsx sets up global context.
  3. User Action
    • UI calls invoke("chat_list") or other commands via @tauri-apps/api/tauri.
  4. Command Execution
    • Rust handler uses getDb(…) to query SQLite, applies business logic, returns JSON.
  5. UI Update
    • Frontend receives results, updates components, shows a toast if needed.

This layered architecture ensures clear separation between UI, backend logic and data persistence, while enabling tight integration through Tauri’s command bridge.
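
As a concrete sketch of steps 3–5 (assuming chat_list returns an array of chat records; the real field names are defined by the Rust Chat struct):

import { invoke } from '@tauri-apps/api/tauri';
import toast from 'react-hot-toast';

// Hypothetical record shape for illustration only.
interface Chat { id: string; title: string; }

async function refreshChats(setChats: (chats: Chat[]) => void) {
  try {
    // Steps 3–4: cross the Tauri bridge into the Rust handler.
    const chats = await invoke<Chat[]>('chat_list');
    // Step 5: update component state with the returned JSON.
    setChats(chats);
  } catch (err) {
    toast.error(`Failed to load chats: ${err}`);
  }
}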

Core Concepts

Core abstractions that power note-gen:

  • Workspace path utilities: resolve file system paths in default or custom workspaces
  • RAG indexing: chunk markdown files, compute embeddings, and upsert into a vector store
  • Sync API: upload notes and assets to GitHub, Gitee or GitLab with built-in base64 conversion and error handling
  • Shortcut management: define, register and emit global keyboard shortcuts via Electron IPC

Workspace Path Utilities

Provide a unified way to resolve file paths inside the user’s workspace—whether it’s the default AppData/article folder or a custom directory—so FS operations (read, write, stat, mkdir, etc.) work seamlessly.

getWorkspacePath

Retrieve the root workspace path and detect if it’s user-customized.
Signature:

async function getWorkspacePath(): Promise<{ path: string; isCustom: boolean }>;
  • If store.json contains a workspacePath, returns that absolute path with isCustom: true.
  • Otherwise returns { path: 'article', isCustom: false } relative to BaseDirectory.AppData.

Example:

const { path: root, isCustom } = await getWorkspacePath();
// root: "/Users/alice/myNotes" or "article"

getFilePathOptions

Build arguments for Tauri FS calls given a path relative to the workspace root.
Signature:

async function getFilePathOptions(relativePath: string)
  : Promise<{ path: string; baseDir?: BaseDirectory }>;
  • Custom workspace: { path: absolutePath }
  • Default workspace: { path: `article/${relativePath}`, baseDir: BaseDirectory.AppData }

Use for reading or writing files:

import { readTextFile, writeTextFile } from '@tauri-apps/plugin-fs';

async function loadDoc(rel: string) {
  const opts = await getFilePathOptions(rel);
  return opts.baseDir
    ? await readTextFile(opts.path, { baseDir: opts.baseDir })
    : await readTextFile(opts.path);
}

async function saveDoc(rel: string, content: string) {
  const opts = await getFilePathOptions(rel);
  const args = opts.baseDir ? { baseDir: opts.baseDir } : {};
  await writeTextFile(opts.path, content, args);
}

getGenericPathOptions

Like getFilePathOptions but supports arbitrary AppData subfolders (e.g. images).
Signature:

async function getGenericPathOptions(
  path: string,
  prefix?: string
): Promise<{ path: string; baseDir?: BaseDirectory }>;
  • Custom: always returns an absolute path under the custom root, prepending prefix/ if provided.
  • Default: { path: `${prefix}/${path}`, baseDir: AppData } unless path already starts with prefix.

Example—saving an image:

import { writeBinaryFile } from '@tauri-apps/plugin-fs';

async function saveImage(relPath: string, data: Uint8Array) {
  const opts = await getGenericPathOptions(relPath, 'image');
  const args = opts.baseDir ? { baseDir: opts.baseDir } : {};
  await writeBinaryFile(opts.path, data, args);
}

toWorkspaceRelativePath

Convert any absolute or default-prefixed path back to a path relative to the workspace root.
Signature:

async function toWorkspaceRelativePath(path: string): Promise<string>;
  • Default mode: strips leading article/ segments.
  • Custom mode: removes the absolute custom root prefix (and any leading slash).
  • Leaves already-relative paths untouched.

Example—normalizing results from Tauri’s readDir:

import { readDir, BaseDirectory } from '@tauri-apps/plugin-fs';
import { getWorkspacePath, toWorkspaceRelativePath } from '@/lib/workspace';

async function listMarkdownFiles() {
  const { path: root, isCustom } = await getWorkspacePath();
  const raw = isCustom
    ? await readDir(root)
    : await readDir('article', { baseDir: BaseDirectory.AppData });

  const relPaths = await Promise.all(
    raw
      .filter(e => e.isFile && e.name.endsWith('.md'))
      .map(async e => {
        const full = isCustom
          ? `${root}/${e.name}`
          : `article/${e.name}`;
        return toWorkspaceRelativePath(full);
      })
  );
  return relPaths; // e.g. ['notes/todo.md', 'readme.md']
}

RAG Indexing: Splitting and Indexing Markdown Files

Explain how markdown files are chunked into manageable pieces and indexed in the vector database by computing embeddings for each chunk.

chunkText

Split long text into overlapping chunks, using paragraph then sentence boundaries.
Signature:

function chunkText(
  text: string,
  chunkSize: number = 1000,
  chunkOverlap: number = 200
): string[];
  • If text.length ≤ chunkSize, returns [text].
  • Splits on \n\n (paragraphs), accumulates until adding the next paragraph would exceed chunkSize.
  • When full, pushes the chunk and carries over up to chunkOverlap characters from its end.
  • If a single paragraph > chunkSize, splits further by sentence boundaries (., ?, !).

Example:

const longMd = await readTextFile('notes.md');
const chunks = chunkText(longMd, 800, 150);
console.log(chunks.length); // e.g. 5
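
For reference, a minimal sketch of the paragraph-first strategy described above (the overlap carry-over is simplified, and the sentence-boundary fallback for oversized paragraphs is omitted):

// Simplified illustration of chunkText's accumulation loop.
function chunkTextSketch(text: string, chunkSize = 1000, chunkOverlap = 200): string[] {
  if (text.length <= chunkSize) return [text];
  const chunks: string[] = [];
  let current = '';
  for (const para of text.split('\n\n')) {
    if (current && current.length + para.length + 2 > chunkSize) {
      chunks.push(current);
      // Carry up to chunkOverlap trailing characters into the next chunk.
      current = current.slice(-chunkOverlap);
    }
    current = current ? `${current}\n\n${para}` : para;
  }
  if (current) chunks.push(current);
  return chunks;
}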

processMarkdownFile

Read a markdown file, split into chunks, compute embeddings, and upsert each chunk into the vector store.
Signature:

async function processMarkdownFile(
  filePath: string,
  fileContent?: string
): Promise<boolean>;

Workflow:

  1. Determine workspace path and load file content.
  2. Read ragChunkSize and ragChunkOverlap from store.json.
  3. const chunks = chunkText(content, chunkSize, chunkOverlap)
  4. Derive filename from filePath.
  5. await deleteVectorDocumentsByFilename(filename)
  6. For each chunk:
    a. const embedding = await fetchEmbedding(chunk)
    b. await upsertVectorDocument({ filename, chunk_id: i, content: chunk, embedding: JSON.stringify(embedding), updated_at: Date.now() })
  7. Return true if successful, else log and return false.

Example:

import { processMarkdownFile } from '@/lib/rag';

const success = await processMarkdownFile('article/guide.md');
if (success) console.log('Indexed guide.md successfully');
else console.error('Indexing failed for guide.md');

Sync: Uploading Files to Remote Repositories

Provide a simple API to create or update a file in GitHub, Gitee, or GitLab. Automatically handles base64 conversion, branch, proxy, and error reporting.

Common Helpers

import { fileToBase64 } from '@/lib/github';       // same in gitee.ts & gitlab.ts
import { decodeBase64ToString } from '@/lib/github';

// Convert a browser File into base64 (no data URL prefix)
const base64 = await fileToBase64(myFile);

// Decode a base64‐encoded UTF-8 string
const text = decodeBase64ToString(base64);

GitHub Example

import { uploadFile as uploadToGitHub } from '@/lib/github';
import { RepoNames } from '@/lib/github.types';

async function syncImage(file: File) {
  const ext = file.name.split('.').pop()!;
  const content = await fileToBase64(file);
  const result = await uploadToGitHub({
    ext,
    file: content,
    repo: RepoNames.sync,
    path: 'images',                // optional folder
    message: 'Add new image 📷'
  });
  if (result) {
    console.log('GitHub URL:', result.data.content.html_url);
  }
}

Gitee Example

import { uploadFile as uploadToGitee } from '@/lib/gitee';
import { RepoNames } from '@/lib/github.types';

async function syncDocument(file: File) {
  const ext = file.name.split('.').pop()!;
  const content = await fileToBase64(file);
  const response = await uploadToGitee({
    ext,
    file: content,
    repo: RepoNames.sync,
    filename: 'report.pdf',        // override UUID name
    path: 'docs',
    message: 'Publish monthly report'
  });
  if (response) {
    console.log('Gitee URL:', response.data.content.download_url);
  }
}

GitLab Example

import { uploadFile as uploadToGitLab } from '@/lib/gitlab';
import { RepoNames } from '@/lib/github.types';

async function syncScript(file: File) {
  const ext = file.name.split('.').pop()!;
  const content = await fileToBase64(file);
  const res = await uploadToGitLab({
    ext,
    file: content,
    repo: RepoNames.sync,
    path: 'scripts',
    message: 'Update script v2'
  });
  if (res) {
    console.log('GitLab file committed:', res.data.file_path);
  }
}

Key Parameters

  • ext: file extension without dot
  • file: base64-encoded content
  • filename?: override auto-generated <UUID>.<ext>
  • sha?: existing file SHA/commit for updates
  • message?: commit message
  • repo: enum name of your sync repository
  • path?: folder path in repo; spaces convert to underscores

Error Handling

  • Displays a toast on network/API errors
  • Returns null (GitHub/Gitee) or false (GitLab) for recoverable failures
  • Throws unexpected errors for catch-all logic (see the sketch below)
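
A sketch of consuming those conventions, using the GitHub wrapper shown above (the null check and rethrow mirror the described behavior):

import { uploadFile as uploadToGitHub } from '@/lib/github';
import { RepoNames } from '@/lib/github.types';

async function tryUpload(ext: string, base64: string): Promise<string | null> {
  try {
    const result = await uploadToGitHub({ ext, file: base64, repo: RepoNames.sync });
    // Recoverable failure: the wrapper already surfaced a toast.
    if (!result) return null;
    return result.data.content.html_url;
  } catch (err) {
    // Unexpected errors propagate for catch-all logic.
    console.error('Unexpected upload error:', err);
    throw err;
  }
}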

Keyboard Shortcut Configuration

Define, register and emit application-wide keyboard shortcuts using the ShortcutSettings, ShortcutDefault and EmitterShortcutEvents enums.

Config Enums

// src/config/shortcut.ts
export enum ShortcutSettings {
  screenshot = "shotcut-screenshot",
  text       = "shotcut-text",
  pin        = "window-pin",
  link       = "shotcut-link",
}
export enum ShortcutDefault {
  screenshot = "Control+Shift+S",
  text       = "Control+Shift+T",
  pin        = "Control+Shift+P",
  link       = "Control+Shift+L",
}

// src/config/emitters.ts
export enum EmitterShortcutEvents {
  screenshot = "screenshot-shortcut-register",
  text       = "text-shortcut-register",
  pin        = "window-pin-register",
  link       = "link-shortcut-register",
}

Main Process: Registering Shortcuts

  1. Read stored accelerator or fallback to default
  2. Call Electron’s globalShortcut.register
  3. Emit IPC event on trigger
import { app, BrowserWindow, globalShortcut } from "electron";
import store from "./store";
import { ShortcutSettings, ShortcutDefault } from "./config/shortcut";
import { EmitterShortcutEvents } from "./config/emitters";

let mainWindow: BrowserWindow;

function registerAllShortcuts() {
  (Object.keys(ShortcutDefault) as Array<keyof typeof ShortcutDefault>)
    .forEach(action => {
      const settingKey = ShortcutSettings[action];
      const accelerator = store.get(settingKey, ShortcutDefault[action]);
      globalShortcut.register(accelerator, () => {
        mainWindow.webContents.send(EmitterShortcutEvents[action]);
      });
    });
}

app.on("ready", () => {
  mainWindow = new BrowserWindow({ /* ... */ });
  registerAllShortcuts();
});

Renderer Process: Handling Shortcut Events

import { ipcRenderer } from "electron";
import { EmitterShortcutEvents } from "../config/emitters";

ipcRenderer.on(EmitterShortcutEvents.screenshot, () => {
  startScreenshotMode();
});

ipcRenderer.on(EmitterShortcutEvents.text, () => {
  openTextCapture();
});

Updating Shortcuts at Runtime

When a user changes a shortcut:

function updateShortcut(action: keyof typeof ShortcutDefault, newAccel: string) {
  const old = store.get(ShortcutSettings[action], ShortcutDefault[action]);
  globalShortcut.unregister(old);

  store.set(ShortcutSettings[action], newAccel);
  globalShortcut.register(newAccel, () => {
    mainWindow.webContents.send(EmitterShortcutEvents[action]);
  });
}

Wrap registration/unregistration in try/catch to handle invalid accelerators or conflicts.
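
For example, a defensive registration helper (a sketch; Electron's globalShortcut.register returns false when the accelerator is already taken and throws on malformed accelerator strings):

import { globalShortcut } from "electron";

function safeRegister(accelerator: string, handler: () => void): boolean {
  try {
    const ok = globalShortcut.register(accelerator, handler);
    if (!ok) console.warn(`Shortcut ${accelerator} is already in use`);
    return ok;
  } catch (err) {
    // Invalid accelerators throw synchronously.
    console.error(`Failed to register ${accelerator}:`, err);
    return false;
  }
}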

Configuration Guide

This guide maps each configurable screen in the app to its underlying storage key (in store.json) or environment variable and shows how to read and update settings via src/stores/setting.ts.


Initialization

Load persisted settings on app start.
Storage file: store.json, managed via @tauri-apps/plugin-store

import { useEffect } from 'react'
import useSettingStore from '@/stores/setting'

export function App() {
  useEffect(() => {
    // Loads all keys from store.json into Zustand and applies env-var fallbacks
    useSettingStore.getState().initSettingData()
  }, [])

  return <YourAppUI />
}

1. General Settings Screen

Manage theme and UI language.

| UI Field | Store Key | Env Var | Setter            |
|----------|-----------|---------|-------------------|
| Theme    | theme     | —       | setTheme(value)   |
| Language | language  | —       | setLanguage(code) |

import useSettingStore from '@/stores/setting'

function GeneralSettings() {
  const theme    = useSettingStore(s => s.theme)
  const setTheme = useSettingStore(s => s.setTheme)

  const lang      = useSettingStore(s => s.language)
  const setLang   = useSettingStore(s => s.setLanguage)

  return (
    <>
      <label>Theme
        <select value={theme} onChange={e => setTheme(e.target.value)}>
          <option value="light">Light</option>
          <option value="dark">Dark</option>
        </select>
      </label>

      <label>Language
        <select value={lang} onChange={e => setLang(e.target.value)}>
          <option value="en">English</option>
          <option value="zh">中文</option>
        </select>
      </label>
    </>
  )
}

2. AI Models Screen

Configure multiple AI services and select primary models.

| UI Section              | Store Key             | Description                                         |
|-------------------------|-----------------------|-----------------------------------------------------|
| Model list              | aiModelList           | Array of { key, model, modelType, baseURL, apiKey } |
| Primary chat model      | chatPrimaryModel      | Key of selected chat model                          |
| Primary embedding model | embeddingPrimaryModel | Key of selected embedding model                     |

import useSettingStore, { AiConfig } from '@/stores/setting'

async function addNewModel() {
  const newModel: AiConfig = {
    key: 'my-azure-openai',
    model: 'gpt-4',
    modelType: 'chat',
    baseURL: 'https://my-azure-endpoint.openai.azure.com',
    apiKey: 'AZURE_KEY'
  }
  await useSettingStore.getState().addAiModel(newModel)
  // Optionally set as primary:
  useSettingStore.getState().setChatPrimaryModel(newModel.key)
}

3. Backup Provider Selection

Choose one of the supported providers.

| UI Field        | Store Key      | Setter                      |
|-----------------|----------------|-----------------------------|
| Backup provider | backupProvider | setBackupProvider(provider) |

Supported values: 'webdav', 'github', 'gitee', 'gitlab'

const provider    = useSettingStore(s => s.backupProvider)
const setProvider = useSettingStore(s => s.setBackupProvider)

<select value={provider} onChange={e => setProvider(e.target.value)}>
  <option value="webdav">WebDAV</option>
  <option value="github">GitHub</option>
  <option value="gitee">Gitee</option>
  <option value="gitlab">GitLab</option>
</select>

WebDAV settings use a separate useWebDAVStore (see WebDAV section).


4. GitHub Integration Screen

| UI Field     | Store Key         | Env Var             | Setter                      |
|--------------|-------------------|---------------------|-----------------------------|
| Access Token | githubAccessToken | GITHUB_ACCESS_TOKEN | setGithubAccessToken(token) |
| Repository   | githubRepo        | GITHUB_REPO         | setGithubRepo(owner/repo)   |
| Branch       | githubBranch      | GITHUB_BRANCH       | setGithubBranch(name)       |
| Path prefix  | githubPath        | GITHUB_PATH         | setGithubPath(path)         |

import useSettingStore from '@/stores/setting'

function GitHubSettings() {
  const token = useSettingStore(s => s.githubAccessToken)
  const setToken = useSettingStore(s => s.setGithubAccessToken)

  return (
    <>
      <input
        type="password"
        value={token}
        onChange={e => setToken(e.target.value)}
        placeholder="GitHub Personal Access Token"
      />
      {/* Repo, branch, path inputs analogous */}
    </>
  )
}

5. Gitee Integration Screen

| UI Field     | Store Key        | Env Var            | Setter                     |
|--------------|------------------|--------------------|----------------------------|
| Access Token | giteeAccessToken | GITEE_ACCESS_TOKEN | setGiteeAccessToken(token) |
| Repository   | giteeRepo        | GITEE_REPO         | setGiteeRepo(owner/repo)   |
| Branch       | giteeBranch      | GITEE_BRANCH       | setGiteeBranch(name)       |
| Path prefix  | giteePath        | GITEE_PATH         | setGiteePath(path)         |

Implementation mirrors GitHub above.


6. GitLab Integration Screen

| UI Field     | Store Key         | Env Var             | Setter                      |
|--------------|-------------------|---------------------|-----------------------------|
| Access Token | gitlabAccessToken | GITLAB_ACCESS_TOKEN | setGitlabAccessToken(token) |
| Repository   | gitlabRepo        | GITLAB_REPO         | setGitlabRepo(owner/repo)   |
| Branch       | gitlabBranch      | GITLAB_BRANCH       | setGitlabBranch(name)       |
| Path prefix  | gitlabPath        | GITLAB_PATH         | setGitlabPath(path)         |

Implementation mirrors GitHub above.


Practical Tips

  • Always call initSettingData() once on mount.
  • Reading a value: const val = useSettingStore(s => s.key).
  • Updating: await useSettingStore.getState().setKey(value).
  • All setters persist to disk and notify subscribers.
  • Provide env-var defaults for CI or private deployments; store values override env.

Backup & Sync

Manage cloud‐ and inter‐device data with built-in GitHub, Gitee, and WebDAV integrations in the codexu/note-gen app.


GitHub Sync API Wrapper

Provide a set of wrapper functions to perform create, read, update, and delete operations on GitHub repositories and files using the GitHub REST API.

Key Features

  • Automatically reads accessToken, githubUsername and optional proxy from @tauri-apps/plugin-store
  • Uses @tauri-apps/plugin-http’s fetch to standardize headers, status checks, and error handling
  • Core methods:
    • Authentication & user info (getUserInfo)
    • Repository check/create (checkSyncRepoState, createSyncRepo)
    • File upload (uploadFile)
    • List files (getFiles)
    • Delete file (deleteFile)
    • File commit history (getFileCommits)
    • Latest release retrieval (getRelease)

1. Authenticate and Get User Info

import { getUserInfo } from '@/lib/github';

async function initGitHub(token?: string) {
  const resp = await getUserInfo(token);
  if (!resp) throw new Error('GitHub authentication failed');
  console.log('Logged in as', resp.data.login);
}

2. Check or Create Sync Repository

import { checkSyncRepoState, createSyncRepo } from '@/lib/github';

async function ensureRepo(name: string) {
  const repo = await checkSyncRepoState(name);
  return repo || await createSyncRepo(name, true); // true = private
}

3. Upload a File

Converts a File object to Base64, then uploads:

import { fileToBase64, uploadFile } from '@/lib/github';
import { RepoNames }          from '@/lib/github.types';

async function uploadImage(file: File) {
  const base64 = await fileToBase64(file); // strips data URI prefix
  const result = await uploadFile({
    ext: file.name.split('.').pop() || 'png',
    file: base64,
    filename: 'my-image',                // optional, defaults to UUID
    message: 'Add new image',            // commit message
    repo: RepoNames.sync,                // must be an enum value
    path: 'uploads'                      // optional subdirectory
  });
  console.log('Upload URL:', result?.data.content.html_url);
}

4. List Files in a Directory

import { getFiles } from '@/lib/github';
import { RepoNames } from '@/lib/github.types';

async function listUploads() {
  const files = await getFiles({ repo: RepoNames.sync, path: 'uploads' });
  if (!files) return console.log('No files found');
  files.forEach(f => console.log(f.name, f.download_url));
}

5. Delete a File

import { deleteFile } from '@/lib/github';
import { RepoNames }   from '@/lib/github.types';

async function removeFile(path: string, sha: string) {
  const success = await deleteFile({ repo: RepoNames.sync, path, sha });
  console.log(success ? 'Deletion succeeded' : 'Deletion failed');
}

6. Get File Commit History

import { getFileCommits } from '@/lib/github';
import { RepoNames }       from '@/lib/github.types';

async function showHistory(path: string) {
  const commits = await getFileCommits({ repo: RepoNames.sync, path });
  commits?.forEach(c => console.log(c.sha, c.commit.message));
}

7. Fetch Latest Release

import { getRelease } from '@/lib/github';

async function checkLatestRelease() {
  const release = await getRelease();
  if (release) console.log('Latest version:', release.tag_name);
}

Practical Tips

  • Call getUserInfo on app startup to initialize githubUsername.
  • Use proxy in settings to route all API calls.
  • Always checkSyncRepoState before createSyncRepo.
  • Spaces in filenames/paths convert to underscores to avoid conflicts.

Gitee File Upload (uploadFile)

Create or update a file in a Gitee repository. Handles token loading, proxy settings, method selection (POST vs PUT), and error reporting via toast.

Function Signature

export async function uploadFile(options: {
  ext: string;                 // without dot
  file: string;                // Base64-encoded content
  filename?: string;           // defaults to UUID.ext
  sha?: string;                // existing file SHA for update
  message?: string;            // commit message
  repo: RepoNames.sync;        // target enum value
  path?: string;               // optional subdirectory
}): Promise<GiteeResponse<any> | null | false>

How It Works

  • Reads giteeAccessToken, giteeUsername, and optional proxy from @tauri-apps/plugin-store.
  • Generates a UUID filename when none is provided.
  • Replaces spaces with underscores in filenames/paths.
  • Chooses POST (create) or PUT (update when sha present).
  • Sends { access_token, content, message, branch: 'master', sha }.
  • Returns parsed data on success, null on client errors (400), or false on failure (with toast).

Prerequisites

  • In store.json set:
    • giteeAccessToken: your Gitee token
    • giteeUsername: your Gitee login
    • proxy (optional): HTTP(S) proxy URL
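
A sketch of seeding those keys programmatically (the store file name and the load() call are assumptions based on @tauri-apps/plugin-store v2; adjust to the plugin version in use):

import { load } from '@tauri-apps/plugin-store';

async function configureGitee(token: string, username: string, proxy?: string) {
  const store = await load('store.json');
  await store.set('giteeAccessToken', token);
  await store.set('giteeUsername', username);
  if (proxy) await store.set('proxy', proxy);
  await store.save(); // persist to disk
}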

Example

import { fileToBase64 } from '@/lib/gitee';
import { uploadFile }    from '@/lib/gitee';
import { RepoNames }      from '@/lib/github.types';

async function handleFileInput(e: Event) {
  const input = e.target as HTMLInputElement;
  if (!input.files?.length) return;
  const file = input.files[0];
  const base64 = await fileToBase64(file);

  const ext = file.name.split('.').pop() || 'bin';
  const result = await uploadFile({
    ext,
    file: base64,
    filename: file.name,
    message: `Add ${file.name}`,
    repo: RepoNames.sync,
    path: 'images'
  });

  if (result?.data.content.download_url) {
    console.log('Uploaded to:', result.data.content.download_url);
  } else {
    console.warn('Upload failed or returned null');
  }
}

Practical Tips

  • To update: fetch SHA via getFiles, then pass sha to uploadFile (see the sketch after these tips).
  • Default branch is master; change in source if using main.
  • Network or non-404 errors trigger toast and return false.
  • Extend filename sanitization regex as needed.
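
A sketch of that update flow (assuming gitee.ts exposes a getFiles helper mirroring the GitHub one documented above; treat that import as hypothetical):

import { getFiles, uploadFile } from '@/lib/gitee';
import { RepoNames } from '@/lib/github.types';

async function updateDocument(name: string, base64: string) {
  // Look up the existing file to obtain its SHA.
  const files = await getFiles({ repo: RepoNames.sync, path: 'docs' });
  const existing = files?.find(f => f.name === name);

  return uploadFile({
    ext: name.split('.').pop() || 'bin',
    file: base64,
    filename: name,
    sha: existing?.sha,   // presence of sha switches POST to PUT
    message: `Update ${name}`,
    repo: RepoNames.sync,
    path: 'docs'
  });
}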

WebDAV Backup Command (webdav_backup)

A Tauri command that uploads all Markdown files from the local workspace to a WebDAV directory, auto-creating missing remote folders. Returns a “success_count/total_count” string or an error.

Signature

#[tauri::command]
pub async fn webdav_backup(
  url: String,
  username: String,
  password: String,
  path: String,
  app: AppHandle,
) -> Result<String, String>

Workflow

  1. Client Setup:
    • create_client(url, username, password) → reqwest_dav::Client with Basic auth.
  2. Ensure Remote Base Directory:
    • client.list(path, Depth::Infinity) → if fails, client.mkcol(path).
  3. Discover Local Markdown:
    • get_workspace_info(app) reads workspacePath from store.json (default: "article").
    • get_markdown_files() returns (relative_path, content) for each .md file.
  4. Upload Loop:
    • For each file:
      • Build remote_path = format!("{}/{}", path.trim_end_matches('/'), relative_path).
      • Ensure parent directory exists (list → mkcol).
      • client.put(remote_path, content_bytes).
      • Track successes, abort on fatal error.
  5. Result:
    • Ok("{}/{}", success_count, total_files) or Err(...).

Rust Usage Example

let result = webdav_backup(
  "https://example.com/webdav".into(),
  "user".into(),
  "pass".into(),
  "backups/notes".into(),
  app_handle.clone()
).await?;
println!("Uploaded: {}", result); // e.g. "12/12"

Invoke from Frontend

import { invoke } from '@tauri-apps/api/tauri';

const count = await invoke<string>('webdav_backup', {
  url, username, password, path
});
// count === "uploaded/total"

Practical Tips

  • Do not include leading slash in path; command strips it.
  • Missing remote directories auto-created.
  • Large workspaces may require a UI spinner (backupState).
  • Errors bubble up with context (e.g. failed mkcol, upload error).

AI & Processing Engine

Centralize AI/ML features—chat completion, embeddings, reranking, keyword extraction, OCR, screenshot capture and local RAG search—so your app can call a single module for all AI-powered workflows.

Chat Completion (src/lib/ai.ts)

Provide conversational AI via OpenAI-compatible chat models with error handling and optional streaming.

Signature

async function chatCompletion(
  messages: ChatMessage[], 
  options?: { modelKey?: string; stream?: boolean }
): Promise<string | AsyncIterable<string>>;

Usage

import { chatCompletion } from '@/lib/ai';

async function askBot() {
  const messages = [
    { role: 'system', content: 'You are an assistant.' },
    { role: 'user', content: 'Summarize the benefits of TypeScript.' }
  ];
  const reply = await chatCompletion(messages);
  console.log('Bot:', reply);
}

// Streaming example (self-contained; reuses a messages array as above)
(async () => {
  const messages = [{ role: 'user', content: 'Summarize the benefits of TypeScript.' }];
  const stream = await chatCompletion(messages, { stream: true });
  for await (const chunk of stream as AsyncIterable<string>) {
    process.stdout.write(chunk);
  }
})();

Embeddings (src/lib/ai.ts)

Generate semantic vectors for text, used in similarity search or RAG pipelines.

Signature

async function fetchEmbedding(text: string): Promise<number[] | null>;

Usage

import { fetchEmbedding } from '@/lib/ai';
import { initVectorDb, upsertVectorDocument } from '@/db/vector';

async function indexFile(filename: string, content: string) {
  await initVectorDb();
  const embedding = await fetchEmbedding(content);
  if (!embedding) throw new Error('Embedding failed');
  await upsertVectorDocument({
    filename,
    chunk_id: 0,
    content,
    embedding: JSON.stringify(embedding),
    updated_at: Date.now(),
  });
}

Document Reranking (src/lib/ai.ts)

Reorder candidate passages by relevance to a query using a rerank model.

Signature

async function rerankDocuments(
  query: string, 
  candidates: string[]
): Promise<{ score: number; text: string }[]>;

Usage

import { rerankDocuments } from '@/lib/ai';

async function refineResults() {
  const query = 'What is vector similarity?';
  const docs = ['Vectors measure...', 'Similarity is...','Math principles...'];
  const ranked = await rerankDocuments(query, docs);
  console.log(ranked[0]); // highest relevance
}

Keyword Extraction (src-tauri/src/keywords.rs)

Extract top-weighted keywords from text using Jieba + TextRank via a Tauri command.

Rust Command

#[tauri::command]
pub fn keywords(text: String, top_n: usize) -> Vec<Keyword> { … }

JS Invocation

import { invoke } from '@tauri-apps/api/tauri';

interface Keyword { word: string; weight: number; }

async function extractKeywords(text: string) {
  const list = await invoke<Keyword[]>('keywords', { text, top_n: 5 });
  return list;
}

extractKeywords('Deep learning enables…').then(console.log);

OCR Utility (src/lib/ocr.ts)

Perform OCR on images saved in AppData using Tesseract.js with configurable languages and a 30s timeout.

import ocr from '@/lib/ocr';

async function readImageText(path: string) {
  const text = await ocr(path);
  if (text.startsWith('OCR')) console.error('Error:', text);
  else console.log('Extracted:', text);
}

readImageText('screenshots/latest.png');

Screenshot Capture (src-tauri/src/screenshot.rs)

Capture all visible, non-minimized windows; save PNGs to AppData/temp_screenshot; return metadata.

Rust Struct

#[derive(Serialize, Deserialize)]
pub struct ScreenshotImage {
  pub name: String,
  pub path: String,
  pub width: u32,
  pub height: u32,
  pub x: i32,
  pub y: i32,
  pub z: i32,
}

JS Invocation

import { invoke } from '@tauri-apps/api/tauri';

// TS mirror of the Rust ScreenshotImage struct above
interface ScreenshotImage {
  name: string; path: string;
  width: number; height: number;
  x: number; y: number; z: number;
}

async function captureAllWindows() {
  const images = await invoke<ScreenshotImage[]>('screenshot');
  images.forEach(img => {
    const el = document.createElement('img');
    el.src = `file://${img.path}`;
    el.width = img.width / 4;
    document.body.appendChild(el);
  });
}

captureAllWindows();

Local RAG Search (src/lib/rag.ts)

Ingest Markdown into a vector store and retrieve context snippets for queries.

Core Functions

• initVectorDb(): setup SQLite vector table
• processMarkdownFile(path: string): split, embed & store chunks
• getContextForQuery(query: string): return top passages with context
• handleFileUpdate(path: string): refresh file in vector store

Example: Ingest & Query

import {
  initVectorDb,
  processMarkdownFile,
  getContextForQuery
} from '@/lib/rag';

async function buildAndSearch() {
  await initVectorDb();
  await processMarkdownFile('/notes/guide.md');
  const context = await getContextForQuery('How to configure Tauri?');
  console.log('Context:', context);
}

buildAndSearch();

## Backend API Reference

### Fuzzy Search Commands

Purpose  
Provide a high-performance, parallelized fuzzy search over arbitrary `SearchItem` collections in your Tauri backend. Exposes two invokable commands:
- `fuzzy_search` – standard execution  
- `fuzzy_search_parallel` – alias that currently delegates to `fuzzy_search`

Core Types  
• **SearchItem**  
  - Represents an indexed record with optional fields: `id`, `title`, `desc`, `article`, `path`, `url`, `search_type`  
  - On client request can carry back `score?: number` and `matches?: MatchInfo[]`  
• **MatchInfo**  
  - `{ key: string; indices: [number,number][]; value: string }`  
  - Describes which characters in which field matched  
• **FuzzySearchResult**  
  - `{ item: SearchItem; refindex: number; score: number; matches: MatchInfo[] }`  
  - `refindex` is the original index in the `items` array

Command Signature  
```rust
#[tauri::command]
pub fn fuzzy_search(
  items: Vec<SearchItem>,
  query: String,
  keys: Vec<String>,
  threshold: f64,
  include_score: bool,
  include_matches: bool,
) -> Vec<FuzzySearchResult> { … }
```

Parameters

  • items: array of records to search
  • query: search pattern (empty yields [])
  • keys: subset of ["desc","title","article","path","search_type"] to test
  • threshold: normalized match-score cutoff (0.0–1.0)
  • include_score: if false, result.score and item.score are zeroed/cleared
  • include_matches: if false, match spans are dropped

Behavior

  1. Builds a SkimMatcherV2 per invocation.
  2. For each item and each requested key:
    • runs fuzzy_indices(text, query)
    • computes normalized_score = abs(score) / query.len()
    • skips if below threshold
    • collects best raw score and all matching spans
  3. Filters out non-matching items, sorts descending by score.
  4. Clears scores/matches based on flags.

TypeScript Invocation Example

import { invoke } from '@tauri-apps/api/tauri';

interface MatchInfo { key: string; indices: [number,number][]; value: string; }
interface SearchItem { id?: string; title?: string; desc?: string; /*…*/ }
interface FuzzySearchResult {
  item: SearchItem;
  refindex: number;
  score: number;
  matches: MatchInfo[];
}

async function runSearch() {
  const items: SearchItem[] = [ /* your indexed data */ ];
  const results = await invoke<FuzzySearchResult[]>('fuzzy_search', {
    items,
    query: 'rust module',
    keys: ['title','desc'],
    threshold: 0.5,
    include_score: true,
    include_matches: true
  });

  console.log(results);
}

Practical Tips

  • Tune threshold to control fuzziness: lower allows looser matches.
  • Select only needed fields in keys to reduce work.
  • Use include_matches=false when you only care about top-N scoring.
  • fuzzy_search_parallel currently aliases fuzzy_search, swap in a parallel version if needed.

Custom Tauri HTTP Fetch Wrapper

Purpose
Provide a drop-in replacement for the browser fetch API that routes requests through Tauri’s plugin-http, supporting native timeouts, proxies, max redirections, and proper header encoding/decoding.

API Signature

async function fetch(
  input: string,
  init?: RequestInit & ClientOptions
): Promise<Response>

Key Features

  • Uses invoke('plugin:http|fetch') under the hood
  • Supports maxRedirections, connectTimeout, proxy, danger flags
  • Honors and propagates AbortSignal
  • Encodes request body as Uint8Array
  • URI-encodes response headers to ensure validity
  • Returns a standard Response with overridden .url and .headers

Usage Example

import { fetch } from 'src/lib/encode-fetch';

async function postData() {
  const controller = new AbortController();
  const timeout = setTimeout(() => controller.abort(), 3000);

  try {
    const res = await fetch('https://api.example.com/data', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ foo: 'bar' }),
      connectTimeout: 5000,               // TCP connect timeout in ms
      maxRedirections: 5,                 // max HTTP redirects
      proxy: 'http://localhost:8888',     // optional proxy
      danger: { acceptInvalidCerts: true }, // skip TLS validation
      signal: controller.signal
    });

    clearTimeout(timeout);

    if (!res.ok) {
      console.error(`HTTP ${res.status}: ${res.statusText}`);
      return;
    }

    const data = await res.json();
    console.log('Received:', data);
  } catch (e: any) {
    if (e.message === 'Request canceled') {
      console.warn('Fetch aborted or timed out');
    } else {
      console.error('Fetch error:', e);
    }
  }
}

Abort Handling

  • Checks signal.aborted before dispatch and after header mapping
  • On abort, calls invoke('plugin:http|fetch_cancel', { rid })
  • Throws Error('Request canceled') consistent with standard fetch

Custom Options

Option Type Description
connectTimeout number ms to wait for TCP connect
maxRedirections number how many 3xx redirects to follow
proxy string proxy URL (e.g., http://proxy.local:3128)
danger object plugin-specific flags (e.g., TLS skip opts)

Under the Hood

  1. Strip custom fields from init to avoid native Request confusion
  2. Build a Request to extract headers & body as ArrayBuffer
  3. Map headers to [name,value][] and encode with encodeURI
  4. Invoke Tauri plugin methods:
    • plugin:http|fetch to start (returns request ID)
    • plugin:http|fetch_send to send headers & receive status
    • plugin:http|fetch_read_body to retrieve body
  5. Construct a standard Response, override .url and .headers
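
A compressed sketch of that sequence, happy path only (the argument shapes passed to the plugin commands are assumptions; the real wrapper also wires up abort handling and header decoding):

import { invoke } from '@tauri-apps/api/tauri';

async function tauriFetchSketch(url: string, init: RequestInit = {}): Promise<Response> {
  // 1–3: build a Request to extract method, headers and body bytes.
  const req = new Request(url, init);
  const body = new Uint8Array(await req.arrayBuffer());
  const headers = [...req.headers.entries()].map(([k, v]) => [k, encodeURI(v)]);

  // 4: drive the plugin: start the request, send it, then read the body.
  const rid = await invoke<number>('plugin:http|fetch', {
    clientConfig: { method: req.method, url, headers, data: Array.from(body) },
  });
  const meta = await invoke<{ status: number; statusText: string; headers: [string, string][] }>(
    'plugin:http|fetch_send', { rid },
  );
  const bytes = await invoke<number[]>('plugin:http|fetch_read_body', { rid });

  // 5: construct a standard Response from the pieces.
  return new Response(new Uint8Array(bytes), {
    status: meta.status,
    statusText: meta.statusText,
    headers: meta.headers,
  });
}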

Practical Tips

  • Always provide a signal if you need cancellable requests
  • Clean up timeouts/AbortController to avoid dangling plugin requests
  • Use danger sparingly – only in trusted dev environments
  • Check res.ok/res.status before consuming body
  • Proxy support is useful for debugging via local interceptors (e.g., Charles, Fiddler)

Quality & Delivery

Maintain code health through consistent linting standards and automated releases.

ESLint Configuration (.eslintrc.json)

Purpose: Enforce Next.js and TypeScript linting defaults while allowing scoped overrides.

This project’s .eslintrc.json:

{
  "extends": ["next/core-web-vitals", "next/typescript"],
  "rules": {
    "react-hooks/exhaustive-deps": "off",
    "@typescript-eslint/no-explicit-any": "off"
  },
  "ignorePatterns": ["docs/**"]
}

Key sections:

  1. Extends
    next/core-web-vitals: React rules tuned for Core Web Vitals
    next/typescript: TypeScript support for Next.js

  2. Rules Overrides
    react-hooks/exhaustive-deps: off
    – Skip exhaustive-deps checks when you manage dependencies manually.
    – Enable per-file with:
    /* eslint react-hooks/exhaustive-deps: "warn" */
    @typescript-eslint/no-explicit-any: off
    – Allow any in scripts or third-party integrations.
    – Scope via overrides:
    jsonc { "overrides": [ { "files": ["scripts/**/*.ts"], "rules": { "@typescript-eslint/no-explicit-any": "off" } } ] }

  3. ignorePatterns
    docs/**: exclude documentation from linting
    • Extend patterns (e.g., build/, public/) as needed

Practical Usage

npm run lint
npx eslint src/components/Button.tsx --fix

To add a new plugin rule:

  1. npm install -D eslint-plugin-import
  2. Update .eslintrc.json:
    {
      "plugins": ["import"],
      "rules": {
        "import/order": ["error", { "groups": ["builtin","external","internal"] }]
      }
    }
    

Extracting and Propagating Tauri App Version

Purpose: Capture the semantic version from tauri-apps/tauri-action in publish-tauri and expose it for downstream jobs.

How It Works

  1. Declare appVersion as a job output.
  2. On Ubuntu, read steps.tauri-action.outputs.appVersion.
  3. Write to $GITHUB_OUTPUT for use by other jobs.

Key Snippet:

jobs:
  publish-tauri:
    strategy:
      matrix:
        include:
          - platform: 'ubuntu-24.04'
            args: ''
          - platform: 'windows-latest'
            args: '--msix'
          - platform: 'macos-latest'
            args: ''
    runs-on: ${{ matrix.platform }}
    outputs:
      appVersion: ${{ steps.set_output.outputs.appVersion }}
    steps:
      - uses: actions/checkout@v4
      - uses: tauri-apps/tauri-action@v0
        id: tauri-action
        with:
          # Tauri config here
      - name: Extract version
        if: matrix.platform == 'ubuntu-24.04'
        run: echo "appVersion=${{ steps.tauri-action.outputs.appVersion }}" >> $GITHUB_OUTPUT
        id: set_output

Usage in Downstream Job:

jobs:
  upgradeLink-upload:
    needs: publish-tauri
    runs-on: ubuntu-latest
    steps:
      - uses: toolsetlink/upgradelink-action@v5
        with:
          source-url: https://github.com/codexu/note-gen/releases/download/note-gen-v${{ needs.publish-tauri.outputs.appVersion }}/latest.json
          access-key: ${{ secrets.UPGRADE_LINK_ACCESS_KEY }}
          tauri-key: ${{ secrets.UPGRADE_LINK_TAURI_KEY }}
          github-token: ${{ secrets.GITHUB_TOKEN }}

Practical Tips
• Restrict write-back to one OS to avoid race conditions
• Match the id of the tauri-action step when reading its outputs
• Inject the version via ${{ needs.publish-tauri.outputs.appVersion }}

Contributing

Thank you for improving NoteGen! This guide covers how to file issues, propose features, open pull requests, our branch strategy, and coding style standards.

Filing Issues

Use GitHub Issues to report bugs or request help.

  1. Check existing issues for similar reports.
  2. Click New Issue and choose the Bug Report template.
  3. Provide:
    • A descriptive title.
    • Steps to reproduce.
    • Expected vs. actual behavior.
    • Environment details (OS, Node version).
  4. Label issues appropriately (bug, question).

Proposing Features

Create a feature request using the Feature Request template:

  1. Click New Issue > Feature Request.
  2. Describe:
    • The problem you want to solve.
    • Proposed API or UX changes.
    • Any relevant examples or mockups.
  3. Use the enhancement label.

Discussion happens right in the issue thread.

Opening Pull Requests

  1. Fork the repository and clone your fork:
    git clone git@github.com:your-username/note-gen.git
    cd note-gen
    
  2. Add upstream remote:
    git remote add upstream git@github.com:codexu/note-gen.git
    
  3. Sync and create a feature branch:
    git fetch upstream
    git checkout develop
    git pull upstream develop
    git checkout -b feature/your-feature-name
    
  4. Make your changes; run tests and linters:
    npm install
    npm test
    npm run lint
    
  5. Commit with a clear, conventional message:
    feat(notes): add support for markdown export
    
  6. Push and open a PR against the develop branch:
    git push -u origin feature/your-feature-name
    
  7. Fill out the PR template:
    • Summary of changes.
    • Related issues (e.g., “Closes #123”).
    • Screenshots or logs if relevant.
  8. Request reviews from teammates; address feedback promptly.

Branch Strategy

We use a simplified Git Flow:

  • main: always production-ready.
  • develop: integration branch for features.
  • feature/: new features and enhancements.
  • fix/: bug fixes.
  • hotfix/: urgent production fixes off main.

Release process:

  1. Merge completed features into develop.
  2. Create a release branch (release/vX.Y.Z), fix minor bugs.
  3. Merge release into main and tag.
  4. Merge main back into develop.

Branch names example:
feature/user-auth, fix/login-crash, hotfix/critical-bug

Coding Style

NoteGen uses ESLint and Prettier to enforce consistent code.

  • Indentation: 2 spaces
  • Quotes: single (')
  • Semicolons: required
  • Line length: 100 characters

Run checks locally:

npm run lint        # ESLint
npm run format      # Prettier
npm test            # Jest tests

ESLint and Prettier configs live in the repo root (.eslintrc.json, .prettierrc). Ensure all checks pass before submitting a PR.

Thank you for contributing to NoteGen!

License

This project is released under the MIT License. It grants broad permissions and disclaims all warranties. You must include the full license text in any distributions or derivative works.

Summary

  • Grants free permission to use, copy, modify, merge, publish, distribute, sublicense and/or sell copies.
  • Requires inclusion of the original copyright and license notice.
  • Disclaims warranties and limits liability.

Obligations

  • Bundle the MIT License text in your source or documentation.
  • Retain the following in every copy or substantial portion:
    • Copyright notice
    • Permission notice

Including the License

In package.json

Specify the license identifier so tooling recognizes it:

{
  "name": "note-gen",
  "version": "1.0.0",
  "license": "MIT",
  "files": [
    "dist/",
    "LICENSE"
  ]
}

In source files

Add a brief header to each file:

/**
 * note-gen v1.0.0
 * Copyright (c) 2025 CodexU
 * Released under the MIT License
 */

Redistributing

Ensure your published bundle or Docker image includes the LICENSE file at its root:

  • npm: The files field in package.json
  • Docker: COPY LICENSE /app/
  • TAR/GZIP: Preserve the LICENSE alongside your release artifacts.
\n \n \n \n {children}\n \n \n \n \n );\n}\n```\n\nHighlights \n- `suppressHydrationWarning` for SSR/CSR consistency. \n- Injects a small script to patch markdown-it rendering. \n- Wraps pages in `I18nProvider` for locale switching. \n- Renders a global `` for notifications.\n\n### 3. Data Layer (src/db/index.ts)\n\nManages a singleton SQLite connection, initializes separate databases for chats, marks, notes, tags and vector indexing.\n\n#### Database Initialization\n```ts\nimport { open } from \"sqlite\";\nimport sqlite3 from \"sqlite3\";\nimport { initChatsDb } from \"./chats\";\nimport { initMarksDb } from \"./marks\";\nimport { initNotesDb } from \"./notes\";\nimport { initTagsDb } from \"./tags\";\nimport { initVectorDb } from \"./vector\";\n\ntype DbName = \"chats\" | \"marks\" | \"notes\" | \"tags\" | \"vector\";\nconst dbMap: Partial> = {};\n\nexport async function initDatabases(dataDir: string) {\n dbMap.chats = await open({ filename: `${dataDir}/chats.db`, driver: sqlite3.Database });\n await initChatsDb(dbMap.chats!);\n dbMap.marks = await open({ filename: `${dataDir}/marks.db`, driver: sqlite3.Database });\n await initMarksDb(dbMap.marks!);\n dbMap.notes = await open({ filename: `${dataDir}/notes.db`, driver: sqlite3.Database });\n await initNotesDb(dbMap.notes!);\n dbMap.tags = await open({ filename: `${dataDir}/tags.db`, driver: sqlite3.Database });\n await initTagsDb(dbMap.tags!);\n dbMap.vector = await open({ filename: `${dataDir}/vector.db`, driver: sqlite3.Database });\n await initVectorDb(dbMap.vector!);\n}\n\nexport function getDb(name: DbName) {\n const db = dbMap[name];\n if (!db) throw new Error(`Database ${name} not initialized`);\n return db;\n}\n```\n\nUsage in commands:\n```rust\n#[tauri::command]\nasync fn chat_list() -> Vec {\n // calls into JS-bridge which then uses getDb(\"chats\") under the hood\n}\n```\n\n### Call Flow\n\n1. **Startup** \n - `main.rs` launches Tauri, invokes `init_databases(data_dir)`. \n - Each sub-DB creates tables and indices. \n2. **UI Rendering** \n - Next.js serves pages; `layout.tsx` sets up global context. \n3. **User Action** \n - UI calls `invoke(\"chat_list\")` or other commands via `@tauri-apps/api/tauri`. \n4. **Command Execution** \n - Rust handler uses `getDb(…)` to query SQLite, applies business logic, returns JSON. \n5. **UI Update** \n - Frontend receives results, updates components, shows a toast if needed. \n\nThis layered architecture ensures clear separation between UI, backend logic and data persistence, while enabling tight integration through Tauri’s command bridge.\n## Core Concepts\n\nCore abstractions that power note-gen:\n\n- Workspace path utilities: resolve file system paths in default or custom workspaces \n- RAG indexing: chunk markdown files, compute embeddings, and upsert into a vector store \n- Sync API: upload notes and assets to GitHub, Gitee or GitLab with built-in base64 conversion and error handling \n- Shortcut management: define, register and emit global keyboard shortcuts via Electron IPC \n\n### Workspace Path Utilities\n\nProvide a unified way to resolve file paths inside the user’s workspace—whether it’s the default `AppData/article` folder or a custom directory—so FS operations (read, write, stat, mkdir, etc.) work seamlessly.\n\n#### getWorkspacePath \nRetrieve the root workspace path and detect if it’s user-customized. 
\nSignature: \n```ts\nasync function getWorkspacePath(): Promise<{ path: string; isCustom: boolean }>;\n``` \n- If `store.json` contains a `workspacePath`, returns that absolute path with `isCustom: true`. \n- Otherwise returns `{ path: 'article', isCustom: false }` relative to `BaseDirectory.AppData`. \n\nExample: \n```ts\nconst { path: root, isCustom } = await getWorkspacePath();\n// root: \"/Users/alice/myNotes\" or \"article\"\n```\n\n#### getFilePathOptions \nBuild arguments for Tauri FS calls given a path relative to the workspace root. \nSignature: \n```ts\nasync function getFilePathOptions(relativePath: string)\n : Promise<{ path: string; baseDir?: BaseDirectory }>;\n``` \n- Custom workspace: `{ path: absolutePath }` \n- Default workspace: `{ path: `article/${relativePath}`, baseDir: BaseDirectory.AppData }` \n\nUse for reading or writing files: \n```ts\nimport { readTextFile, writeTextFile } from '@tauri-apps/plugin-fs';\n\nasync function loadDoc(rel: string) {\n const opts = await getFilePathOptions(rel);\n return opts.baseDir\n ? await readTextFile(opts.path, { baseDir: opts.baseDir })\n : await readTextFile(opts.path);\n}\n\nasync function saveDoc(rel: string, content: string) {\n const opts = await getFilePathOptions(rel);\n const args = opts.baseDir ? { baseDir: opts.baseDir } : {};\n await writeTextFile(opts.path, content, args);\n}\n```\n\n#### getGenericPathOptions \nLike `getFilePathOptions` but supports arbitrary AppData subfolders (e.g. images). \nSignature: \n```ts\nasync function getGenericPathOptions(\n path: string,\n prefix?: string\n): Promise<{ path: string; baseDir?: BaseDirectory }>;\n``` \n- Custom: always returns an absolute path under the custom root, prepending `prefix/` if provided. \n- Default: `{ path: `${prefix}/${path}`, baseDir: AppData }` unless `path` already starts with `prefix`. \n\nExample—saving an image: \n```ts\nimport { writeBinaryFile } from '@tauri-apps/plugin-fs';\n\nasync function saveImage(relPath: string, data: Uint8Array) {\n const opts = await getGenericPathOptions(relPath, 'image');\n const args = opts.baseDir ? { baseDir: opts.baseDir } : {};\n await writeBinaryFile(opts.path, data, args);\n}\n```\n\n#### toWorkspaceRelativePath \nConvert any absolute or default-prefixed path back to a path relative to the workspace root. \nSignature: \n```ts\nasync function toWorkspaceRelativePath(path: string): Promise;\n``` \n- Default mode: strips leading `article/` segments. \n- Custom mode: removes the absolute custom root prefix (and any leading slash). \n- Leaves already-relative paths untouched. \n\nExample—normalizing results from Tauri’s `readDir`: \n```ts\nimport { readDir } from '@tauri-apps/plugin-fs';\nimport { getWorkspacePath, toWorkspaceRelativePath } from '@/lib/workspace';\n\nasync function listMarkdownFiles() {\n const { path: root, isCustom } = await getWorkspacePath();\n const raw = isCustom\n ? await readDir(root)\n : await readDir('article', { baseDir: BaseDirectory.AppData });\n\n const relPaths = await Promise.all(\n raw\n .filter(e => e.isFile && e.name.endsWith('.md'))\n .map(async e => {\n const full = isCustom\n ? `${root}/${e.name}`\n : `article/${e.name}`;\n return toWorkspaceRelativePath(full);\n })\n );\n return relPaths; // e.g. 
['notes/todo.md', 'readme.md']\n}\n```\n\n---\n\n### RAG Indexing: Splitting and Indexing Markdown Files\n\nExplain how markdown files are chunked into manageable pieces and indexed in the vector database by computing embeddings for each chunk.\n\n#### chunkText \nSplit long text into overlapping chunks, using paragraph then sentence boundaries. \nSignature: \n```ts\nfunction chunkText(\n text: string,\n chunkSize: number = 1000,\n chunkOverlap: number = 200\n): string[];\n``` \n- If `text.length ≤ chunkSize`, returns `[text]`. \n- Splits on `\\n\\n` (paragraphs), accumulates until adding the next paragraph would exceed `chunkSize`. \n- When full, pushes the chunk and carries over up to `chunkOverlap` characters from its end. \n- If a single paragraph > `chunkSize`, splits further by sentence boundaries (`.`, `?`, `!`). \n\nExample: \n```ts\nconst longMd = await readTextFile('notes.md');\nconst chunks = chunkText(longMd, 800, 150);\nconsole.log(chunks.length); // e.g. 5\n```\n\n#### processMarkdownFile \nRead a markdown file, split into chunks, compute embeddings, and upsert each chunk into the vector store. \nSignature: \n```ts\nasync function processMarkdownFile(\n filePath: string,\n fileContent?: string\n): Promise;\n``` \nWorkflow: \n1. Determine workspace path and load file content. \n2. Read `ragChunkSize` and `ragChunkOverlap` from `store.json`. \n3. `const chunks = chunkText(content, chunkSize, chunkOverlap)` \n4. Derive `filename` from `filePath`. \n5. `await deleteVectorDocumentsByFilename(filename)` \n6. For each `chunk`:\n a. `const embedding = await fetchEmbedding(chunk)` \n b. `await upsertVectorDocument({ filename, chunk_id: i, content: chunk, embedding: JSON.stringify(embedding), updated_at: Date.now() })` \n7. Return `true` if successful, else log and return `false`. \n\nExample: \n```ts\nimport { processMarkdownFile } from '@/lib/rag';\n\nconst success = await processMarkdownFile('article/guide.md');\nif (success) console.log('Indexed guide.md successfully');\nelse console.error('Indexing failed for guide.md');\n```\n\n---\n\n### Sync: Uploading Files to Remote Repositories\n\nProvide a simple API to create or update a file in GitHub, Gitee, or GitLab. 
#### processMarkdownFile
Reads a markdown file, splits it into chunks, computes embeddings, and upserts each chunk into the vector store.
Signature:
```ts
async function processMarkdownFile(
  filePath: string,
  fileContent?: string
): Promise<boolean>;
```
Workflow:
1. Determine the workspace path and load the file content.
2. Read `ragChunkSize` and `ragChunkOverlap` from `store.json`.
3. `const chunks = chunkText(content, chunkSize, chunkOverlap)`
4. Derive `filename` from `filePath`.
5. `await deleteVectorDocumentsByFilename(filename)`
6. For each `chunk`:
   1. `const embedding = await fetchEmbedding(chunk)`
   2. `await upsertVectorDocument({ filename, chunk_id: i, content: chunk, embedding: JSON.stringify(embedding), updated_at: Date.now() })`
7. Return `true` on success; otherwise log the error and return `false`.

Example:
```ts
import { processMarkdownFile } from '@/lib/rag';

const success = await processMarkdownFile('article/guide.md');
if (success) console.log('Indexed guide.md successfully');
else console.error('Indexing failed for guide.md');
```

---

### Sync: Uploading Files to Remote Repositories

Provides a simple API to create or update a file on GitHub, Gitee, or GitLab. Automatically handles base64 conversion, branch selection, proxying, and error reporting.

#### Common Helpers
```ts
import { fileToBase64 } from '@/lib/github'; // same in gitee.ts & gitlab.ts
import { decodeBase64ToString } from '@/lib/github';

// Convert a browser File into base64 (no data URL prefix)
const base64 = await fileToBase64(myFile);

// Decode a base64-encoded UTF-8 string
const text = decodeBase64ToString(base64);
```

#### GitHub Example
```ts
import { uploadFile as uploadToGitHub } from '@/lib/github';
import { RepoNames } from '@/lib/github.types';

async function syncImage(file: File) {
  const ext = file.name.split('.').pop()!;
  const content = await fileToBase64(file);
  const result = await uploadToGitHub({
    ext,
    file: content,
    repo: RepoNames.sync,
    path: 'images',            // optional folder
    message: 'Add new image 📷'
  });
  if (result) {
    console.log('GitHub URL:', result.data.content.html_url);
  }
}
```

#### Gitee Example
```ts
import { uploadFile as uploadToGitee } from '@/lib/gitee';
import { RepoNames } from '@/lib/github.types';

async function syncDocument(file: File) {
  const ext = file.name.split('.').pop()!;
  const content = await fileToBase64(file);
  const response = await uploadToGitee({
    ext,
    file: content,
    repo: RepoNames.sync,
    filename: 'report.pdf',    // override UUID name
    path: 'docs',
    message: 'Publish monthly report'
  });
  if (response) {
    console.log('Gitee URL:', response.data.content.download_url);
  }
}
```

#### GitLab Example
```ts
import { uploadFile as uploadToGitLab } from '@/lib/gitlab';
import { RepoNames } from '@/lib/github.types';

async function syncScript(file: File) {
  const ext = file.name.split('.').pop()!;
  const content = await fileToBase64(file);
  const res = await uploadToGitLab({
    ext,
    file: content,
    repo: RepoNames.sync,
    path: 'scripts',
    message: 'Update script v2'
  });
  if (res) {
    console.log('GitLab file committed:', res.data.file_path);
  }
}
```

##### Key Parameters
- `ext`: file extension without the dot
- `file`: base64-encoded content
- `filename?`: overrides the auto-generated `UUID.ext` name
- `sha?`: existing file SHA/commit for updates
- `message?`: commit message
- `repo`: enum name of your sync repository
- `path?`: folder path in the repo; spaces convert to underscores

##### Error Handling
- Displays a toast on network/API errors
- Returns `null` (GitHub/Gitee) or `false` (GitLab) for recoverable failures
- Throws unexpected errors for catch-all logic
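Passing `sha` is what switches a call from create to update. A hedged sketch of one way to upsert a file on GitHub; it assumes the entries returned by `getFiles` (shown in the GitHub wrapper section below) expose a `sha` field, as the GitHub contents API does:

```ts
import { getFiles, uploadFile, fileToBase64 } from '@/lib/github';
import { RepoNames } from '@/lib/github.types';

// Sketch: overwrite images/<file.name> if it already exists in the repo.
async function upsertImage(file: File) {
  const existing = await getFiles({ repo: RepoNames.sync, path: 'images' });
  // Assumption: each entry carries a sha, per the GitHub contents API.
  const match = existing?.find(f => f.name === file.name);

  return uploadFile({
    ext: file.name.split('.').pop() || 'png',
    file: await fileToBase64(file),
    filename: file.name,
    sha: match?.sha,              // present → update, absent → create
    repo: RepoNames.sync,
    path: 'images',
    message: match ? `Update ${file.name}` : `Add ${file.name}`
  });
}
```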
---

### Keyboard Shortcut Configuration

Define, register and emit application-wide keyboard shortcuts using the `ShortcutSettings`, `ShortcutDefault` and `EmitterShortcutEvents` enums.

#### Config Enums
```ts
// src/config/shortcut.ts
export enum ShortcutSettings {
  screenshot = "shotcut-screenshot",
  text = "shotcut-text",
  pin = "window-pin",
  link = "shotcut-link",
}
export enum ShortcutDefault {
  screenshot = "Control+Shift+S",
  text = "Control+Shift+T",
  pin = "Control+Shift+P",
  link = "Control+Shift+L",
}

// src/config/emitters.ts
export enum EmitterShortcutEvents {
  screenshot = "screenshot-shortcut-register",
  text = "text-shortcut-register",
  pin = "window-pin-register",
  link = "link-shortcut-register",
}
```

#### Main Process: Registering Shortcuts
1. Read the stored accelerator, falling back to the default
2. Call Electron's `globalShortcut.register`
3. Emit an IPC event on trigger

```ts
import { app, BrowserWindow, globalShortcut } from "electron";
import store from "./store";
import { ShortcutSettings, ShortcutDefault } from "./config/shortcut";
import { EmitterShortcutEvents } from "./config/emitters";

let mainWindow: BrowserWindow;

function registerAllShortcuts() {
  (Object.keys(ShortcutDefault) as Array<keyof typeof ShortcutDefault>)
    .forEach(action => {
      const settingKey = ShortcutSettings[action];
      const accelerator = store.get(settingKey, ShortcutDefault[action]);
      globalShortcut.register(accelerator, () => {
        mainWindow.webContents.send(EmitterShortcutEvents[action]);
      });
    });
}

app.on("ready", () => {
  mainWindow = new BrowserWindow({ /* ... */ });
  registerAllShortcuts();
});
```

#### Renderer Process: Handling Shortcut Events
```ts
import { ipcRenderer } from "electron";
import { EmitterShortcutEvents } from "../config/emitters";

ipcRenderer.on(EmitterShortcutEvents.screenshot, () => {
  startScreenshotMode();
});

ipcRenderer.on(EmitterShortcutEvents.text, () => {
  openTextCapture();
});
```

#### Updating Shortcuts at Runtime
When a user changes a shortcut:
```ts
function updateShortcut(action: keyof typeof ShortcutDefault, newAccel: string) {
  const old = store.get(ShortcutSettings[action], ShortcutDefault[action]);
  globalShortcut.unregister(old);

  store.set(ShortcutSettings[action], newAccel);
  globalShortcut.register(newAccel, () => {
    mainWindow.webContents.send(EmitterShortcutEvents[action]);
  });
}
```
Wrap registration/unregistration in try/catch to handle invalid accelerators or conflicts.

## Configuration Guide

This guide maps each configurable screen in the app to its underlying storage key (in `store.json`) or environment variable, and shows how to read and update settings via `src/stores/setting.ts`.

---

### Initialization

Load persisted settings on app start.
Storage file: `@tauri-apps/plugin-store` → `store.json`

```tsx
import { useEffect } from 'react'
import useSettingStore from '@/stores/setting'

export function App() {
  useEffect(() => {
    // Loads all keys from store.json into Zustand and applies env-var fallbacks
    useSettingStore.getState().initSettingData()
  }, [])

  return <>{/* app routes */}</>
}
```

---

### 1. General Settings Screen

Manage theme and UI language.

| UI Field | Store Key  | Env Var | Setter              |
|----------|------------|---------|---------------------|
| Theme    | `theme`    | —       | `setTheme(value)`   |
| Language | `language` | —       | `setLanguage(code)` |

```tsx
import useSettingStore from '@/stores/setting'

function GeneralSettings() {
  const theme = useSettingStore(s => s.theme)
  const setTheme = useSettingStore(s => s.setTheme)

  const lang = useSettingStore(s => s.language)
  const setLang = useSettingStore(s => s.setLanguage)

  return (
    <>
      <select value={theme} onChange={e => setTheme(e.target.value)}>
        <option value="light">Light</option>
        <option value="dark">Dark</option>
      </select>

      <select value={lang} onChange={e => setLang(e.target.value)}>
        <option value="en">English</option>
        <option value="zh">简体中文</option>
      </select>
    </>
  )
}
```

---
### 2. AI Models Screen

Configure multiple AI services and select primary models.

| UI Section              | Store Key               | Description                                           |
|-------------------------|-------------------------|-------------------------------------------------------|
| Model list              | `aiModelList`           | Array of `{ key, model, modelType, baseURL, apiKey }` |
| Primary chat model      | `chatPrimaryModel`      | Key of the selected chat model                        |
| Primary embedding model | `embeddingPrimaryModel` | Key of the selected embedding model                   |

```ts
import useSettingStore, { AiConfig } from '@/stores/setting'

async function addNewModel() {
  const newModel: AiConfig = {
    key: 'my-azure-openai',
    model: 'gpt-4',
    modelType: 'chat',
    baseURL: 'https://my-azure-endpoint.openai.azure.com',
    apiKey: 'AZURE_KEY'
  }
  await useSettingStore.getState().addAiModel(newModel)
  // Optionally set as primary:
  useSettingStore.getState().setChatPrimaryModel(newModel.key)
}
```

---

### 3. Backup Provider Selection

Choose one of the supported providers.

| UI Field        | Store Key        | Setter                        |
|-----------------|------------------|-------------------------------|
| Backup provider | `backupProvider` | `setBackupProvider(provider)` |

Supported values: `'webdav'`, `'github'`, `'gitee'`, `'gitlab'`

```tsx
const provider = useSettingStore(s => s.backupProvider)
const setProvider = useSettingStore(s => s.setBackupProvider)

<select value={provider} onChange={e => setProvider(e.target.value)}>
  <option value="github">GitHub</option>
  <option value="gitee">Gitee</option>
  <option value="gitlab">GitLab</option>
  <option value="webdav">WebDAV</option>
</select>
```

> WebDAV settings use a separate `useWebDAVStore` (see WebDAV section).

---

### 4. GitHub Integration Screen

| UI Field     | Store Key           | Env Var               | Setter                        |
|--------------|---------------------|-----------------------|-------------------------------|
| Access Token | `githubAccessToken` | `GITHUB_ACCESS_TOKEN` | `setGithubAccessToken(token)` |
| Repository   | `githubRepo`        | `GITHUB_REPO`         | `setGithubRepo(owner/repo)`   |
| Branch       | `githubBranch`      | `GITHUB_BRANCH`       | `setGithubBranch(name)`       |
| Path prefix  | `githubPath`        | `GITHUB_PATH`         | `setGithubPath(path)`         |

```tsx
import useSettingStore from '@/stores/setting'

function GitHubSettings() {
  const token = useSettingStore(s => s.githubAccessToken)
  const setToken = useSettingStore(s => s.setGithubAccessToken)

  return (
    <>
      <input
        value={token}
        onChange={e => setToken(e.target.value)}
        placeholder="GitHub Personal Access Token"
      />
      {/* Repo, branch, path inputs analogous */}
    </>
  )
}
```

---

### 5. Gitee Integration Screen

| UI Field     | Store Key          | Env Var              | Setter                       |
|--------------|--------------------|----------------------|------------------------------|
| Access Token | `giteeAccessToken` | `GITEE_ACCESS_TOKEN` | `setGiteeAccessToken(token)` |
| Repository   | `giteeRepo`        | `GITEE_REPO`         | `setGiteeRepo(owner/repo)`   |
| Branch       | `giteeBranch`      | `GITEE_BRANCH`       | `setGiteeBranch(name)`       |
| Path prefix  | `giteePath`        | `GITEE_PATH`         | `setGiteePath(path)`         |

Implementation mirrors GitHub above.

---

### 6. GitLab Integration Screen

| UI Field     | Store Key           | Env Var               | Setter                        |
|--------------|---------------------|-----------------------|-------------------------------|
| Access Token | `gitlabAccessToken` | `GITLAB_ACCESS_TOKEN` | `setGitlabAccessToken(token)` |
| Repository   | `gitlabRepo`        | `GITLAB_REPO`         | `setGitlabRepo(owner/repo)`   |
| Branch       | `gitlabBranch`      | `GITLAB_BRANCH`       | `setGitlabBranch(name)`       |
| Path prefix  | `gitlabPath`        | `GITLAB_PATH`         | `setGitlabPath(path)`         |

Implementation mirrors GitHub above.
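Since screens 4–6 differ only in which store keys and setters they touch, one way to avoid three near-identical components is a table-driven form. A hedged sketch; the `providerFields` map and `ProviderTokenInput` component are hypothetical, and only the setter names come from the tables above:

```tsx
import useSettingStore from '@/stores/setting'

// Hypothetical map from provider to the store fields/setters listed above.
const providerFields = {
  github: { token: 'githubAccessToken', setToken: 'setGithubAccessToken' },
  gitee:  { token: 'giteeAccessToken',  setToken: 'setGiteeAccessToken' },
  gitlab: { token: 'gitlabAccessToken', setToken: 'setGitlabAccessToken' },
} as const

function ProviderTokenInput({ provider }: { provider: keyof typeof providerFields }) {
  const fields = providerFields[provider]
  // Dynamic key access; `any` is acceptable under this repo's ESLint rules.
  const token = useSettingStore(s => (s as any)[fields.token])
  const setToken = useSettingStore(s => (s as any)[fields.setToken])

  return (
    <input
      value={token}
      onChange={e => setToken(e.target.value)}
      placeholder={`${provider} access token`}
    />
  )
}
```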
---

## Practical Tips

- Always call `initSettingData()` once on mount.
- Reading a value: `const val = useSettingStore(s => s.key)`.
- Updating: `await useSettingStore.getState().setKey(value)`.
- All setters persist to disk and notify subscribers.
- Provide env-var defaults for CI or private deployments; store values override env.

## Backup & Sync

Manage cloud and inter-device data with the built-in GitHub, Gitee, and WebDAV integrations in the codexu/note-gen app.

---

### GitHub Sync API Wrapper

Provides a set of wrapper functions to perform create, read, update, and delete operations on GitHub repositories and files using the GitHub REST API.

Key Features
- Automatically reads `accessToken`, `githubUsername` and optional `proxy` from `@tauri-apps/plugin-store`
- Uses `@tauri-apps/plugin-http`'s `fetch` to standardize headers, status checks, and error handling
- Core methods:
  - Authentication & user info (`getUserInfo`)
  - Repository check/create (`checkSyncRepoState`, `createSyncRepo`)
  - File upload (`uploadFile`)
  - List files (`getFiles`)
  - Delete file (`deleteFile`)
  - File commit history (`getFileCommits`)
  - Latest release retrieval (`getRelease`)

#### 1. Authenticate and Get User Info

```ts
import { getUserInfo } from '@/lib/github';

async function initGitHub(token?: string) {
  const resp = await getUserInfo(token);
  if (!resp) throw new Error('GitHub authentication failed');
  console.log('Logged in as', resp.data.login);
}
```

#### 2. Check or Create Sync Repository

```ts
import { checkSyncRepoState, createSyncRepo } from '@/lib/github';

async function ensureRepo(name: string) {
  const repo = await checkSyncRepoState(name);
  return repo || await createSyncRepo(name, true); // true = private
}
```

#### 3. Upload a File

Converts a `File` object to Base64, then uploads:

```ts
import { fileToBase64, uploadFile } from '@/lib/github';
import { RepoNames } from '@/lib/github.types';

async function uploadImage(file: File) {
  const base64 = await fileToBase64(file);   // strips data URI prefix
  const result = await uploadFile({
    ext: file.name.split('.').pop() || 'png',
    file: base64,
    filename: 'my-image',      // optional, defaults to UUID
    message: 'Add new image',  // commit message
    repo: RepoNames.sync,      // must be an enum value
    path: 'uploads'            // optional subdirectory
  });
  console.log('Upload URL:', result?.data.content.html_url);
}
```

#### 4. List Files in a Directory

```ts
import { getFiles } from '@/lib/github';
import { RepoNames } from '@/lib/github.types';

async function listUploads() {
  const files = await getFiles({ repo: RepoNames.sync, path: 'uploads' });
  if (!files) return console.log('No files found');
  files.forEach(f => console.log(f.name, f.download_url));
}
```

#### 5. Delete a File

```ts
import { deleteFile } from '@/lib/github';
import { RepoNames } from '@/lib/github.types';

async function removeFile(path: string, sha: string) {
  const success = await deleteFile({ repo: RepoNames.sync, path, sha });
  console.log(success ? 'Deletion succeeded' : 'Deletion failed');
}
```

#### 6. Get File Commit History

```ts
import { getFileCommits } from '@/lib/github';
import { RepoNames } from '@/lib/github.types';

async function showHistory(path: string) {
  const commits = await getFileCommits({ repo: RepoNames.sync, path });
  commits?.forEach(c => console.log(c.sha, c.commit.message));
}
```
#### 7. Fetch Latest Release

```ts
import { getRelease } from '@/lib/github';

async function checkLatestRelease() {
  const release = await getRelease();
  if (release) console.log('Latest version:', release.tag_name);
}
```

Practical Tips
- Call `getUserInfo` on app startup to initialize `githubUsername`.
- Use `proxy` in settings to route all API calls.
- Always `checkSyncRepoState` before `createSyncRepo`.
- Spaces in filenames/paths convert to underscores to avoid conflicts.
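Putting the first and third tips together, a hedged sketch of a startup routine (the `initSync` name is hypothetical) that authenticates and then guarantees the sync repo exists before any upload or delete calls run:

```ts
import { getUserInfo, checkSyncRepoState, createSyncRepo } from '@/lib/github';
import { RepoNames } from '@/lib/github.types';

// Sketch: run once on app startup, before uploadFile/deleteFile.
async function initSync() {
  const user = await getUserInfo();  // also initializes githubUsername
  if (!user) throw new Error('GitHub authentication failed');

  // Check first, create only when missing (true = private repo).
  const repo = await checkSyncRepoState(RepoNames.sync);
  return repo ?? (await createSyncRepo(RepoNames.sync, true));
}
```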
---

### Gitee File Upload (`uploadFile`)

Creates or updates a file in a Gitee repository. Handles token loading, proxy settings, method selection (POST vs PUT), and error reporting via toast.

Function Signature

```ts
export async function uploadFile(options: {
  ext: string;          // without dot
  file: string;         // Base64-encoded content
  filename?: string;    // defaults to UUID.ext
  sha?: string;         // existing file SHA for update
  message?: string;     // commit message
  repo: RepoNames.sync; // target enum value
  path?: string;        // optional subdirectory
}): Promise<{ data: { content: { download_url: string } } } | null | false>
```

How It Works
- Reads `giteeAccessToken`, `giteeUsername`, and optional `proxy` from `@tauri-apps/plugin-store`.
- Generates a UUID filename when none is provided.
- Replaces spaces with underscores in filenames/paths.
- Chooses POST (create) or PUT (update when `sha` is present).
- Sends `{ access_token, content, message, branch: 'master', sha }`.
- Returns parsed `data` on success, `null` on client errors (400), or `false` on failure (with toast).

Prerequisites
- In `store.json` set:
  - `giteeAccessToken`: your Gitee token
  - `giteeUsername`: your Gitee login
  - `proxy` (optional): HTTP(S) proxy URL

Example

```ts
import { fileToBase64, uploadFile } from '@/lib/gitee';
import { RepoNames } from '@/lib/github.types';

async function handleFileInput(e: Event) {
  const input = e.target as HTMLInputElement;
  if (!input.files?.length) return;
  const file = input.files[0];
  const base64 = await fileToBase64(file);

  const ext = file.name.split('.').pop() || 'bin';
  const result = await uploadFile({
    ext,
    file: base64,
    filename: file.name,
    message: `Add ${file.name}`,
    repo: RepoNames.sync,
    path: 'images'
  });

  if (result?.data.content.download_url) {
    console.log('Uploaded to:', result.data.content.download_url);
  } else {
    console.warn('Upload failed or returned null');
  }
}
```

Practical Tips
- To update: fetch the SHA via `getFiles`, then pass `sha` to `uploadFile`.
- The default branch is `master`; change it in the source if your repo uses `main`.
- Network or non-404 errors trigger a toast and return `false`.
- Extend the filename sanitization regex as needed.

---

### WebDAV Backup Command (`webdav_backup`)

A Tauri command that uploads all Markdown files from the local workspace to a WebDAV directory, auto-creating missing remote folders. Returns a "success_count/total_count" string or an error.

Signature

```rust
#[tauri::command]
pub async fn webdav_backup(
    url: String,
    username: String,
    password: String,
    path: String,
    app: AppHandle,
) -> Result<String, String>
```

Workflow
1. **Client Setup**:
   - `create_client(url, username, password)` → `reqwest_dav::Client` with Basic auth.
2. **Ensure Remote Base Directory**:
   - `client.list(path, Depth::Infinity)` → if it fails, `client.mkcol(path)`.
3. **Discover Local Markdown**:
   - `get_workspace_info(app)` reads `workspacePath` from `store.json` (default: `"article"`).
   - `get_markdown_files()` returns `(relative_path, content)` for each `.md` file.
4. **Upload Loop**:
   - For each file:
     - Build `remote_path = format!("{}/{}", path.trim_end_matches('/'), relative_path)`.
     - Ensure the parent directory exists (list → mkcol).
     - `client.put(remote_path, content_bytes)`.
   - Track successes; abort on a fatal error.
5. **Result**:
   - `Ok(format!("{}/{}", success_count, total_files))` or `Err(...)`.

Rust Usage Example

```rust
let result = webdav_backup(
    "https://example.com/webdav".into(),
    "user".into(),
    "pass".into(),
    "backups/notes".into(),
    app_handle.clone()
).await?;
println!("Uploaded: {}", result); // e.g. "12/12"
```

Invoke from Frontend

```ts
import { invoke } from '@tauri-apps/api/core';

const count = await invoke<string>('webdav_backup', {
  url, username, password, path
});
// count === "uploaded/total"
```

Practical Tips
- Do not include a leading slash in `path`; the command strips it.
- Missing remote directories are auto-created.
- Large workspaces may warrant a UI spinner (`backupState`).
- Errors bubble up with context (e.g. failed `mkcol`, upload error).
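The "list → mkcol" step in the upload loop has to run per path segment, since WebDAV's MKCOL does not create intermediate collections. A hedged Rust sketch of such a helper (hypothetical function; it assumes the `reqwest_dav` `Client` and `Depth` types named in the workflow above):

```rust
use reqwest_dav::{Client, Depth};

// Sketch: walk "a/b/c" and create each missing collection in turn.
async fn ensure_remote_dir(client: &Client, dir: &str) -> Result<(), String> {
    let mut current = String::new();
    for segment in dir.trim_matches('/').split('/') {
        if !current.is_empty() {
            current.push('/');
        }
        current.push_str(segment);
        // If listing fails, assume the collection is missing and create it.
        if client.list(&current, Depth::Number(0)).await.is_err() {
            client
                .mkcol(&current)
                .await
                .map_err(|e| format!("mkcol {} failed: {}", current, e))?;
        }
    }
    Ok(())
}
```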
## AI & Processing Engine

Centralize AI/ML features—chat completion, embeddings, reranking, keyword extraction, OCR, screenshot capture and local RAG search—so your app can call a single module for all AI-powered workflows.

### Chat Completion (src/lib/ai.ts)
Provides conversational AI via OpenAI-compatible chat models, with error handling and optional streaming.

#### Signature
```ts
async function chatCompletion(
  messages: ChatMessage[],
  options?: { modelKey?: string; stream?: boolean }
): Promise<string | AsyncIterable<string>>;
```

#### Usage
```ts
import { chatCompletion } from '@/lib/ai';

const messages = [
  { role: 'system', content: 'You are an assistant.' },
  { role: 'user', content: 'Summarize the benefits of TypeScript.' }
];

async function askBot() {
  const reply = await chatCompletion(messages);
  console.log('Bot:', reply);
}

// Streaming example
(async () => {
  const stream = await chatCompletion(messages, { stream: true });
  for await (const chunk of stream as AsyncIterable<string>) {
    process.stdout.write(chunk);
  }
})();
```

### Embeddings (src/lib/ai.ts)
Generates semantic vectors for text, used in similarity search and RAG pipelines.

#### Signature
```ts
async function fetchEmbedding(text: string): Promise<number[] | null>;
```

#### Usage
```ts
import { fetchEmbedding } from '@/lib/ai';
import { initVectorDb, upsertVectorDocument } from '@/db/vector';

async function indexFile(filename: string, content: string) {
  await initVectorDb();
  const embedding = await fetchEmbedding(content);
  if (!embedding) throw new Error('Embedding failed');
  await upsertVectorDocument({
    filename,
    chunk_id: 0,
    content,
    embedding: JSON.stringify(embedding),
    updated_at: Date.now(),
  });
}
```
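Similarity search over these vectors typically reduces to cosine similarity between a query embedding and each stored chunk embedding. A minimal sketch of that comparison (not the project's actual scoring code; the `chunks` shape is hypothetical):

```ts
import { fetchEmbedding } from '@/lib/ai';

// Cosine similarity between two embedding vectors of equal length.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Sketch: rank stored chunks against a query embedding.
async function rankChunks(query: string, chunks: { content: string; embedding: string }[]) {
  const queryVec = await fetchEmbedding(query);
  if (!queryVec) return [];
  return chunks
    .map(c => ({
      content: c.content,
      // Embeddings were stored with JSON.stringify, so parse them back.
      score: cosineSimilarity(queryVec, JSON.parse(c.embedding) as number[]),
    }))
    .sort((x, y) => y.score - x.score);
}
```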
### Document Reranking (src/lib/ai.ts)
Reorders candidate passages by relevance to a query using a rerank model.

#### Signature
```ts
async function rerankDocuments(
  query: string,
  candidates: string[]
): Promise<{ score: number; text: string }[]>;
```

#### Usage
```ts
import { rerankDocuments } from '@/lib/ai';

async function refineResults() {
  const query = 'What is vector similarity?';
  const docs = ['Vectors measure...', 'Similarity is...', 'Math principles...'];
  const ranked = await rerankDocuments(query, docs);
  console.log(ranked[0]); // highest relevance
}
```

### Keyword Extraction (src-tauri/src/keywords.rs)
Extracts top-weighted keywords from text using Jieba + TextRank via a Tauri command.

#### Rust Command
```rust
#[tauri::command]
pub fn keywords(text: String, top_n: usize) -> Vec<Keyword> { … }
```

#### JS Invocation
```ts
import { invoke } from '@tauri-apps/api/core';

interface Keyword { word: string; weight: number; }

async function extractKeywords(text: string) {
  const list = await invoke<Keyword[]>('keywords', { text, top_n: 5 });
  return list;
}

extractKeywords('Deep learning enables…').then(console.log);
```

### OCR Utility (src/lib/ocr.ts)
Performs OCR on images saved in AppData using Tesseract.js, with configurable languages and a 30s timeout.

```ts
import ocr from '@/lib/ocr';

async function readImageText(path: string) {
  const text = await ocr(path);
  if (text.startsWith('OCR')) console.error('Error:', text);
  else console.log('Extracted:', text);
}

readImageText('screenshots/latest.png');
```

### Screenshot Capture (src-tauri/src/screenshot.rs)
Captures all visible, non-minimized windows, saves PNGs to AppData/temp_screenshot, and returns metadata.

#### Rust Struct
```rust
#[derive(Serialize, Deserialize)]
pub struct ScreenshotImage {
    pub name: String,
    pub path: String,
    pub width: u32,
    pub height: u32,
    pub x: i32,
    pub y: i32,
    pub z: i32,
}
```

#### JS Invocation
```ts
import { invoke } from '@tauri-apps/api/core';

interface ScreenshotImage {
  name: string;
  path: string;
  width: number;
  height: number;
  x: number;
  y: number;
  z: number;
}

async function captureAllWindows() {
  const images = await invoke<ScreenshotImage[]>('screenshot');
  images.forEach(img => {
    const el = document.createElement('img');
    el.src = `file://${img.path}`;
    el.width = img.width / 4;
    document.body.appendChild(el);
  });
}

captureAllWindows();
```

### Local RAG Search (src/lib/rag.ts)
Ingest Markdown into a vector store and retrieve context snippets for queries.

#### Core Functions
• `initVectorDb()`: set up the SQLite vector table
• `processMarkdownFile(path: string)`: split, embed & store chunks
• `getContextForQuery(query: string)`: return top passages with context
• `handleFileUpdate(path: string)`: refresh a file in the vector store

#### Example: Ingest & Query
```ts
import {
  initVectorDb,
  processMarkdownFile,
  getContextForQuery
} from '@/lib/rag';

async function buildAndSearch() {
  await initVectorDb();
  await processMarkdownFile('/notes/guide.md');
  const context = await getContextForQuery('How to configure Tauri?');
  console.log('Context:', context);
}

buildAndSearch();
```

## Backend API Reference

### Fuzzy Search Commands

Purpose
Provide a high-performance, parallelized fuzzy search over arbitrary `SearchItem` collections in your Tauri backend. Exposes two invokable commands:
- `fuzzy_search` – standard execution
- `fuzzy_search_parallel` – alias that currently delegates to `fuzzy_search`

Core Types
• **SearchItem**
  - Represents an indexed record with optional fields: `id`, `title`, `desc`, `article`, `path`, `url`, `search_type`
  - On client request it can carry back `score?: number` and `matches?: MatchInfo[]`
• **MatchInfo**
  - `{ key: string; indices: [number,number][]; value: string }`
  - Describes which characters in which field matched
• **FuzzySearchResult**
  - `{ item: SearchItem; refindex: number; score: number; matches: MatchInfo[] }`
  - `refindex` is the original index in the `items` array

Command Signature
```rust
#[tauri::command]
pub fn fuzzy_search(
    items: Vec<SearchItem>,
    query: String,
    keys: Vec<String>,
    threshold: f64,
    include_score: bool,
    include_matches: bool,
) -> Vec<FuzzySearchResult> { … }
```

Parameters
- `items`: array of records to search
- `query`: search pattern (empty yields `[]`)
- `keys`: subset of `["desc","title","article","path","search_type"]` to test
- `threshold`: normalized match-score cutoff (0.0–1.0)
- `include_score`: if `false`, `result.score` and `item.score` are zeroed/cleared
- `include_matches`: if `false`, match spans are dropped

Behavior
1. Builds a `SkimMatcherV2` per invocation.
2. For each item and each requested key:
   - runs `fuzzy_indices(text, query)`
   - computes `normalized_score = abs(score) / query.len()`
   - skips the key if below `threshold`
   - collects the best raw `score` and all matching spans
3. Filters out non-matching items and sorts descending by `score`.
4. Clears scores/matches based on the flags.

TypeScript Invocation Example
```typescript
import { invoke } from '@tauri-apps/api/core';

interface MatchInfo { key: string; indices: [number,number][]; value: string; }
interface SearchItem { id?: string; title?: string; desc?: string; /*…*/ }
interface FuzzySearchResult {
  item: SearchItem;
  refindex: number;
  score: number;
  matches: MatchInfo[];
}

async function runSearch() {
  const items: SearchItem[] = [ /* your indexed data */ ];
  const results = await invoke<FuzzySearchResult[]>('fuzzy_search', {
    items,
    query: 'rust module',
    keys: ['title','desc'],
    threshold: 0.5,
    include_score: true,
    include_matches: true
  });

  console.log(results);
}
```
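The `matches` spans are what you need for hit highlighting in search results. A hedged sketch of turning one field's `indices` into a highlighted string (hypothetical helper, not part of the command itself; it assumes the `[start, end]` pairs are inclusive character ranges):

```ts
// Wrap each matched span of a field value in <mark> tags.
// Assumption: indices are inclusive [start, end] pairs as in MatchInfo.
function highlightMatch(value: string, indices: [number, number][]): string {
  let out = '';
  let cursor = 0;
  for (const [start, end] of indices) {
    out += value.slice(cursor, start);
    out += `<mark>${value.slice(start, end + 1)}</mark>`;
    cursor = end + 1;
  }
  return out + value.slice(cursor);
}

// Usage with a FuzzySearchResult:
// const m = result.matches.find(m => m.key === 'title');
// if (m) titleHtml = highlightMatch(m.value, m.indices);
```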
Practical Tips
- Tune `threshold` to control fuzziness: lower values allow looser matches.
- Select only the fields you need in `keys` to reduce work.
- Use `include_matches: false` when you only care about top-N scoring.
- `fuzzy_search_parallel` currently aliases `fuzzy_search`; swap in a parallel version if needed.

---

### Custom Tauri HTTP Fetch Wrapper

Purpose
Provide a drop-in replacement for the browser `fetch` API that routes requests through Tauri's `plugin-http`, supporting native timeouts, proxies, max redirections, and proper header encoding/decoding.

API Signature
```ts
async function fetch(
  input: string,
  init?: RequestInit & ClientOptions
): Promise<Response>
```

Key Features
- Uses `invoke('plugin:http|fetch')` under the hood
- Supports `maxRedirections`, `connectTimeout`, `proxy`, `danger` flags
- Honors and propagates `AbortSignal`
- Encodes the request body as `Uint8Array`
- URI-encodes response headers to ensure validity
- Returns a standard `Response` with overridden `.url` and `.headers`

Usage Example
```ts
import { fetch } from '@/lib/encode-fetch';

async function postData() {
  const controller = new AbortController();
  const timeout = setTimeout(() => controller.abort(), 3000);

  try {
    const res = await fetch('https://api.example.com/data', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ foo: 'bar' }),
      connectTimeout: 5000,                 // TCP connect timeout in ms
      maxRedirections: 5,                   // max HTTP redirects
      proxy: 'http://localhost:8888',       // optional proxy
      danger: { acceptInvalidCerts: true }, // skip TLS validation
      signal: controller.signal
    });

    clearTimeout(timeout);

    if (!res.ok) {
      console.error(`HTTP ${res.status}: ${res.statusText}`);
      return;
    }

    const data = await res.json();
    console.log('Received:', data);
  } catch (e: any) {
    if (e.message === 'Request canceled') {
      console.warn('Fetch aborted or timed out');
    } else {
      console.error('Fetch error:', e);
    }
  }
}
```

Abort Handling
- Checks `signal.aborted` before dispatch and after header mapping
- On abort, calls `invoke('plugin:http|fetch_cancel', { rid })`
- Throws `Error('Request canceled')`, consistent with standard fetch

Custom Options
| Option            | Type   | Description                                 |
|-------------------|--------|---------------------------------------------|
| `connectTimeout`  | number | ms to wait for TCP connect                  |
| `maxRedirections` | number | how many 3xx redirects to follow            |
| `proxy`           | string | proxy URL (e.g., `http://proxy.local:3128`) |
| `danger`          | object | plugin-specific flags (e.g., TLS skip opts) |

Under the Hood
1. Strip the custom fields from `init` to avoid confusing the native `Request`
2. Build a `Request` to extract headers & body as an `ArrayBuffer`
3. Map headers to `[name, value][]` and encode with `encodeURI`
4. Invoke the Tauri plugin methods:
   - `plugin:http|fetch` to start (returns a request ID)
   - `plugin:http|fetch_send` to send headers & receive the status
   - `plugin:http|fetch_read_body` to retrieve the body
5. Construct a standard `Response`, overriding `.url` and `.headers`
Practical Tips
- Always provide a `signal` if you need cancellable requests
- Clean up timeouts/`AbortController` to avoid dangling plugin requests
- Use `danger` sparingly – only in trusted dev environments
- Check `res.ok`/`res.status` before consuming the body
- Proxy support is useful for debugging via local interceptors (e.g., Charles, Fiddler)

## Quality & Delivery

Maintain code health through consistent linting standards and automated releases.

### ESLint Configuration (.eslintrc.json)
Purpose: Enforce Next.js and TypeScript linting defaults while allowing scoped overrides.

This project's `.eslintrc.json`:

```json
{
  "extends": ["next/core-web-vitals", "next/typescript"],
  "rules": {
    "react-hooks/exhaustive-deps": "off",
    "@typescript-eslint/no-explicit-any": "off"
  },
  "ignorePatterns": ["docs/**"]
}
```

Key sections:

1. Extends
   • `next/core-web-vitals`: React rules tuned for Core Web Vitals
   • `next/typescript`: TypeScript support for Next.js

2. Rules Overrides
   • `react-hooks/exhaustive-deps: off`
     – Skips exhaustive-deps checks when you manage dependencies manually.
     – Re-enable per file with:
     ```js
     /* eslint react-hooks/exhaustive-deps: "warn" */
     ```
   • `@typescript-eslint/no-explicit-any: off`
     – Allows `any` in scripts or third-party integrations.
     – Scope it via overrides:
     ```jsonc
     {
       "overrides": [
         {
           "files": ["scripts/**/*.ts"],
           "rules": { "@typescript-eslint/no-explicit-any": "off" }
         }
       ]
     }
     ```

3. ignorePatterns
   • `docs/**`: exclude documentation from linting
   • Extend the patterns (e.g., `build/`, `public/`) as needed

Practical Usage
```bash
npm run lint
npx eslint src/components/Button.tsx --fix
```
To add a new plugin rule:
1. `npm install -D eslint-plugin-import`
2. Update `.eslintrc.json`:
   ```jsonc
   {
     "plugins": ["import"],
     "rules": {
       "import/order": ["error", { "groups": ["builtin","external","internal"] }]
     }
   }
   ```

### Extracting and Propagating Tauri App Version
Purpose: Capture the semantic version from `tauri-apps/tauri-action` in `publish-tauri` and expose it for downstream jobs.

How It Works
1. Declare `appVersion` as a job output.
2. On Ubuntu, read `steps.tauri-action.outputs.appVersion`.
3. Write it to `$GITHUB_OUTPUT` for use by other jobs.
Key Snippet:

```yaml
jobs:
  publish-tauri:
    strategy:
      matrix:
        include:
          - platform: 'ubuntu-24.04'
            args: ''
          - platform: 'windows-latest'
            args: '--msix'
          - platform: 'macos-latest'
            args: ''
    runs-on: ${{ matrix.platform }}
    outputs:
      appVersion: ${{ steps.set_output.outputs.appVersion }}
    steps:
      - uses: actions/checkout@v4
      - uses: tauri-apps/tauri-action@v0
        id: tauri-action
        with:
          # Tauri config here
      - name: Extract version
        if: matrix.platform == 'ubuntu-24.04'
        run: echo "appVersion=${{ steps.tauri-action.outputs.appVersion }}" >> $GITHUB_OUTPUT
        id: set_output
```

Usage in Downstream Job:

```yaml
jobs:
  upgradeLink-upload:
    needs: publish-tauri
    runs-on: ubuntu-latest
    steps:
      - uses: toolsetlink/upgradelink-action@v5
        with:
          source-url: >
            https://github.com/codexu/note-gen/releases/
            download/note-gen-v${{ needs.publish-tauri.outputs.appVersion }}/latest.json
          access-key: ${{ secrets.UPGRADE_LINK_ACCESS_KEY }}
          tauri-key: ${{ secrets.UPGRADE_LINK_TAURI_KEY }}
          github-token: ${{ secrets.GITHUB_TOKEN }}
```

Practical Tips
• Restrict the write-back to one OS to avoid race conditions
• Match the `id` of the `tauri-action` step when reading its outputs
• Inject the version via `${{ needs.publish-tauri.outputs.appVersion }}`

## Contributing

Thank you for improving NoteGen! This guide covers how to file issues, propose features, open pull requests, our branch strategy, and coding style standards.

### Filing Issues

Use GitHub Issues to report bugs or request help.

1. Check existing issues for similar reports.
2. Click **New Issue** and choose the *Bug Report* template.
3. Provide:
   - A descriptive title.
   - Steps to reproduce.
   - Expected vs. actual behavior.
   - Environment details (OS, Node version).
4. Label issues appropriately (bug, question).

### Proposing Features

Create a feature request using the *Feature Request* template:

1. Click **New Issue** > *Feature Request*.
2. Describe:
   - The problem you want to solve.
   - Proposed API or UX changes.
   - Any relevant examples or mockups.
3. Use the `enhancement` label.

Discussion happens right in the issue thread.

### Opening Pull Requests

1. Fork the repository and clone your fork:
   ```bash
   git clone git@github.com:your-username/note-gen.git
   cd note-gen
   ```
2. Add the upstream remote:
   ```bash
   git remote add upstream git@github.com:codexu/note-gen.git
   ```
3. Sync and create a feature branch:
   ```bash
   git fetch upstream
   git checkout develop
   git pull upstream develop
   git checkout -b feature/your-feature-name
   ```
4. Make your changes; run tests and linters:
   ```bash
   npm install
   npm test
   npm run lint
   ```
5. Commit with a clear, conventional message:
   ```
   feat(notes): add support for markdown export
   ```
6. Push and open a PR against the `develop` branch:
   ```bash
   git push -u origin feature/your-feature-name
   ```
7. Fill out the PR template:
   - Summary of changes.
   - Related issues (e.g., "Closes #123").
   - Screenshots or logs if relevant.
8. Request reviews from teammates; address feedback promptly.

### Branch Strategy

We use a simplified Git Flow:

- **main**: always production-ready.
- **develop**: integration branch for features.
- **feature/**: new features and enhancements.
- **fix/**: bug fixes.
- **hotfix/**: urgent production fixes off `main`.
Release process:

1. Merge completed features into `develop`.
2. Create a release branch (`release/vX.Y.Z`) and fix minor bugs.
3. Merge the release into `main` and tag it.
4. Merge `main` back into `develop`.

Example branch names:
`feature/user-auth`, `fix/login-crash`, `hotfix/critical-bug`

### Coding Style

NoteGen uses ESLint and Prettier to enforce consistent code.

- Indentation: 2 spaces
- Quotes: single (`'`)
- Semicolons: required
- Line length: 100 characters

Run checks locally:

```bash
npm run lint     # ESLint
npm run format   # Prettier
npm test         # Jest tests
```

ESLint and Prettier configs live in the repo root (`.eslintrc.json`, `.prettierrc`). Ensure all checks pass before submitting a PR.

Thank you for contributing to NoteGen!

## License

This project is released under the MIT License. It grants broad permissions and disclaims all warranties. You must include the full license text in any distributions or derivative works.

### Summary
- Grants free permission to use, copy, modify, merge, publish, distribute, sublicense and/or sell copies.
- Requires inclusion of the original copyright and license notice.
- Disclaims warranties and limits liability.

### Obligations
- Bundle the MIT License text in your source or documentation.
- Retain the following in every copy or substantial portion:
  - Copyright notice
  - Permission notice

### Including the License

#### In package.json
Specify the license identifier so tooling recognizes it:
```json
{
  "name": "note-gen",
  "version": "1.0.0",
  "license": "MIT",
  "files": [
    "dist/",
    "LICENSE"
  ]
}
```

#### In source files
Add a brief header to each file:
```javascript
/**
 * note-gen v1.0.0
 * Copyright (c) 2025 CodexU
 * Released under the MIT License
 */
```

#### Redistributing
Ensure your published bundle or Docker image includes the LICENSE file at its root:
- npm: the `files` field in package.json
- Docker: `COPY LICENSE /app/`
- TAR/GZIP: preserve the LICENSE alongside your release artifacts.