Project Overview
NoteGen is a cross-platform Markdown note-taking app that combines AI-powered writing assistance, Git-based version control, and real-time synchronization. Capture, organize and revisit your ideas—from text and screenshots to audio recordings—across all your devices.
Purpose
Enable seamless, intelligent note creation and management for developers, researchers, and knowledge workers.
Key Features
- AI-assisted composition, summarization and translation
- Full Markdown editor with live preview and custom themes
- Git integration for versioned note history and diffs
- Multimedia support: images, screen captures, audio recordings, file attachments
- Tagging, full-text search and customizable folder structures
- Sync via GitHub, WebDAV or self-hosted servers
Supported Platforms
- Desktop: macOS, Windows, Linux
- Web: Chrome, Firefox, Safari, Edge
- Mobile: iOS, Android
Value Proposition
NoteGen delivers AI-enhanced note workflows with built-in version control and cross-device sync. Maintain a single source of truth for your ideas—anytime, anywhere.
Getting Started
Get NoteGen up and running in minutes. Choose a pre-built installer or build from source.
1. Install Pre-built Release
Visit the Releases page and download the package for your platform:
https://github.com/codexu/note-gen/releases
Supported targets:
- macOS (dmg: Intel / Apple Silicon)
- Windows (installer: x64 / arm64)
- Linux (AppImage, .deb, .rpm)
After download:
- macOS: open the .dmg and drag NoteGen to Applications.
- Windows: run the .exe installer.
- Linux: grant execute permission on the AppImage, or install the .deb/.rpm via your package manager.
2. Run from Source
Prerequisites
- Node >=18 (npm or Yarn)
- Rust >=1.60
- Tauri CLI
# via Cargo
cargo install tauri-cli
# or via npm
npm install -g @tauri-apps/cli
Clone, Install, and Develop
git clone https://github.com/codexu/note-gen.git
cd note-gen
npm install
npm run dev # starts Next.js frontend on http://localhost:3456
npm run tauri dev # opens NoteGen desktop window
Tauri watches src-tauri/src; Rust changes recompile automatically.
Production Build
npm run build # builds and exports Next.js static files
npm run tauri build # bundles native installers for macOS/Windows/Linux
Find installers in src-tauri/target/release/bundle
3. Folder Structure at a Glance
.
├── package.json # project scripts & dependencies
├── next.config.ts # static export & i18n config
├── tsconfig.json # TypeScript compiler options
├── public/ # static images & fonts
├── src/ # Next.js pages & React components
└── src-tauri/ # Tauri & Rust application
├── Cargo.toml # Rust crate & plugin dependencies
├── tauri.conf.json # window, bundle & updater settings
└── src/ # Rust source (main.rs, plugins)
└── .github/
└── workflows/
└── release.yml # CI: multi-platform build & release
With this setup, you can explore the frontend in src/, extend Rust APIs in src-tauri/src/, and customize bundling via tauri.conf.json.
Architecture Overview
This section outlines the core layers of the application—desktop (Rust/Tauri), frontend (Next.js), and data (SQLite)—and shows how they interact at runtime.
1. Rust Desktop Layer (src-tauri/src/main.rs)
Sets up the Tauri runtime, plugins, system tray, single-instance guard and command handlers.
Initialization Snippet
fn main() {
tauri::Builder::default()
// Ensure only one instance runs
.plugin(tauri_plugin_single_instance::init(|app, _, _| {
if let Some(win) = app.get_window("main") {
win.show().ok();
win.set_focus().ok();
}
}))
// System tray with click handlers
.plugin(
tauri_plugin_tray::Builder::new()
.with_menu(tray_menu())
.with_icon(tray_icon())
.on_event(tray_event_handler)
.build()
)
// Auto-update via GitHub or custom endpoint
.plugin(tauri_plugin_updater::Builder::new().build())
// WebDAV backup/sync
.plugin(tauri_plugin_webdav::Builder::new().build())
// Fuzzy search & keyword ranking
.plugin(tauri_plugin_fuzzy::init())
.plugin(tauri_plugin_rank::init())
// Register Tauri commands for chats, notes, tags, vector, etc.
.invoke_handler(tauri::generate_handler![
init_databases,
chat_list,
chat_create,
mark_list,
note_save,
tag_list,
vector_search,
webdav_sync,
fuzzy_search,
rank_keywords,
])
.run(tauri::generate_context!())
.expect("error while running tauri application");
}
Key points
- Plugin order: single_instance first, then tray, updater, external integrations, custom commands.
- All business logic lives in commands (Rust functions marked with #[tauri::command]).
- The frontend invokes commands via @tauri-apps/api/tauri.
2. Frontend Layout (src/app/layout.tsx)
Defines the root HTML structure, global styles, mobile meta tags, i18n context and toast notifications.
RootLayout Component
import React from "react";
import "./globals.css";
import { I18nProvider } from "./i18n";
import { Toaster } from "react-hot-toast";
export default function RootLayout({ children }: { children: React.ReactNode }) {
return (
<html lang="en" suppressHydrationWarning>
<head>
<meta name="viewport" content="width=device-width, initial-scale=1" />
<meta name="theme-color" content="#fff" />
<script src="/fix-markdown-it.js"></script>
</head>
<body>
<I18nProvider>
{children}
<Toaster position="bottom-right" />
</I18nProvider>
</body>
</html>
);
}
Highlights
- suppressHydrationWarning for SSR/CSR consistency.
- Injects a small script to patch markdown-it rendering.
- Wraps pages in I18nProvider for locale switching.
- Renders a global <Toaster> for notifications.
3. Data Layer (src/db/index.ts)
Manages a singleton SQLite connection, initializes separate databases for chats, marks, notes, tags and vector indexing.
Database Initialization
import { open } from "sqlite";
import sqlite3 from "sqlite3";
import { initChatsDb } from "./chats";
import { initMarksDb } from "./marks";
import { initNotesDb } from "./notes";
import { initTagsDb } from "./tags";
import { initVectorDb } from "./vector";
type DbName = "chats" | "marks" | "notes" | "tags" | "vector";
const dbMap: Partial<Record<DbName, sqlite3.Database>> = {};
export async function initDatabases(dataDir: string) {
dbMap.chats = await open({ filename: `${dataDir}/chats.db`, driver: sqlite3.Database });
await initChatsDb(dbMap.chats!);
dbMap.marks = await open({ filename: `${dataDir}/marks.db`, driver: sqlite3.Database });
await initMarksDb(dbMap.marks!);
dbMap.notes = await open({ filename: `${dataDir}/notes.db`, driver: sqlite3.Database });
await initNotesDb(dbMap.notes!);
dbMap.tags = await open({ filename: `${dataDir}/tags.db`, driver: sqlite3.Database });
await initTagsDb(dbMap.tags!);
dbMap.vector = await open({ filename: `${dataDir}/vector.db`, driver: sqlite3.Database });
await initVectorDb(dbMap.vector!);
}
export function getDb(name: DbName) {
const db = dbMap[name];
if (!db) throw new Error(`Database ${name} not initialized`);
return db;
}
Usage in commands:
#[tauri::command]
async fn chat_list() -> Vec<Chat> {
// calls into JS-bridge which then uses getDb("chats") under the hood
}
Call Flow
1. Startup: main.rs launches Tauri and invokes init_databases(data_dir); each sub-DB creates its tables and indices.
2. UI Rendering: Next.js serves pages; layout.tsx sets up global context.
3. User Action: the UI calls invoke("chat_list") or other commands via @tauri-apps/api/tauri.
4. Command Execution: the Rust handler uses getDb(…) to query SQLite, applies business logic, and returns JSON.
5. UI Update: the frontend receives results, updates components, and shows a toast if needed.
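The round trip above can be sketched with a hypothetical stand-in for Tauri's command bridge (in the real app, `invoke` comes from `@tauri-apps/api/tauri` and the handler runs in Rust; the `Chat` shape and data here are illustrative assumptions):

```typescript
// Hypothetical stand-in for the Tauri command bridge, for illustration only.
type Chat = { id: number; title: string };

// Pretend backend: maps command names to handlers, as the Rust side would.
const handlers: Record<string, (args?: unknown) => unknown> = {
  chat_list: () => [{ id: 1, title: "First chat" }] as Chat[],
};

// Mimics invoke("chat_list") resolving with the handler's JSON result.
async function invoke<T>(cmd: string, args?: unknown): Promise<T> {
  const handler = handlers[cmd];
  if (!handler) throw new Error(`Unknown command: ${cmd}`);
  return handler(args) as T;
}

async function loadChats(): Promise<Chat[]> {
  // In the real app this crosses the Tauri bridge into Rust.
  return invoke<Chat[]>("chat_list");
}
```

The point of the bridge is that the frontend only ever sees command names and JSON; all SQLite access stays on the Rust side.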
This layered architecture ensures clear separation between UI, backend logic and data persistence, while enabling tight integration through Tauri’s command bridge.
Core Concepts
Core abstractions that power note-gen:
- Workspace path utilities: resolve file system paths in default or custom workspaces
- RAG indexing: chunk markdown files, compute embeddings, and upsert into a vector store
- Sync API: upload notes and assets to GitHub, Gitee or GitLab with built-in base64 conversion and error handling
- Shortcut management: define, register and emit global keyboard shortcuts via Electron IPC
Workspace Path Utilities
Provide a unified way to resolve file paths inside the user’s workspace—whether it’s the default AppData/article
folder or a custom directory—so FS operations (read, write, stat, mkdir, etc.) work seamlessly.
getWorkspacePath
Retrieve the root workspace path and detect if it’s user-customized.
Signature:
async function getWorkspacePath(): Promise<{ path: string; isCustom: boolean }>;
- If store.json contains a workspacePath, returns that absolute path with isCustom: true.
- Otherwise returns { path: 'article', isCustom: false } relative to BaseDirectory.AppData.
Example:
const { path: root, isCustom } = await getWorkspacePath();
// root: "/Users/alice/myNotes" or "article"
getFilePathOptions
Build arguments for Tauri FS calls given a path relative to the workspace root.
Signature:
async function getFilePathOptions(relativePath: string)
: Promise<{ path: string; baseDir?: BaseDirectory }>;
- Custom workspace: { path: absolutePath }
- Default workspace: { path: `article/${relativePath}`, baseDir: BaseDirectory.AppData }
Use for reading or writing files:
import { readTextFile, writeTextFile } from '@tauri-apps/plugin-fs';
async function loadDoc(rel: string) {
const opts = await getFilePathOptions(rel);
return opts.baseDir
? await readTextFile(opts.path, { baseDir: opts.baseDir })
: await readTextFile(opts.path);
}
async function saveDoc(rel: string, content: string) {
const opts = await getFilePathOptions(rel);
const args = opts.baseDir ? { baseDir: opts.baseDir } : {};
await writeTextFile(opts.path, content, args);
}
getGenericPathOptions
Like getFilePathOptions, but supports arbitrary AppData subfolders (e.g. images).
Signature:
async function getGenericPathOptions(
path: string,
prefix?: string
): Promise<{ path: string; baseDir?: BaseDirectory }>;
- Custom: always returns an absolute path under the custom root, prepending prefix/ if provided.
- Default: { path: `${prefix}/${path}`, baseDir: AppData }, unless path already starts with prefix.
Example—saving an image:
import { writeBinaryFile } from '@tauri-apps/plugin-fs';
async function saveImage(relPath: string, data: Uint8Array) {
const opts = await getGenericPathOptions(relPath, 'image');
const args = opts.baseDir ? { baseDir: opts.baseDir } : {};
await writeBinaryFile(opts.path, data, args);
}
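The branching rules above can be sketched as a pure function (a simplified illustration: the real helper reads the custom root from store.json and returns Tauri BaseDirectory values, which are modeled as parameters and a plain string here):

```typescript
// Simplified model of getGenericPathOptions: the custom root is passed in
// explicitly instead of being read from store.json.
interface PathOptions {
  path: string;
  baseDir?: "AppData";
}

function resolveGenericPath(
  path: string,
  prefix: string | undefined,
  customRoot: string | null // null = default workspace
): PathOptions {
  if (customRoot) {
    // Custom workspace: absolute path under the custom root.
    const rel = prefix ? `${prefix}/${path}` : path;
    return { path: `${customRoot}/${rel}` };
  }
  // Default workspace: relative to AppData, prepending the prefix
  // unless the path already starts with it.
  const rel =
    prefix && !path.startsWith(`${prefix}/`) ? `${prefix}/${path}` : path;
  return { path: rel, baseDir: "AppData" };
}
```

The "already starts with prefix" guard prevents paths like image/image/foo.png when callers pass a pre-prefixed path.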
toWorkspaceRelativePath
Convert any absolute or default-prefixed path back to a path relative to the workspace root.
Signature:
async function toWorkspaceRelativePath(path: string): Promise<string>;
- Default mode: strips leading article/ segments.
- Custom mode: removes the absolute custom root prefix (and any leading slash).
- Leaves already-relative paths untouched.
Example (normalizing results from Tauri's readDir):
import { readDir, BaseDirectory } from '@tauri-apps/plugin-fs';
import { getWorkspacePath, toWorkspaceRelativePath } from '@/lib/workspace';
async function listMarkdownFiles() {
const { path: root, isCustom } = await getWorkspacePath();
const raw = isCustom
? await readDir(root)
: await readDir('article', { baseDir: BaseDirectory.AppData });
const relPaths = await Promise.all(
raw
.filter(e => e.isFile && e.name.endsWith('.md'))
.map(async e => {
const full = isCustom
? `${root}/${e.name}`
: `article/${e.name}`;
return toWorkspaceRelativePath(full);
})
);
return relPaths; // e.g. ['notes/todo.md', 'readme.md']
}
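The normalization rules can be sketched as a pure function (a simplified model: the real toWorkspaceRelativePath reads the workspace mode from the store, whereas this hypothetical toRelative takes it as a parameter):

```typescript
// Simplified model of toWorkspaceRelativePath: the workspace mode is
// passed in rather than read from store.json.
function toRelative(path: string, customRoot: string | null): string {
  if (customRoot) {
    // Custom mode: strip the absolute root prefix and any leading slash.
    if (path.startsWith(customRoot)) {
      return path.slice(customRoot.length).replace(/^\/+/, "");
    }
    return path; // already relative: leave untouched
  }
  // Default mode: strip the leading "article/" segment.
  return path.replace(/^article\//, "");
}
```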
RAG Indexing: Splitting and Indexing Markdown Files
Explain how markdown files are chunked into manageable pieces and indexed in the vector database by computing embeddings for each chunk.
chunkText
Split long text into overlapping chunks, using paragraph then sentence boundaries.
Signature:
function chunkText(
text: string,
chunkSize: number = 1000,
chunkOverlap: number = 200
): string[];
- If text.length ≤ chunkSize, returns [text].
- Splits on \n\n (paragraphs), accumulating until adding the next paragraph would exceed chunkSize.
- When full, pushes the chunk and carries over up to chunkOverlap characters from its end.
- If a single paragraph exceeds chunkSize, splits it further on sentence boundaries (., ?, !).
Example:
const longMd = await readTextFile('notes.md');
const chunks = chunkText(longMd, 800, 150);
console.log(chunks.length); // e.g. 5
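The splitting strategy described above can be sketched as follows (a simplified illustration, not the actual implementation: it omits the sentence-boundary fallback for oversized paragraphs):

```typescript
// Simplified sketch of the paragraph-based chunking with overlap carry-over.
function chunkTextSketch(
  text: string,
  chunkSize = 1000,
  chunkOverlap = 200
): string[] {
  if (text.length <= chunkSize) return [text];
  const paragraphs = text.split("\n\n");
  const chunks: string[] = [];
  let current = "";
  for (const para of paragraphs) {
    // Would adding this paragraph (plus the "\n\n" joiner) overflow?
    if (current && current.length + para.length + 2 > chunkSize) {
      chunks.push(current);
      // Carry over up to chunkOverlap trailing characters for continuity.
      current = current.slice(-chunkOverlap);
    }
    current = current ? `${current}\n\n${para}` : para;
  }
  if (current) chunks.push(current);
  return chunks;
}
```

The overlap means adjacent chunks share context, which improves retrieval quality at the cost of slightly more storage.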
processMarkdownFile
Read a markdown file, split into chunks, compute embeddings, and upsert each chunk into the vector store.
Signature:
async function processMarkdownFile(
filePath: string,
fileContent?: string
): Promise<boolean>;
Workflow:
1. Determine the workspace path and load the file content.
2. Read ragChunkSize and ragChunkOverlap from store.json.
3. const chunks = chunkText(content, chunkSize, chunkOverlap)
4. Derive filename from filePath.
5. await deleteVectorDocumentsByFilename(filename)
6. For each chunk:
   a. const embedding = await fetchEmbedding(chunk)
   b. await upsertVectorDocument({ filename, chunk_id: i, content: chunk, embedding: JSON.stringify(embedding), updated_at: Date.now() })
7. Return true on success; otherwise log the error and return false.
Example:
import { processMarkdownFile } from '@/lib/rag';
const success = await processMarkdownFile('article/guide.md');
if (success) console.log('Indexed guide.md successfully');
else console.error('Indexing failed for guide.md');
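The indexing loop can be sketched end to end with stubbed embedding and storage functions (fakeEmbedding and fakeUpsert are hypothetical stand-ins for fetchEmbedding and upsertVectorDocument; the real functions call the embedding API and SQLite):

```typescript
// Stand-ins for the real embedding API and vector store, for illustration.
const stored: { filename: string; chunk_id: number; content: string }[] = [];

async function fakeEmbedding(text: string): Promise<number[]> {
  return [text.length]; // a real model returns a high-dimensional vector
}

async function fakeUpsert(doc: {
  filename: string;
  chunk_id: number;
  content: string;
}): Promise<void> {
  stored.push(doc);
}

// Mirrors the workflow: clear old rows, then embed and upsert each chunk.
async function indexContent(
  filename: string,
  chunks: string[]
): Promise<boolean> {
  try {
    stored.splice(0, stored.length); // stands in for deleteVectorDocumentsByFilename
    for (let i = 0; i < chunks.length; i++) {
      await fakeEmbedding(chunks[i]);
      await fakeUpsert({ filename, chunk_id: i, content: chunks[i] });
    }
    return true;
  } catch {
    return false;
  }
}
```

Deleting before re-inserting keeps the store free of stale chunks when a file shrinks between indexing runs.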
Sync: Uploading Files to Remote Repositories
Provide a simple API to create or update a file in GitHub, Gitee, or GitLab. Automatically handles base64 conversion, branch, proxy, and error reporting.
Common Helpers
import { fileToBase64 } from '@/lib/github'; // same in gitee.ts & gitlab.ts
import { decodeBase64ToString } from '@/lib/github';
// Convert a browser File into base64 (no data URL prefix)
const base64 = await fileToBase64(myFile);
// Decode a base64‐encoded UTF-8 string
const text = decodeBase64ToString(base64);
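A minimal sketch of what these helpers do, using Node's Buffer in place of the browser FileReader/atob path the real helpers use (the function names here are illustrative, not the library's):

```typescript
// Illustrative only: the in-app helpers operate on browser File objects.
function encodeStringToBase64(text: string): string {
  return Buffer.from(text, "utf-8").toString("base64");
}

function decodeBase64ToStringSketch(base64: string): string {
  return Buffer.from(base64, "base64").toString("utf-8");
}
```

Note that the upload APIs expect raw base64 with no `data:...;base64,` prefix, which is why fileToBase64 strips it.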
GitHub Example
import { uploadFile as uploadToGitHub } from '@/lib/github';
import { RepoNames } from '@/lib/github.types';
async function syncImage(file: File) {
const ext = file.name.split('.').pop()!;
const content = await fileToBase64(file);
const result = await uploadToGitHub({
ext,
file: content,
repo: RepoNames.sync,
path: 'images', // optional folder
message: 'Add new image 📷'
});
if (result) {
console.log('GitHub URL:', result.data.content.html_url);
}
}
Gitee Example
import { uploadFile as uploadToGitee } from '@/lib/gitee';
import { RepoNames } from '@/lib/github.types';
async function syncDocument(file: File) {
const ext = file.name.split('.').pop()!;
const content = await fileToBase64(file);
const response = await uploadToGitee({
ext,
file: content,
repo: RepoNames.sync,
filename: 'report.pdf', // override UUID name
path: 'docs',
message: 'Publish monthly report'
});
if (response) {
console.log('Gitee URL:', response.data.content.download_url);
}
}
GitLab Example
import { uploadFile as uploadToGitLab } from '@/lib/gitlab';
import { RepoNames } from '@/lib/github.types';
async function syncScript(file: File) {
const ext = file.name.split('.').pop()!;
const content = await fileToBase64(file);
const res = await uploadToGitLab({
ext,
file: content,
repo: RepoNames.sync,
path: 'scripts',
message: 'Update script v2'
});
if (res) {
console.log('GitLab file committed:', res.data.file_path);
}
}
Key Parameters
- ext: file extension without the dot
- file: base64-encoded content
- filename?: overrides the auto-generated <UUID>.<ext>
- sha?: existing file SHA/commit for updates
- message?: commit message
- repo: enum name of your sync repository
- path?: folder path in the repo; spaces are converted to underscores
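Putting the naming parameters together, the default-filename and sanitization rules can be sketched as below (the UUID source is injected so the behavior is deterministic; the real uploaders generate one internally, and buildUploadPath is a hypothetical helper, not the library API):

```typescript
// Builds the repo path for an upload: defaults to <UUID>.<ext>, and
// replaces spaces with underscores in both filename and folder path.
function buildUploadPath(
  ext: string,
  opts: { filename?: string; path?: string },
  uuid: () => string
): string {
  const name = (opts.filename ?? `${uuid()}.${ext}`).replace(/ /g, "_");
  const folder = opts.path?.replace(/ /g, "_");
  return folder ? `${folder}/${name}` : name;
}
```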
Error Handling
- Displays a toast on network/API errors
- Returns null (GitHub/Gitee) or false (GitLab) for recoverable failures
- Throws unexpected errors for catch-all logic
Keyboard Shortcut Configuration
Define, register and emit application-wide keyboard shortcuts using the ShortcutSettings, ShortcutDefault and EmitterShortcutEvents enums.
Config Enums
// src/config/shortcut.ts
export enum ShortcutSettings {
screenshot = "shotcut-screenshot",
text = "shotcut-text",
pin = "window-pin",
link = "shotcut-link",
}
export enum ShortcutDefault {
screenshot = "Control+Shift+S",
text = "Control+Shift+T",
pin = "Control+Shift+P",
link = "Control+Shift+L",
}
// src/config/emitters.ts
export enum EmitterShortcutEvents {
screenshot = "screenshot-shortcut-register",
text = "text-shortcut-register",
pin = "window-pin-register",
link = "link-shortcut-register",
}
Main Process: Registering Shortcuts
- Read the stored accelerator, falling back to the default
- Call Electron's globalShortcut.register
- Emit an IPC event on trigger
import { app, BrowserWindow, globalShortcut } from "electron";
import store from "./store";
import { ShortcutSettings, ShortcutDefault } from "./config/shortcut";
import { EmitterShortcutEvents } from "./config/emitters";
let mainWindow: BrowserWindow;
function registerAllShortcuts() {
(Object.keys(ShortcutDefault) as Array<keyof typeof ShortcutDefault>)
.forEach(action => {
const settingKey = ShortcutSettings[action];
const accelerator = store.get(settingKey, ShortcutDefault[action]);
globalShortcut.register(accelerator, () => {
mainWindow.webContents.send(EmitterShortcutEvents[action]);
});
});
}
app.on("ready", () => {
mainWindow = new BrowserWindow({ /* ... */ });
registerAllShortcuts();
});
Renderer Process: Handling Shortcut Events
import { ipcRenderer } from "electron";
import { EmitterShortcutEvents } from "../config/emitters";
ipcRenderer.on(EmitterShortcutEvents.screenshot, () => {
startScreenshotMode();
});
ipcRenderer.on(EmitterShortcutEvents.text, () => {
openTextCapture();
});
Updating Shortcuts at Runtime
When a user changes a shortcut:
function updateShortcut(action: keyof typeof ShortcutDefault, newAccel: string) {
const old = store.get(ShortcutSettings[action], ShortcutDefault[action]);
globalShortcut.unregister(old);
store.set(ShortcutSettings[action], newAccel);
globalShortcut.register(newAccel, () => {
mainWindow.webContents.send(EmitterShortcutEvents[action]);
});
}
Wrap registration/unregistration in try/catch to handle invalid accelerators or conflicts.
Configuration Guide
This guide maps each configurable screen in the app to its underlying storage key (in store.json) or environment variable, and shows how to read and update settings via src/stores/setting.ts.
Initialization
Load persisted settings on app start.
Storage file: store.json, managed by @tauri-apps/plugin-store
import { useEffect } from 'react'
import useSettingStore from '@/stores/setting'
export function App() {
useEffect(() => {
// Loads all keys from store.json into Zustand and applies env-var fallbacks
useSettingStore.getState().initSettingData()
}, [])
return <YourAppUI />
}
1. General Settings Screen
Manage theme and UI language.
| UI Field | Store Key | Env Var | Setter |
|---|---|---|---|
| Theme | theme | — | setTheme(value) |
| Language | language | — | setLanguage(code) |
import useSettingStore from '@/stores/setting'
function GeneralSettings() {
const theme = useSettingStore(s => s.theme)
const setTheme = useSettingStore(s => s.setTheme)
const lang = useSettingStore(s => s.language)
const setLang = useSettingStore(s => s.setLanguage)
return (
<>
<label>Theme
<select value={theme} onChange={e => setTheme(e.target.value)}>
<option value="light">Light</option>
<option value="dark">Dark</option>
</select>
</label>
<label>Language
<select value={lang} onChange={e => setLang(e.target.value)}>
<option value="en">English</option>
<option value="zh">中文</option>
</select>
</label>
</>
)
}
2. AI Models Screen
Configure multiple AI services and select primary models.
| UI Section | Store Key | Description |
|---|---|---|
| Model list | aiModelList | Array of { key, model, modelType, baseURL, apiKey } |
| Primary chat model | chatPrimaryModel | Key of the selected chat model |
| Primary embedding model | embeddingPrimaryModel | Key of the selected embedding model |
import useSettingStore, { AiConfig } from '@/stores/setting'
async function addNewModel() {
const newModel: AiConfig = {
key: 'my-azure-openai',
model: 'gpt-4',
modelType: 'chat',
baseURL: 'https://my-azure-endpoint.openai.azure.com',
apiKey: 'AZURE_KEY'
}
await useSettingStore.getState().addAiModel(newModel)
// Optionally set as primary:
useSettingStore.getState().setChatPrimaryModel(newModel.key)
}
3. Backup Provider Selection
Choose one of the supported providers.
| UI Field | Store Key | Setter |
|---|---|---|
| Backup provider | backupProvider | setBackupProvider(provider) |

Supported values: 'webdav', 'github', 'gitee', 'gitlab'
const provider = useSettingStore(s => s.backupProvider)
const setProvider = useSettingStore(s => s.setBackupProvider)
<select value={provider} onChange={e => setProvider(e.target.value)}>
<option value="webdav">WebDAV</option>
<option value="github">GitHub</option>
<option value="gitee">Gitee</option>
<option value="gitlab">GitLab</option>
</select>
WebDAV settings use a separate useWebDAVStore (see the WebDAV section).
4. GitHub Integration Screen
| UI Field | Store Key | Env Var | Setter |
|---|---|---|---|
| Access Token | githubAccessToken | GITHUB_ACCESS_TOKEN | setGithubAccessToken(token) |
| Repository | githubRepo | GITHUB_REPO | setGithubRepo(owner/repo) |
| Branch | githubBranch | GITHUB_BRANCH | setGithubBranch(name) |
| Path prefix | githubPath | GITHUB_PATH | setGithubPath(path) |
import useSettingStore from '@/stores/setting'
function GitHubSettings() {
const token = useSettingStore(s => s.githubAccessToken)
const setToken = useSettingStore(s => s.setGithubAccessToken)
return (
<>
<input
type="password"
value={token}
onChange={e => setToken(e.target.value)}
placeholder="GitHub Personal Access Token"
/>
{/* Repo, branch, path inputs analogous */}
</>
)
}
5. Gitee Integration Screen
| UI Field | Store Key | Env Var | Setter |
|---|---|---|---|
| Access Token | giteeAccessToken | GITEE_ACCESS_TOKEN | setGiteeAccessToken(token) |
| Repository | giteeRepo | GITEE_REPO | setGiteeRepo(owner/repo) |
| Branch | giteeBranch | GITEE_BRANCH | setGiteeBranch(name) |
| Path prefix | giteePath | GITEE_PATH | setGiteePath(path) |
Implementation mirrors GitHub above.
6. GitLab Integration Screen
| UI Field | Store Key | Env Var | Setter |
|---|---|---|---|
| Access Token | gitlabAccessToken | GITLAB_ACCESS_TOKEN | setGitlabAccessToken(token) |
| Repository | gitlabRepo | GITLAB_REPO | setGitlabRepo(owner/repo) |
| Branch | gitlabBranch | GITLAB_BRANCH | setGitlabBranch(name) |
| Path prefix | gitlabPath | GITLAB_PATH | setGitlabPath(path) |
Implementation mirrors GitHub above.
Practical Tips
- Always call initSettingData() once on mount.
- Reading a value: const val = useSettingStore(s => s.key).
- Updating: await useSettingStore.getState().setKey(value).
- All setters persist to disk and notify subscribers.
- Provide env-var defaults for CI or private deployments; stored values override env.
Backup & Sync
Manage cloud‐ and inter‐device data with built-in GitHub, Gitee, and WebDAV integrations in the codexu/note-gen app.
GitHub Sync API Wrapper
Provide a set of wrapper functions to perform create, read, update, and delete operations on GitHub repositories and files using the GitHub REST API.
Key Features
- Automatically reads accessToken, githubUsername and an optional proxy from @tauri-apps/plugin-store
- Uses @tauri-apps/plugin-http's fetch to standardize headers, status checks, and error handling
- Core methods:
  - Authentication & user info (getUserInfo)
  - Repository check/create (checkSyncRepoState, createSyncRepo)
  - File upload (uploadFile)
  - List files (getFiles)
  - Delete file (deleteFile)
  - File commit history (getFileCommits)
  - Latest release retrieval (getRelease)
1. Authenticate and Get User Info
import { getUserInfo } from '@/lib/github';
async function initGitHub(token?: string) {
const resp = await getUserInfo(token);
if (!resp) throw new Error('GitHub authentication failed');
console.log('Logged in as', resp.data.login);
}
2. Check or Create Sync Repository
import { checkSyncRepoState, createSyncRepo } from '@/lib/github';
async function ensureRepo(name: string) {
const repo = await checkSyncRepoState(name);
return repo || await createSyncRepo(name, true); // true = private
}
3. Upload a File
Converts a File object to Base64, then uploads:
import { fileToBase64, uploadFile } from '@/lib/github';
import { RepoNames } from '@/lib/github.types';
async function uploadImage(file: File) {
const base64 = await fileToBase64(file); // strips data URI prefix
const result = await uploadFile({
ext: file.name.split('.').pop() || 'png',
file: base64,
filename: 'my-image', // optional, defaults to UUID
message: 'Add new image', // commit message
repo: RepoNames.sync, // must be an enum value
path: 'uploads' // optional subdirectory
});
console.log('Upload URL:', result?.data.content.html_url);
}
4. List Files in a Directory
import { getFiles } from '@/lib/github';
import { RepoNames } from '@/lib/github.types';
async function listUploads() {
const files = await getFiles({ repo: RepoNames.sync, path: 'uploads' });
if (!files) return console.log('No files found');
files.forEach(f => console.log(f.name, f.download_url));
}
5. Delete a File
import { deleteFile } from '@/lib/github';
import { RepoNames } from '@/lib/github.types';
async function removeFile(path: string, sha: string) {
const success = await deleteFile({ repo: RepoNames.sync, path, sha });
console.log(success ? 'Deletion succeeded' : 'Deletion failed');
}
6. Get File Commit History
import { getFileCommits } from '@/lib/github';
import { RepoNames } from '@/lib/github.types';
async function showHistory(path: string) {
const commits = await getFileCommits({ repo: RepoNames.sync, path });
commits?.forEach(c => console.log(c.sha, c.commit.message));
}
7. Fetch Latest Release
import { getRelease } from '@/lib/github';
async function checkLatestRelease() {
const release = await getRelease();
if (release) console.log('Latest version:', release.tag_name);
}
Practical Tips
- Call getUserInfo on app startup to initialize githubUsername.
- Set proxy in settings to route all API calls through it.
- Always call checkSyncRepoState before createSyncRepo.
- Spaces in filenames/paths are converted to underscores to avoid conflicts.
Gitee File Upload (uploadFile)
Create or update a file in a Gitee repository. Handles token loading, proxy settings, method selection (POST vs PUT), and error reporting via toast.
Function Signature
export async function uploadFile(options: {
ext: string; // without dot
file: string; // Base64-encoded content
filename?: string; // defaults to UUID.ext
sha?: string; // existing file SHA for update
message?: string; // commit message
repo: RepoNames.sync; // target enum value
path?: string; // optional subdirectory
}): Promise<GiteeResponse<any> | null | false>
How It Works
1. Reads giteeAccessToken, giteeUsername, and an optional proxy from @tauri-apps/plugin-store.
2. Generates a UUID filename when none is provided.
3. Replaces spaces with underscores in filenames/paths.
4. Chooses POST (create) or PUT (update, when sha is present).
5. Sends { access_token, content, message, branch: 'master', sha }.
6. Returns the parsed data on success, null on client errors (400), or false on failure (with a toast).
Prerequisites
- In store.json, set:
  - giteeAccessToken: your Gitee token
  - giteeUsername: your Gitee login
  - proxy (optional): HTTP(S) proxy URL
Example
import { fileToBase64 } from '@/lib/gitee';
import { uploadFile } from '@/lib/gitee';
import { RepoNames } from '@/lib/github.types';
async function handleFileInput(e: Event) {
const input = e.target as HTMLInputElement;
if (!input.files?.length) return;
const file = input.files[0];
const base64 = await fileToBase64(file);
const ext = file.name.split('.').pop() || 'bin';
const result = await uploadFile({
ext,
file: base64,
filename: file.name,
message: `Add ${file.name}`,
repo: RepoNames.sync,
path: 'images'
});
if (result?.data.content.download_url) {
console.log('Uploaded to:', result.data.content.download_url);
} else {
console.warn('Upload failed or returned null');
}
}
Practical Tips
- To update a file: fetch its SHA via getFiles, then pass sha to uploadFile.
- The default branch is master; change it in the source if your repo uses main.
- Network or non-404 errors trigger a toast and return false.
- Extend the filename sanitization regex as needed.
WebDAV Backup Command (webdav_backup)
A Tauri command that uploads all Markdown files from the local workspace to a WebDAV directory, auto-creating missing remote folders. Returns a "success_count/total_count" string or an error.
Signature
#[tauri::command]
pub async fn webdav_backup(
url: String,
username: String,
password: String,
path: String,
app: AppHandle,
) -> Result<String, String>
Workflow
1. Client setup: create_client(url, username, password) builds a reqwest_dav::Client with Basic auth.
2. Ensure the remote base directory: client.list(path, Depth::Infinity); if that fails, client.mkcol(path).
3. Discover local Markdown: get_workspace_info(app) reads workspacePath from store.json (default: "article"); get_markdown_files() returns (relative_path, content) for each .md file.
4. Upload loop, for each file:
   - Build remote_path = format!("{}/{}", path.trim_end_matches('/'), relative_path).
   - Ensure the parent directory exists (list → mkcol).
   - client.put(remote_path, content_bytes).
   - Track successes; abort on a fatal error.
5. Result: Ok(format!("{}/{}", success_count, total_files)) or Err(...).
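The path handling in the upload loop can be sketched in TypeScript (the actual command is Rust; these hypothetical helpers just illustrate the join and the parent-directory derivation used for mkcol):

```typescript
// Joins the base backup path with a file's workspace-relative path.
function remotePathFor(base: string, relativePath: string): string {
  return `${base.replace(/\/+$/, "")}/${relativePath}`;
}

// Lists every parent directory that must exist remotely, shallowest first
// (each is a candidate mkcol target if the WebDAV list call fails).
function parentDirsFor(remotePath: string): string[] {
  const parts = remotePath.split("/").slice(0, -1); // drop the filename
  return parts.map((_, i) => parts.slice(0, i + 1).join("/"));
}
```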
Rust Usage Example
let result = webdav_backup(
"https://example.com/webdav".into(),
"user".into(),
"pass".into(),
"backups/notes".into(),
app_handle.clone()
).await?;
println!("Uploaded: {}", result); // e.g. "12/12"
Invoke from Frontend
import { invoke } from '@tauri-apps/api/tauri';
const count = await invoke<string>('webdav_backup', {
url, username, password, path
});
// count === "uploaded/total"
Practical Tips
- Do not include a leading slash in path; the command strips it.
- Missing remote directories are auto-created.
- Large workspaces may warrant a UI spinner (backupState).
- Errors bubble up with context (e.g. a failed mkcol or upload error).
AI & Processing Engine
Centralize AI/ML features—chat completion, embeddings, reranking, keyword extraction, OCR, screenshot capture and local RAG search—so your app can call a single module for all AI-powered workflows.
Chat Completion (src/lib/ai.ts)
Provide conversational AI via OpenAI-compatible chat models with error handling and optional streaming.
Signature
async function chatCompletion(
messages: ChatMessage[],
options?: { modelKey?: string; stream?: boolean }
): Promise<string | AsyncIterable<string>>;
Usage
import { chatCompletion } from '@/lib/ai';
async function askBot() {
const messages = [
{ role: 'system', content: 'You are an assistant.' },
{ role: 'user', content: 'Summarize the benefits of TypeScript.' }
];
const reply = await chatCompletion(messages);
console.log('Bot:', reply);
}
// Streaming example (messages defined in scope; content is illustrative)
(async () => {
  const messages = [{ role: 'user', content: 'Explain generics briefly.' }];
  const stream = await chatCompletion(messages, { stream: true });
  for await (const chunk of stream as AsyncIterable<string>) {
    process.stdout.write(chunk);
  }
})();
Embeddings (src/lib/ai.ts)
Generate semantic vectors for text, used in similarity search or RAG pipelines.
Signature
async function fetchEmbedding(text: string): Promise<number[] | null>;
Usage
import { fetchEmbedding } from '@/lib/ai';
import { initVectorDb, upsertVectorDocument } from '@/db/vector';
async function indexFile(filename: string, content: string) {
await initVectorDb();
const embedding = await fetchEmbedding(content);
if (!embedding) throw new Error('Embedding failed');
await upsertVectorDocument({
filename,
chunk_id: 0,
content,
embedding: JSON.stringify(embedding),
updated_at: Date.now(),
});
}
Document Reranking (src/lib/ai.ts)
Reorder candidate passages by relevance to a query using a rerank model.
Signature
async function rerankDocuments(
query: string,
candidates: string[]
): Promise<{ score: number; text: string }[]>;
Usage
import { rerankDocuments } from '@/lib/ai';
async function refineResults() {
const query = 'What is vector similarity?';
const docs = ['Vectors measure...', 'Similarity is...', 'Math principles...'];
const ranked = await rerankDocuments(query, docs);
console.log(ranked[0]); // highest relevance
}
Keyword Extraction (src-tauri/src/keywords.rs)
Extract top-weighted keywords from text using Jieba + TextRank via a Tauri command.
Rust Command
#[tauri::command]
pub fn keywords(text: String, top_n: usize) -> Vec<Keyword> { … }
JS Invocation
import { invoke } from '@tauri-apps/api/tauri';
interface Keyword { word: string; weight: number; } // f32 in Rust maps to number in TS
async function extractKeywords(text: string) {
const list = await invoke<Keyword[]>('keywords', { text, top_n: 5 });
return list;
}
extractKeywords('Deep learning enables…').then(console.log);
OCR Utility (src/lib/ocr.ts)
Perform OCR on images saved in AppData using Tesseract.js with configurable languages and a 30s timeout.
import ocr from '@/lib/ocr';
async function readImageText(path: string) {
const text = await ocr(path);
if (text.startsWith('OCR')) console.error('Error:', text);
else console.log('Extracted:', text);
}
readImageText('screenshots/latest.png');
Screenshot Capture (src-tauri/src/screenshot.rs)
Capture all visible, non-minimized windows; save PNGs to AppData/temp_screenshot; return metadata.
Rust Struct
#[derive(Serialize, Deserialize)]
pub struct ScreenshotImage {
pub name: String,
pub path: String,
pub width: u32,
pub height: u32,
pub x: i32,
pub y: i32,
pub z: i32,
}
JS Invocation
import { invoke } from '@tauri-apps/api/tauri';
async function captureAllWindows() {
const images = await invoke<ScreenshotImage[]>('screenshot');
images.forEach(img => {
const el = document.createElement('img');
el.src = `file://${img.path}`;
el.width = img.width / 4;
document.body.appendChild(el);
});
}
captureAllWindows();
Local RAG Search (src/lib/rag.ts)
Ingest Markdown into a vector store and retrieve context snippets for queries.
Core Functions
• initVectorDb(): setup SQLite vector table
• processMarkdownFile(path: string): split, embed & store chunks
• getContextForQuery(query: string): return top passages with context
• handleFileUpdate(path: string): refresh file in vector store
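The "split" step inside processMarkdownFile can be illustrated with a sliding-window chunker. This is a simplified sketch, not NoteGen's actual implementation: the `chunkText` name and the size/overlap defaults are assumptions, and real splitters often respect Markdown structure (headings, paragraphs) rather than raw character offsets.

```typescript
// Split text into overlapping character-based chunks for embedding.
// chunkSize and overlap values are illustrative defaults.
export function chunkText(text: string, chunkSize = 500, overlap = 50): string[] {
  if (chunkSize <= overlap) throw new Error('chunkSize must exceed overlap');
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += chunkSize - overlap) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // last window reached the end
  }
  return chunks;
}
```

Each chunk would then be passed to `fetchEmbedding` and stored with its `chunk_id`, as in the upsert example above.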
Example: Ingest & Query
import {
initVectorDb,
processMarkdownFile,
getContextForQuery
} from '@/lib/rag';
async function buildAndSearch() {
await initVectorDb();
await processMarkdownFile('/notes/guide.md');
const context = await getContextForQuery('How to configure Tauri?');
console.log('Context:', context);
}
buildAndSearch();
## Backend API Reference
### Fuzzy Search Commands
Purpose
Provide a high-performance, parallelized fuzzy search over arbitrary `SearchItem` collections in your Tauri backend. Exposes two invokable commands:
- `fuzzy_search` – standard execution
- `fuzzy_search_parallel` – alias that currently delegates to `fuzzy_search`
Core Types
• **SearchItem**
- Represents an indexed record with optional fields: `id`, `title`, `desc`, `article`, `path`, `url`, `search_type`
- On client request can carry back `score?: number` and `matches?: MatchInfo[]`
• **MatchInfo**
- `{ key: string; indices: [number,number][]; value: string }`
- Describes which characters in which field matched
• **FuzzySearchResult**
- `{ item: SearchItem; refindex: number; score: number; matches: MatchInfo[] }`
- `refindex` is the original index in the `items` array
Command Signature
```rust
#[tauri::command]
pub fn fuzzy_search(
items: Vec<SearchItem>,
query: String,
keys: Vec<String>,
threshold: f64,
include_score: bool,
include_matches: bool,
) -> Vec<FuzzySearchResult> { … }
```
Parameters
- `items`: array of records to search
- `query`: search pattern (an empty query yields `[]`)
- `keys`: subset of `["desc","title","article","path","search_type"]` to test
- `threshold`: normalized match-score cutoff (0.0–1.0)
- `include_score`: if `false`, `result.score` and `item.score` are zeroed/cleared
- `include_matches`: if `false`, match spans are dropped
Behavior
- Builds a `SkimMatcherV2` per invocation.
- For each item and each requested key:
  - runs `fuzzy_indices(text, query)`
  - computes `normalized_score = abs(score) / query.len()`
  - skips the key if the normalized score is below `threshold`
  - collects the best raw `score` and all matching spans
- Filters out non-matching items and sorts descending by `score`.
- Clears scores/matches based on the `include_*` flags.
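To make the scoring rules concrete, here is a self-contained TypeScript sketch. It substitutes a naive subsequence matcher for `SkimMatcherV2` (so raw scores differ from the Rust command), but the normalization, threshold filtering, and descending sort mirror the behavior described above:

```typescript
interface Scored { text: string; score: number; normalized: number; }

// Naive stand-in for SkimMatcherV2: +1 per matched character when the
// query appears as a subsequence of the text; null when it does not.
function naiveFuzzyScore(text: string, query: string): number | null {
  let qi = 0;
  for (const ch of text.toLowerCase()) {
    if (qi < query.length && ch === query[qi].toLowerCase()) qi++;
  }
  return qi === query.length ? query.length : null;
}

// Mirrors the command's pipeline: normalize by query length,
// drop candidates below the threshold, sort descending by raw score.
export function scoreCandidates(texts: string[], query: string, threshold: number): Scored[] {
  if (query.length === 0) return []; // empty query yields []
  const out: Scored[] = [];
  for (const text of texts) {
    const raw = naiveFuzzyScore(text, query);
    if (raw === null) continue;
    const normalized = Math.abs(raw) / query.length;
    if (normalized < threshold) continue;
    out.push({ text, score: raw, normalized });
  }
  return out.sort((a, b) => b.score - a.score);
}
```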
TypeScript Invocation Example
import { invoke } from '@tauri-apps/api/tauri';
interface MatchInfo { key: string; indices: [number,number][]; value: string; }
interface SearchItem { id?: string; title?: string; desc?: string; /*…*/ }
interface FuzzySearchResult {
item: SearchItem;
refindex: number;
score: number;
matches: MatchInfo[];
}
async function runSearch() {
const items: SearchItem[] = [ /* your indexed data */ ];
const results = await invoke<FuzzySearchResult[]>('fuzzy_search', {
items,
query: 'rust module',
keys: ['title','desc'],
threshold: 0.5,
include_score: true,
include_matches: true
});
console.log(results);
}
Practical Tips
- Tune `threshold` to control fuzziness: lower values allow looser matches.
- Select only the fields you need in `keys` to reduce work.
- Use `include_matches: false` when you only care about top-N scoring.
- `fuzzy_search_parallel` currently aliases `fuzzy_search`; swap in a parallel implementation if needed.
Custom Tauri HTTP Fetch Wrapper
Purpose
Provide a drop-in replacement for the browser `fetch` API that routes requests through Tauri's `plugin-http`, supporting native timeouts, proxies, max redirections, and proper header encoding/decoding.
API Signature
async function fetch(
input: string,
init?: RequestInit & ClientOptions
): Promise<Response>
Key Features
- Uses `invoke('plugin:http|fetch')` under the hood
- Supports the `maxRedirections`, `connectTimeout`, `proxy`, and `danger` flags
- Honors and propagates `AbortSignal`
- Encodes the request body as a `Uint8Array`
- URI-encodes response headers to ensure validity
- Returns a standard `Response` with overridden `.url` and `.headers`
import { fetch } from 'src/lib/encode-fetch';
async function postData() {
const controller = new AbortController();
const timeout = setTimeout(() => controller.abort(), 3000);
try {
const res = await fetch('https://api.example.com/data', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ foo: 'bar' }),
connectTimeout: 5000, // TCP connect timeout in ms
maxRedirections: 5, // max HTTP redirects
proxy: 'http://localhost:8888', // optional proxy
      danger: { acceptInvalidCerts: true }, // skip TLS validation
signal: controller.signal
});
clearTimeout(timeout);
if (!res.ok) {
console.error(`HTTP ${res.status}: ${res.statusText}`);
return;
}
const data = await res.json();
console.log('Received:', data);
} catch (e: any) {
if (e.message === 'Request canceled') {
console.warn('Fetch aborted or timed out');
} else {
console.error('Fetch error:', e);
}
}
}
Abort Handling
- Checks `signal.aborted` before dispatch and after header mapping
- On abort, calls `invoke('plugin:http|fetch_cancel', { rid })`
- Throws `Error('Request canceled')`, consistent with standard `fetch`
Custom Options

| Option | Type | Description |
|---|---|---|
| `connectTimeout` | number | ms to wait for the TCP connect |
| `maxRedirections` | number | how many 3xx redirects to follow |
| `proxy` | string | proxy URL (e.g., `http://proxy.local:3128`) |
| `danger` | object | plugin-specific flags (e.g., TLS skip opts) |
Under the Hood
- Strips custom fields from `init` to avoid confusing the native `Request`
- Builds a `Request` to extract headers and body as an `ArrayBuffer`
- Maps headers to `[name, value][]` pairs and encodes them with `encodeURI`
- Invokes Tauri plugin methods:
  - `plugin:http|fetch` to start (returns a request ID)
  - `plugin:http|fetch_send` to send headers and receive the status
  - `plugin:http|fetch_read_body` to retrieve the body
- Constructs a standard `Response`, overriding `.url` and `.headers`
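The header-encoding step can be sketched as follows. The `encodeHeaderPairs` helper is illustrative only; the wrapper applies `encodeURI` internally rather than exposing such a function:

```typescript
// Map header pairs to [name, value][] with URI-encoded values,
// mirroring the wrapper's strategy of encoding to keep headers valid
// when they contain non-ASCII characters.
export function encodeHeaderPairs(headers: [string, string][]): [string, string][] {
  return headers.map(([name, value]) => [name, encodeURI(value)]);
}
```

On the response side, the wrapper performs the inverse decoding before constructing the returned `Headers`.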
Practical Tips
- Always provide a `signal` if you need cancellable requests
- Clean up timeouts and `AbortController`s to avoid dangling plugin requests
- Use `danger` sparingly, only in trusted dev environments
- Check `res.ok`/`res.status` before consuming the body
- Proxy support is useful for debugging via local interceptors (e.g., Charles, Fiddler)
Quality & Delivery
Maintain code health through consistent linting standards and automated releases.
ESLint Configuration (.eslintrc.json)
Purpose: Enforce Next.js and TypeScript linting defaults while allowing scoped overrides.
This project’s `.eslintrc.json`:
{
"extends": ["next/core-web-vitals", "next/typescript"],
"rules": {
"react-hooks/exhaustive-deps": "off",
"@typescript-eslint/no-explicit-any": "off"
},
"ignorePatterns": ["docs/**"]
}
Key sections:
- Extends
  - `next/core-web-vitals`: React rules tuned for Core Web Vitals
  - `next/typescript`: TypeScript support for Next.js
- Rules overrides
  - `react-hooks/exhaustive-deps: off` – skip exhaustive-deps checks when you manage hook dependencies manually. Re-enable per file with `/* eslint react-hooks/exhaustive-deps: "warn" */`.
  - `@typescript-eslint/no-explicit-any: off` – allow `any` in scripts or third-party integrations. Scope it via overrides: `{ "overrides": [ { "files": ["scripts/**/*.ts"], "rules": { "@typescript-eslint/no-explicit-any": "off" } } ] }`
- ignorePatterns
  - `docs/**`: exclude documentation from linting
  - Extend the patterns (e.g., `build/`, `public/`) as needed
Practical Usage
npm run lint
npx eslint src/components/Button.tsx --fix
To add a new plugin rule:
- Install it: npm install -D eslint-plugin-import
- Update `.eslintrc.json`: `{ "plugins": ["import"], "rules": { "import/order": ["error", { "groups": ["builtin","external","internal"] }] } }`
Extracting and Propagating Tauri App Version
Purpose: Capture the semantic version from `tauri-apps/tauri-action` in `publish-tauri` and expose it for downstream jobs.
How It Works
- Declare `appVersion` as a job output.
- On Ubuntu, read `steps.tauri-action.outputs.appVersion`.
- Write it to `$GITHUB_OUTPUT` for use by other jobs.
Key Snippet:
jobs:
publish-tauri:
strategy:
matrix:
include:
- platform: 'ubuntu-24.04'
args: ''
- platform: 'windows-latest'
args: '--msix'
- platform: 'macos-latest'
args: ''
runs-on: ${{ matrix.platform }}
outputs:
appVersion: ${{ steps.set_output.outputs.appVersion }}
steps:
- uses: actions/checkout@v4
- uses: tauri-apps/tauri-action@v0
id: tauri-action
with:
# Tauri config here
- name: Extract version
if: matrix.platform == 'ubuntu-24.04'
run: echo "appVersion=${{ steps.tauri-action.outputs.appVersion }}" >> $GITHUB_OUTPUT
id: set_output
Usage in Downstream Job:
jobs:
upgradeLink-upload:
needs: publish-tauri
runs-on: ubuntu-latest
steps:
- uses: toolsetlink/upgradelink-action@v5
with:
source-url: >
https://github.com/codexu/note-gen/releases/
download/note-gen-v${{ needs.publish-tauri.outputs.appVersion }}/latest.json
access-key: ${{ secrets.UPGRADE_LINK_ACCESS_KEY }}
tauri-key: ${{ secrets.UPGRADE_LINK_TAURI_KEY }}
github-token: ${{ secrets.GITHUB_TOKEN }}
Practical Tips
- Restrict the write-back step to one OS to avoid race conditions.
- Match the `id` of the `tauri-action` step when reading its outputs.
- Inject the version downstream via `${{ needs.publish-tauri.outputs.appVersion }}`.
Contributing
Thank you for improving NoteGen! This guide covers how to file issues, propose features, open pull requests, our branch strategy, and coding style standards.
Filing Issues
Use GitHub Issues to report bugs or request help.
- Check existing issues for similar reports.
- Click New Issue and choose the Bug Report template.
- Provide:
- A descriptive title.
- Steps to reproduce.
- Expected vs. actual behavior.
- Environment details (OS, Node version).
- Label issues appropriately (bug, question).
Proposing Features
Create a feature request using the Feature Request template:
- Click New Issue > Feature Request.
- Describe:
- The problem you want to solve.
- Proposed API or UX changes.
- Any relevant examples or mockups.
- Use the `enhancement` label.
Discussion happens right in the issue thread.
Opening Pull Requests
- Fork the repository and clone your fork:
  git clone git@github.com:your-username/note-gen.git
  cd note-gen
- Add the upstream remote:
  git remote add upstream git@github.com:codexu/note-gen.git
- Sync and create a feature branch:
  git fetch upstream
  git checkout develop
  git pull upstream develop
  git checkout -b feature/your-feature-name
- Make your changes; run tests and linters:
  npm install
  npm test
  npm run lint
- Commit with a clear, conventional message:
  feat(notes): add support for markdown export
- Push and open a PR against the `develop` branch:
  git push -u origin feature/your-feature-name
- Fill out the PR template:
- Summary of changes.
- Related issues (e.g., “Closes #123”).
- Screenshots or logs if relevant.
- Request reviews from teammates; address feedback promptly.
Branch Strategy
We use a simplified Git Flow:
- main: always production-ready.
- develop: integration branch for features.
- feature/: new features and enhancements.
- fix/: bug fixes.
- hotfix/: urgent production fixes off `main`.
Release process:
- Merge completed features into `develop`.
- Create a release branch (`release/vX.Y.Z`) and fix minor bugs there.
- Merge the release into `main` and tag it.
- Merge `main` back into `develop`.
Branch names example:
feature/user-auth, fix/login-crash, hotfix/critical-bug
Coding Style
NoteGen uses ESLint and Prettier to enforce consistent code.
- Indentation: 2 spaces
- Quotes: single (`'`)
- Semicolons: required
- Line length: 100 characters
Run checks locally:
npm run lint # ESLint
npm run format # Prettier
npm test # Jest tests
ESLint and Prettier configs live in the repo root (`.eslintrc.json`, `.prettierrc`). Ensure all checks pass before submitting a PR.
Thank you for contributing to NoteGen!
License
This project is released under the MIT License. It grants broad permissions and disclaims all warranties. You must include the full license text in any distributions or derivative works.
Summary
- Grants free permission to use, copy, modify, merge, publish, distribute, sublicense and/or sell copies.
- Requires inclusion of the original copyright and license notice.
- Disclaims warranties and limits liability.
Obligations
- Bundle the MIT License text in your source or documentation.
- Retain the following in every copy or substantial portion:
- Copyright notice
- Permission notice
Including the License
In `package.json`
Specify the license identifier so tooling recognizes it:
{
"name": "note-gen",
"version": "1.0.0",
"license": "MIT",
"files": [
"dist/",
"LICENSE"
]
}
In source files
Add a brief header to each file:
/**
* note-gen v1.0.0
* Copyright (c) 2025 CodexU
* Released under the MIT License
*/
Redistributing
Ensure your published bundle or Docker image includes the LICENSE file at its root:
- npm: the `files` field in `package.json`
- Docker: `COPY LICENSE /app/`
- TAR/GZIP: preserve the LICENSE alongside your release artifacts.