# Kartuli Documentation Bundle > **Generated**: 2025-10-21T22:14:07.028Z > > This file contains all Kartuli documentation in a single markdown file for LLM context. > The documents are ordered as they appear in the documentation sidebar. --- ## Table of Contents ### Product - Project Overview ### Tech - **Decisions** - E2E Testing Implementation - CI/CD Foundation and Basic Validation - Version Management - Monorepo Setup - Code Quality - Core Tech Stack - Testing Strategy - Design System - Documentation Management - **Development** - AI-Assisted Workflow - Code Conventions - GitHub Workflow - Tech Stack & Providers --- # Product ## Mission To make Georgian language learning accessible and free for everyone, breaking down language barriers for international residents, newcomers, and travelers in Georgia ## Vision To become the go-to platform for Georgian language learning, ensuring that language is never a barrier to connecting with Georgian culture and community ## Core principles - **Free forever**: All learning content remains free forever - **No barriers**: No need to create an account, No need to have internet, no premium accounts, no paywalls, no content restrictions - **Cost optimization**: Optimize every infrastructure decision to support the maximum number of students while delaying monetization needs as long as possible - **Operational monetization only**: Revenue only covers operational costs (servers, databases, authentication, email services), **never for profit** ## Target market Non-Georgian speakers ## Value proposition - **Free**, comprehensive Georgian language learning **without any content restrictions or premium barriers** - Includes **offline support** for learning anywhere, anytime; - No need to create an account or install anything to **try the learning experience in just a few clicks** ## Revenue model - **Purpose**: Only to cover operational costs (servers, databases, authentication, email services), **never profit** - **Possible sources of income**: Affiliate partnerships with learning resources (books, courses), learning platforms (online classes) or physical language schools - **We would monetize only after provider free-tier limits are consistently exceeded** - **We will never go with**: Premium accounts, ads, paywalls, content restrictions, or marketing-focused monetization ## Competitive advantage - **Major language learning apps don't support Georgian well** - Existing Georgian language learning apps are revenue driven, prioritizing profit vs student learning experience (freemium models, subscriptions, ads...) - Multi-language support for native languages not typically supported --- # Tech ## Decisions # E2E Testing Implementation: Playwright Strategy **Date**: 2025-01-20 **Issue**: [#16](https://github.com/kartuli-app/kartuli/issues/16) ## Context We need a comprehensive E2E testing strategy to ensure the Georgian language learning platform works correctly across different environments and user journeys. The current testing setup includes unit and integration tests with vitest, but lacks end-to-end validation of the complete user experience. 
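To make the decision below concrete, here is a minimal sketch of the V1 setup it describes: a Playwright configuration driven by a single `BASE_URL` variable, plus a smoke spec covering the app-boot and console-error checks. File paths follow the tool structure described later in this ADR; the `data-testid` value is an assumption, since the actual stable marker lives in `game-client`.

```ts
// tools/e2e/playwright.config.ts: sketch of the configuration described in this ADR
import { defineConfig } from '@playwright/test';

export default defineConfig({
  testDir: './tests',
  timeout: 30_000,                  // per-test timeout
  expect: { timeout: 5_000 },       // expect timeout
  workers: 1,                       // single worker for stability
  retries: process.env.CI ? 1 : 0,  // retry once in CI, never locally
  use: { baseURL: process.env.BASE_URL ?? 'http://localhost:3000' },
});
```

```ts
// tools/e2e/tests/game-client/smoke.spec.ts: V1 smoke checks (sketch)
import { test, expect } from '@playwright/test';

test('game client boots and shows the stable marker', async ({ page }) => {
  await page.goto('/');
  // 'app-root' is an assumed data-testid; use whatever stable marker game-client exposes.
  await expect(page.getByTestId('app-root')).toBeVisible();
});

test('first load produces no error-level console messages', async ({ page }) => {
  const errors: string[] = [];
  page.on('console', (message) => {
    if (message.type() === 'error') errors.push(message.text());
  });
  await page.goto('/');
  await page.waitForLoadState('networkidle');
  expect(errors).toEqual([]);
});
```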
## Decision We will implement E2E testing using **Playwright** with a phased approach: ### Tool Structure - **Location**: `/tools/e2e/` as a dedicated tool package - **Scope**: V1 focuses on `game-client` only - **Runner**: Playwright with Chromium-only for initial implementation ### V1 Implementation (Immediate) - **App boot test**: Verify game client loads and shows stable marker - **Console error test**: Ensure no error-level console messages on first load - **Skip 404 test**: Deferred to later phase ### CI/CD Integration - **PR workflow**: E2E tests run against Vercel preview URLs after deployment - **Main workflow**: E2E smoke tests run against production deployment - **Artifacts**: Screenshots, traces, videos uploaded on failure with 3-day retention ### Configuration - **Environment**: Single `BASE_URL` variable drives test execution - **Timeouts**: 5s expect, 30s test, 30s overall - **Concurrency**: 1 worker for stability - **Retry policy**: 1 retry in CI, 0 locally ## Consequences ### Positive ✅ **Quality gate** - E2E tests validate complete user workflows ✅ **Environment validation** - Tests run against real deployed applications ✅ **Regression prevention** - Catches integration issues missed by unit tests ✅ **CI integration** - Automated testing in deployment pipeline ✅ **Artifact collection** - Debug information available on failures ### Negative ⚠️ **Maintenance overhead** - E2E tests require ongoing maintenance ⚠️ **Execution time** - Adds ~1-2 minutes to CI pipeline ⚠️ **Flakiness risk** - Network and timing issues can cause false failures ## Implementation ### Current Status: V1 Complete - Playwright installed and configured in `/tools/e2e/` - V1 smoke tests implemented (app boot + console errors) - CI integration added to PR and main workflows - Stable selectors using `data-testid` attributes - Documentation and setup guides created ### Future Phases - **V1.1**: Primary flow smoke, navigation testing, assets sanity - **V2**: Mobile viewport testing, Firefox support, accessibility checks ### Test Structure ``` tools/e2e/ tests/ game-client/ smoke.spec.ts # V1: boot + console errors playwright.config.ts # Configuration package.json # Dependencies and scripts ``` ### CI Integration - Tests run after Vercel deployment in both PR and main workflows - Artifacts uploaded on failure with 3-day retention (free tier friendly) - Browser installation automated in CI pipeline --- # CI/CD Foundation and Basic Validation ## Context As the Kartuli project moves beyond the skeleton milestone, we need to establish automated quality checks and validation processes. The project is a monorepo with multiple packages and applications, requiring a robust CI/CD pipeline that can: - Validate code quality across all packages - Ensure type safety throughout the codebase - Run tests consistently - Provide feedback on pull requests - Prepare for future deployment automation Currently, the project has basic scripts for linting and testing, but lacks: - TypeScript type checking automation - GitHub Actions workflows for CI - Automated validation on pull requests - Foundation for future deployment pipelines ## Decision We will implement a basic CI/CD foundation with the following components: ### 1. TypeScript Type Checking - Add `typecheck` scripts to all packages with TypeScript files - Configure Turbo to run type checking with proper dependency ordering - Include type checking in the CI pipeline ### 2. 
GitHub Actions Workflow - Create a preview workflow that triggers on pull requests to main - Implement sequential validation: typecheck → lint → test - Use Node.js 20 with pnpm caching for optimal performance - Run validation across all packages in the monorepo ### 3. Package Configuration - Add typecheck scripts to: - `apps/game-client` (Next.js app) - `apps/backoffice-client` (Next.js app) - `packages/ui` (React components) - `tools/storybook` (Storybook tool) - Exclude packages without TypeScript files: - `packages/theme` (CSS only) - `tools/web-docs-client` (JavaScript only) ### 4. Turbo Configuration - Configure typecheck task with proper dependency ordering - Enable caching for improved performance - Define appropriate input patterns for cache invalidation ## Consequences ### Positive - **Automated Quality Gates**: Every PR will be validated for type safety, linting, and tests - **Early Error Detection**: Type errors caught before merge, reducing production issues - **Consistent Environment**: All developers work with the same validation standards - **Foundation for Deployment**: Establishes patterns for future deployment automation - **Performance**: Turbo caching reduces CI execution time - **Developer Experience**: Clear feedback on code quality issues ### Negative - **Initial Setup Complexity**: Requires configuration across multiple files - **CI Execution Time**: Additional validation steps increase PR feedback time - **Maintenance Overhead**: Need to maintain and update CI configurations - **Dependency on External Services**: Relies on GitHub Actions availability ### Risks - **False Positives**: CI failures due to environment differences - **Configuration Drift**: CI and local environments may diverge over time - **Performance Degradation**: Large monorepos may experience slower CI execution ## Implementation ### Current State The following has been implemented: 1. **Root Package Configuration** - Added `typecheck` script to root `package.json` - Configured to run across all packages via Turbo 2. **Package-Level Scripts** - Added `typecheck: "tsc --noEmit"` to all TypeScript packages - Fixed TypeScript configuration issues with jest-dom types - Excluded non-TypeScript packages appropriately 3. **Turbo Configuration** - Added typecheck task with dependency ordering - Enabled caching with appropriate input patterns - Configured to run in parallel where possible 4. **GitHub Actions Workflows** - Created `.github/workflows/app-deploy-pr.yml` for PR validation and preview deployments - Configured for pull request validation and Vercel preview deployments - Sequential execution: typecheck → lint → test → build → deploy - Node.js 20 with pnpm caching - Leverages Vercel's native GitHub integration for automatic preview cleanup 5. **Validation** - All packages pass typecheck validation - All packages pass linting validation - All tests pass successfully - GitHub Actions workflow ready for testing 4. **Production Deployment Workflow** - Created `.github/workflows/app-deploy-main.yml` - Configured for automated production deployments on main branch pushes - Uses Vercel Action for deployment with proper authentication - Path filtering ensures only game-client related changes trigger deployments - Includes build step before deployment for validation 5. 
**Vercel Integration** - Configured GitHub Actions secrets for Vercel authentication - Set up automated deployments to production environment - Custom domain `kartuli.app` configured and working - Manual deployment pipeline replaced with automated workflow - **✅ DEPLOYMENT SUCCESSFUL**: Production deployment working in ~2 minutes - **✅ RESOLVED**: Fixed Root Directory configuration conflicts - **✅ VERIFIED**: GitHub Actions → Vercel deployment pipeline operational 6. **Preview Deployments for Pull Requests** - Enhanced `app-deploy-pr.yml` workflow to include Vercel preview deployments - Each PR gets a unique preview URL for testing changes before merge - Automatic PR comments with preview URLs for easy access - Preview deployments automatically cleaned up by Vercel's native GitHub integration - Same validation pipeline (typecheck → lint → test → build → deploy) for consistency 7. **Remote Caching with Turborepo** - Enabled Turborepo remote caching for faster CI/CD builds - Build artifacts are shared between local development and CI environments - Significantly reduces build times in CI by reusing cached artifacts from local runs - Cache sharing ensures consistent builds between local and CI environments - Performance optimization reduces CI compute time and costs - **Cache Key Strategy**: Cache keys are based on file contents and configuration, not git metadata - **Cross-Environment Sharing**: Local → PR → Main → Team members all share the same cache - **Cache Behavior**: - Local development populates remote cache - PR workflows hit cache from local runs (fast builds) - Main workflows hit cache from PR runs (fast deployments) - Team members benefit from shared cache across all environments ### Next Steps This foundation enables future CI/CD enhancements: - E2E testing integration - Performance monitoring - Error tracking setup The implementation follows the principle of starting simple and building incrementally, ensuring a solid foundation for the project's growing CI/CD needs. --- **Migration Status**: Successfully migrated to `kartuli-app` organization. Documentation deployment pipeline verified and working at `https://kartuli-app.github.io/kartuli/` - January 19, 2025. --- # Version Management with PNPM Catalog **Date**: 2025-10-01 **Issue**: [#1](https://github.com/kartuli-app/kartuli/issues/1) ## Context We need to maintain consistent dependency versions across the monorepo to prevent version conflicts and ensure reproducible builds. ## Decision We will use **PNPM catalog** for centralized version management with pinned versions for all shared dependencies, workspace dependencies using catalog references, and peer dependency auto-install via `.npmrc` configuration. 
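The catalog mechanism itself is plain `package.json` configuration (shown under Implementation below). As an illustration of how the rule can be enforced, here is a small, hypothetical Node/TypeScript script, not part of the repository, that flags workspace packages declaring a literal version range for a dependency the root catalog already pins:

```ts
// scripts/check-catalog-refs.ts: hypothetical helper, not part of the repo
import { readFileSync, readdirSync, existsSync } from 'node:fs';
import { join } from 'node:path';

// Dependencies pinned in the root pnpm catalog (see Implementation below).
const rootPkg = JSON.parse(readFileSync('package.json', 'utf8'));
const catalog: Record<string, string> = rootPkg.pnpm?.catalog ?? {};

// Workspace folders as defined in pnpm-workspace.yaml (assumed layout).
const packageFiles = ['apps', 'packages', 'tools']
  .filter(existsSync)
  .flatMap((dir) => readdirSync(dir).map((name) => join(dir, name, 'package.json')))
  .filter(existsSync);

for (const pkgPath of packageFiles) {
  const pkg = JSON.parse(readFileSync(pkgPath, 'utf8'));
  const deps = { ...pkg.dependencies, ...pkg.devDependencies };
  for (const [name, version] of Object.entries(deps)) {
    // Simplified check: ignores named catalogs, only looks for the bare "catalog:" reference.
    if (name in catalog && version !== 'catalog:') {
      console.warn(`${pkgPath}: ${name} should use "catalog:" (found "${version}")`);
    }
  }
}
```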
## Consequences ### Positive ✅ **Version consistency** - All packages use same dependency versions ✅ **Easy updates** - Single place to update versions across workspace ✅ **Reproducible builds** - Consistent dependency resolution ✅ **Conflict prevention** - No version mismatches between packages ### Negative ⚠️ **Centralized control** - All version updates go through root package.json ⚠️ **Catalog maintenance** - Need to keep catalog up to date ## Implementation ### Current Catalog Configuration The root `package.json` contains the PNPM catalog with pinned versions: ```json { "pnpm": { "catalog": { "typescript": "^5.0.0", "react": "^18.0.0", "next": "^14.0.0", "vitest": "^1.0.0", "@testing-library/react": "^14.0.0" } } } ``` ### Package Dependency Pattern All packages use catalog references for shared dependencies: ```json { "dependencies": { "react": "catalog:", "typescript": "catalog:" } } ``` ### Configuration Files - **PNPM catalog** is maintained in root `package.json` for all shared dependencies - **Workspace dependencies** consistently use catalog references - **Peer dependency auto-install** is configured via `.npmrc` settings --- # Monorepo Setup with pnpm and Turborepo **Date**: 2025-10-02 **Issue**: [#1](https://github.com/kartuli-app/kartuli/issues/1) ## Context We need to establish a solid foundation for the Kartuli project with proper package management, build system, and development workflow. ## Decision We will use **pnpm** with **Turborepo** for monorepo management with pnpm for workspace management and dependency resolution, PNPM catalog for version pinning across workspace, auto-install-peers=true in `.npmrc` for seamless peer dependency handling, Turborepo for build orchestration and caching, and Git workflow with conventional commits. ## Consequences ### Positive ✅ **Fast builds** - Turborepo caching and parallel execution ✅ **Consistent dependencies** - PNPM catalog prevents version drift ✅ **Clean workspace** - Proper gitignore and npmrc configuration ✅ **Scalable structure** - Easy to add new packages and apps ### Negative ⚠️ **Learning curve** - Team needs to understand pnpm + turbo workflow ⚠️ **Tool complexity** - More configuration files to maintain ## Implementation ### Current File Structure ``` package.json # Root workspace configuration pnpm-workspace.yaml # Workspace definition turbo.json # Build system configuration .npmrc # PNPM settings .nvmrc # Node version .gitignore # Comprehensive ignore patterns ``` ### Monorepo Configuration - **pnpm** is configured for workspace management and dependency resolution - **Turborepo** handles build orchestration and caching - **Turbo.json** defines task dependencies and outputs - **Parallel execution** of tasks across packages is enabled - **Comprehensive .gitignore** excludes all generated files --- # Code Quality with Biome **Date**: 2025-10-03 **Issue**: [#1](https://github.com/kartuli-app/kartuli/issues/1) ## Context We need consistent code quality across the monorepo with linting and formatting standards that integrate well with our development workflow. 
## Decision We will use **Biome** as our single tool for both linting and formatting: ### Code Quality Tools - **Biome** for linting and formatting - **No ESLint or Prettier** - Biome replaces both - **Pre-commit hooks** (planned) for automatic formatting ### Configuration - **biome.json** configuration file - **File exclusions** for generated files and dependencies - **Consistent rules** across all packages and apps ## Consequences ### Positive ✅ **Single tool** - No configuration conflicts between linters ✅ **Fast execution** - Biome is significantly faster than ESLint + Prettier ✅ **Consistent formatting** - Unified code style across workspace ✅ **Easy maintenance** - Single configuration file ### Negative ⚠️ **Learning curve** - Team needs to adapt to Biome rules ⚠️ **Ecosystem compatibility** - Some ESLint plugins not available ## Implementation ### Current Biome Configuration The `biome.json` configuration file is set up with: ```json { "files": { "ignoreUnknown": false, "includes": ["**", "!**/node_modules", "!**/dist", "!**/.vitepress/cache"] }, "formatter": { "enabled": true }, "linter": { "enabled": true } } ``` ### Workflow Integration Status - **IDE integration** - Biome extension is configured for editors - **CI/CD integration** - Automated linting runs in GitHub Actions - **Pre-commit hooks** - Planned for automatic formatting on commit --- # Core Tech Stack: Next.js, React, TypeScript, and Testing **Date**: 2025-10-04 **Issue**: [#2](https://github.com/kartuli-app/kartuli/issues/2) ## Context We need to establish the core technology stack for building the Georgian language learning platform with modern web technologies. ## Decision We will use the following core technologies: ### Frontend Framework - **Next.js** with App Router for React applications - **React** as the UI library - **TypeScript** for type safety and developer experience ### Testing Framework - **vitest** for unit and integration testing - **@testing-library/react** for component testing - **Playwright** (planned) for E2E testing ### Development Tools - **Turbopack** for Next.js applications (fast bundling) - **TypeScript strict mode** for maximum type safety ## Consequences ### Positive ✅ **Modern stack** - Industry-standard technologies ✅ **Type safety** - TypeScript prevents runtime errors ✅ **Fast development** - Next.js + Turbopack for quick iteration ✅ **Comprehensive testing** - Unit, integration, and E2E coverage ### Negative ⚠️ **Bundle size** - React + Next.js adds overhead ⚠️ **Learning curve** - Team needs TypeScript expertise ## Implementation ### Current Apps Structure ``` apps/ game-client/ # Learning game (Next.js) backoffice-client/ # Content management (Next.js) ``` ### Testing Implementation Status - **Unit tests** - Individual functions and components are tested with vitest - **Integration tests** - Component interactions are tested with vitest - **E2E tests** - Full user workflows testing is planned with Playwright --- # Testing Strategy: Unit, Integration, and E2E **Date**: 2025-10-05 **Issue**: [#2](https://github.com/kartuli-app/kartuli/issues/2) ## Context We need a comprehensive testing strategy that ensures code quality and prevents regressions as the Georgian language learning platform grows. 
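The Core Tech Stack decision above already names the testing tools; to ground the strategy that follows, here is a minimal sketch of a co-located component test using vitest and Testing Library. It assumes the `Button` component from `@kartuli/ui` renders its children as its accessible name; the jest-dom matcher import is normally registered once in a shared setup file.

```tsx
// packages/ui/src/components/Button.test.tsx: sketch of a co-located unit test
import { describe, expect, it } from 'vitest';
import { render, screen } from '@testing-library/react';
import '@testing-library/jest-dom/vitest'; // usually registered once in a setup file
import { Button } from './Button'; // assumes the Button described in the design system docs

describe('Button', () => {
  it('renders its label', () => {
    render(<Button>დაწყება</Button>);
    expect(screen.getByRole('button', { name: 'დაწყება' })).toBeInTheDocument();
  });
});
```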
## Decision We will implement a **three-tier testing approach**: ### Unit Testing - **vitest** for fast unit tests - **@testing-library/react** for component testing - **Co-located tests** next to source files ### Integration Testing - **vitest** for integration tests - **API testing** for serverless functions - **Component interaction testing** ### End-to-End Testing - **Playwright** for E2E testing (planned) - **User journey testing** for critical paths - **Cross-browser compatibility** testing ## Consequences ### Positive ✅ **Fast feedback** - vitest provides quick test execution ✅ **Reliable components** - Testing Library ensures accessible components ✅ **User confidence** - E2E tests verify complete workflows ✅ **Regression prevention** - Comprehensive test coverage ### Negative ⚠️ **Maintenance overhead** - Tests need to be kept up to date ⚠️ **E2E complexity** - Playwright tests can be flaky ## Implementation ### Current Test Structure ``` src/ components/ Button.tsx Button.test.tsx # Co-located unit tests __tests__/ # Integration tests e2e/ # E2E tests (planned) ``` ### Testing Tools Implementation - **vitest** - Currently used for unit and integration testing - **@testing-library/react** - Component testing utilities are integrated - **Playwright** - E2E testing is planned for future implementation --- # Design System: Token-Driven Architecture with Tailwind CSS v4 **Date**: 2025-10-06 **Issue**: [#3](https://github.com/kartuli-app/kartuli/issues/3) ## Context We need a scalable design system that ensures consistency across all applications while supporting our Georgian language learning platform's accessibility requirements and offline-first architecture. ## Decision We will implement a **token-driven design system** using Tailwind CSS v4 with OKLCH color space, CSS custom properties for runtime theme customization, and a monorepo package structure for shared design tokens and components. 
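In practice, "token-driven" means components never hard-code colors or sizes; they use utility classes that Tailwind v4 generates from the theme tokens (see the `@theme` example under Implementation). A minimal, illustrative sketch, not the actual `@kartuli/ui` component, might look like this:

```tsx
// packages/ui/src/components/Button.tsx: illustrative sketch only
import type { ButtonHTMLAttributes } from 'react';

// `bg-primary-500` and the spacing utilities are generated by Tailwind v4 from the
// tokens declared in @kartuli/theme, so changing a token restyles every consumer.
export function Button({ children, ...props }: ButtonHTMLAttributes<HTMLButtonElement>) {
  return (
    <button
      type="button"
      className="rounded px-4 py-2 bg-primary-500 text-white"
      {...props}
    >
      {children}
    </button>
  );
}
```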
### Core Technologies - **Tailwind CSS v4** with token-driven design - **OKLCH color space** for better perceptual uniformity - **CSS custom properties** for runtime theme customization - **@kartuli/theme** package for design tokens - **@kartuli/ui** package for React components - **Storybook** for component development and documentation ### Design Principles - **Accessibility First** - WCAG 2.1 AA compliance - **Cultural Sensitivity** - Georgian language and cultural context - **Offline-First** - Optimized for PWA and offline learning - **Cost Optimization** - Efficient bundle sizes and performance ## Consequences ### Positive ✅ **Consistency** - Unified visual language across all applications ✅ **Scalability** - Token system grows with the platform ✅ **Accessibility** - Built-in WCAG compliance ✅ **Performance** - Optimized for offline-first learning ✅ **Maintainability** - Centralized design decisions ✅ **Cultural Integration** - Georgian language optimization ### Negative ⚠️ **Learning curve** - Team needs design system expertise ⚠️ **Bundle size** - Additional CSS and component overhead ⚠️ **Complexity** - More sophisticated build process ## Implementation ### Current Package Structure ``` packages/ ├── theme/ # Design tokens and CSS variables │ ├── src/ │ │ └── default-theme.css │ └── package.json └── ui/ # React components ├── src/ │ └── components/ │ ├── Button.tsx │ └── Button.stories.tsx └── package.json ``` ### Design Tokens Implementation The theme package uses Tailwind CSS v4's `@theme` directive with OKLCH color space: ```css @theme { --color-primary-500: oklch(55% 0.22 250); --color-neutral-500: oklch(55% 0 0); --text-base: 1rem; --spacing-4: 1rem; } ``` ### Component Usage Pattern Components are consumed via the `@kartuli/ui` package: ```tsx import { Button } from '@kartuli/ui/components/Button'; ``` ### Quality Assurance Setup - **Chromatic** configured for visual regression testing - **axe-core** integrated for automated accessibility testing - **Lighthouse CI** planned for performance monitoring --- # Documentation Management with VitePress **Date**: 2025-10-07 **Issue**: [#4](https://github.com/kartuli-app/kartuli/issues/4) ## Context We need a comprehensive documentation system that can scale with the project and provide easy maintenance and navigation. ## Decision We will use **VitePress** with dynamic navigation generation based on frontmatter metadata for our documentation system. 
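The decision above hinges on generating navigation from frontmatter rather than maintaining it by hand. Below is a deliberately simplified sketch of how such a sidebar generator can work; the real config (described under Implementation later in this document) scans directories recursively and may use a proper YAML parser, whereas this helper only reads the three fields the docs rely on.

```ts
// .vitepress/sidebar.ts: illustrative sketch of frontmatter-driven sidebar generation
import { readFileSync, readdirSync } from 'node:fs';
import { join } from 'node:path';
import type { DefaultTheme } from 'vitepress';

interface DocMeta {
  link: string;
  title: string;
  section: string;
  date: string;
}

// Very small frontmatter reader; a real implementation would use a YAML parser.
function readMeta(file: string, link: string): DocMeta | null {
  const text = readFileSync(file, 'utf8');
  const match = text.match(/^---\n([\s\S]*?)\n---/);
  if (!match) return null;
  const field = (name: string) =>
    match[1].match(new RegExp(`^${name}:\\s*(.+)$`, 'm'))?.[1].trim() ?? '';
  return { link, title: field('title'), section: field('section'), date: field('date') };
}

export function decisionsSidebar(dir = 'docs/tech/decisions'): DefaultTheme.SidebarItem[] {
  return readdirSync(dir)
    .filter((name) => name.endsWith('.md'))
    .map((name) => readMeta(join(dir, name), `/tech/decisions/${name.replace(/\.md$/, '')}`))
    .filter((meta): meta is DocMeta => meta !== null)
    // ADRs are ordered chronologically by their `date` frontmatter.
    .sort((a, b) => a.date.localeCompare(b.date))
    .map((doc) => ({ text: doc.title, link: doc.link }));
}
```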
### Core Technologies - **VitePress** for static site generation - **GitHub Pages** hosting (planned) - **Local search** with VitePress built-in search - **Frontmatter-driven** navigation with `section` and `title` fields - **Dynamic sidebar generation** from file system scanning - **Chronological ADR ordering** by date metadata ## Consequences ### Positive ✅ **Maintainable** - Frontmatter drives navigation automatically ✅ **Scalable** - Easy to add new documents and sections ✅ **Searchable** - Built-in local search functionality ✅ **Version controlled** - Documentation lives with code ### Negative ⚠️ **Build complexity** - Dynamic navigation requires custom config ⚠️ **Frontmatter dependency** - All files need proper metadata ## Implementation ### Current File Structure ``` docs/ tech/decisions/ # Architecture Decision Records (sorted by date in sidebar) version-management.md monorepo-setup.md code-quality.md core-tech-stack.md testing-strategy.md design-system.md documentation-management.md tech/development/ # Development guides and conventions ai-assisted-workflow.md code-conventions.md github-workflow.md product/ # Product documentation project-overview.md stack-and-providers.md index.md # Homepage kartuli-llm.txt # Generated LLM bundle ``` ### Navigation System Implementation The VitePress configuration uses file system scanning with recursive directory traversal, frontmatter parsing for section, title, and date extraction, and automatic sidebar and navbar generation with chronological sorting where items are sorted by date and navigation links to the oldest item first. ### GitHub Actions Integration The documentation system includes automated workflows for generation and deployment: - **`docs-deploy-main.yml`** - Main workflow that: - Generates LLM documentation bundle - Builds VitePress site with proper base URL configuration - Deploys to GitHub Pages at `/kartuli/` - Copies LLM bundle to assets folder for static serving - **Label management workflows** - Automated label management for issues and PRs ### LLM Bundle Generation - **Shared Processor**: `docs-processor.js` utility consolidates all documentation processing logic - **Centralized Link Fixing**: Configuration-driven link replacement system - **Automatic Processing**: Frontmatter removal and content processing - **Static Asset**: LLM bundle served as `/kartuli/assets/kartuli-llm.txt` ### ADR Template Standard All decision documents follow this standardized template format with frontmatter metadata and consistent section structure for maintainability and consistency across the documentation system. --- ## Development # AI-Assisted Workflow Complete workflow for contributing to Kartuli using AI assistance, from idea to merged PR. ## Overview The workflow has 4 phases: 1. **Refinement: Idea → Issue** - Refine idea into GitHub issue 2. **Implementation: Issue → PR** - Implement the issue 3. **Integration: PR → Main** - Review and merge 4. **Cleanup** - Post-merge housekeeping Each phase can be done manually or with AI assistance. ## Refinement: Idea → Issue ### Manual 1. Think through requirements 2. Define acceptance criteria 3. Create issue using GitHub template 4. Check `[x]` label boxes 5. Labels auto-apply via GitHub Action ### With AI 1. **Use AI assistance** to refine ideas and generate GitHub issues - **Phase 1:** Discuss idea with AI to refine requirements and acceptance criteria - **Phase 2:** Ask AI to format as GitHub issue with proper labels 2. 
**Create issue on GitHub**
   - Review AI-generated markdown
   - Copy/paste into GitHub issue template
   - Labels auto-apply via GitHub Action

**Result:** Issue created in project board (Backlog status)

## Implementation: Issue → PR

### Create Branch

**Option A:** GitHub auto-create (recommended)
- Click "Create a branch" in issue sidebar
- Auto-linked to issue
- Share issue URL with AI to find branch

**Option B:** Manual creation
- Create branch following naming convention
- Link PR to issue using `Closes #X`

### Implement

**Use AI assistance** for implementation:

With Cursor or similar AI agent:
1. Share issue URL with AI
2. AI finds linked branch and switches to it
3. AI guides through: docs → implement → test → commit → PR
4. Confirm at decision points (commit message, PR updates)
5. Copy PR content AI generates

**Result:** PR created, labels auto-propagate from issue

### Handle Bot Feedback Loop

Automated code review bots (like Qodo) may provide suggestions. This can loop multiple times:

**For each review cycle:**
1. **Share PR link** with AI agent after it's created
2. **Review suggestions** - AI helps analyze each bot suggestion
3. **Discuss & decide** - Determine what to implement/reject/defer
4. **Implement approved changes:**
   - AI makes changes on same branch
   - Test and verify changes work
   - AI offers to generate formatted response comment for the bot
   - You confirm, AI provides markdown to copy/paste to PR
   - AI proposes commit message
   - You confirm, AI commits and pushes
5. **Wait for next bot review** and repeat

**Bot response format** (AI generates this for you):
- Lists each suggestion with team decision (✅ Applied / ❌ Rejected / ⏸️ Deferred)
- Includes team's reasoning for each decision
- Provides external AI perspective on the decisions made

## Integration: PR → Main

1. Review PR yourself
2. Address any human feedback
3. Merge when ready

**Result:** Issue + PR auto-close and move to "Done"

## Cleanup

```bash
git checkout main
git pull origin main
git branch -d {branch-name}
```

## Project Board Tracking

| Status | When | How |
|--------|------|-----|
| Backlog | Issue created | Auto |
| Ready | Spec complete | Manual |
| In Progress | PR opened | Auto (maybe - test this) |
| Done | PR merged | Auto |

## Quick Reference

- **Conventions:** [Code](./tech/development/code-conventions.md) | [GitHub](./tech/development/github-workflow.md)

---

# Code conventions

## Commit Convention

Always use conventional commit format for all commits:

### Format

```
<type>[optional scope]: <description>
```

### Supported Types

- **feat**: New feature
- **fix**: Bug fix
- **chore**: Infrastructure, setup tasks, non-feature work
- **docs**: Documentation changes
- **test**: Testing-related changes
- **refactor**: Code refactoring
- **perf**: Performance improvements
- **style**: Code style changes (formatting, etc.)
- **ci**: CI/CD changes

### Scope Examples

Use scopes to indicate which part of the monorepo is affected:

- `game-client` - Game client application
- `backoffice-client` - Backoffice client application
- `ui` - UI package
- `storybook` - Storybook tool
- `e2e` - E2E testing
- `global` - Shared packages or general repository tasks

### Examples

```bash
feat(game-client): add user authentication
fix(ui): resolve button alignment on mobile
docs: update contributing guidelines
chore(e2e): upgrade Playwright to v1.40
test: add unit tests for auth module
refactor(ui): extract common button component
perf(game-client): optimize image loading
style: fix code formatting
ci: add automated dependency updates
```

### Enforcement

- **Local**: Git hooks validate commit messages automatically
- **CI**: PR titles must follow the same format
- **Specification**: [Conventional Commits](https://www.conventionalcommits.org/)

## Code Style

- Use TypeScript for all new code
- Follow Biome configuration for linting and formatting
- Write meaningful variable and function names
- Fix any warnings or errors immediately when they appear

## Architecture

- Prefer serverless and managed solutions
- Optimize for cost efficiency
- Design for offline-first functionality
- Use PWA best practices

## Testing

- Test files live next to the files they test
- Avoid separate test folders
- Use descriptive test names
- Test both the happy path and edge cases

## Documentation

- All documentation changes trigger automated workflows
- LLM bundle is generated automatically using the shared `docs-processor.js` utility
- VitePress builds with proper base URL configuration for GitHub Pages
- Use proper frontmatter format for navigation integration
- Follow the ADR template for decision documents
- Link fixes are centralized in the shared processor for maintainability

---

# GitHub Workflow

This document describes how we use GitHub for collaboration, issue tracking, and automation.

## Issues and Pull Requests

### Creating Issues

All work should start with a GitHub issue. Use our issue template to ensure consistency:

1. Navigate to **Issues** → **New Issue**
2. Select **Feature or Task** template
3. Fill out all required sections:
   - **Type**: Select one (feat, chore, fix, docs, test)
   - **Scope**: Select applicable scope(s) (game-client, backoffice-client, ui, storybook, e2e, global)
   - **Description**: Clearly describe what needs to be done
   - **Acceptance Criteria**: List specific conditions for completion
4. Optionally add:
   - **Size**: Estimate effort (small/medium/large)
   - **Priority**: Set priority level (high/medium/low)
   - **Extra Tags**: Mark as `good first issue` or `help wanted` if appropriate

### Creating Pull Requests

#### From an Issue (Recommended)

The recommended approach is to create a PR directly from an issue:

1. Navigate to the issue
2. Click **"Create a branch"** in the right sidebar (or **"Development"** section)
3. This creates a linked branch and helps GitHub auto-link the PR to the issue

#### Manual PR Creation

If creating a PR manually:

1. Create your feature branch locally
2. Make your changes following our coding standards
3. Push your branch to GitHub
4. Create a pull request using our PR template
5. **Important**: Link to the issue using keywords in the PR description:
   - `Closes #123` - Closes the issue when PR is merged
   - `Fixes #456` - Same as `Closes`
   - `Resolves #789` - Same as `Closes`

#### PR Title Format

PR titles **must** follow our [conventional commit format](./code-conventions.md#commit-convention).
**Quick examples**: - `feat(game-client): add user authentication system` - `fix(ui): resolve button alignment on mobile` - `docs: update contributing guidelines` - `chore(e2e): upgrade Playwright to v1.40` ### Label Propagation When you create a PR from an issue or link it properly using keywords (`Closes #123`), our automation will: - Automatically copy labels from the issue to the PR - This helps maintain consistent labeling across issues and PRs ## Labels Structure Our repository uses a structured labeling system defined in `.github/labels.yml`. ### Scope Labels (Blue - `#0075ca`) Define which part of the monorepo is affected: - `scope:game-client` - Game client application - `scope:backoffice-client` - Backoffice client application - `scope:ui` - UI package - `scope:storybook` - Storybook tool - `scope:e2e` - E2E testing - `scope:global` - Shared packages or general repository tasks ### Type Labels (Green - `#28a745`) Aligned with conventional commit types: - `type:feat` - New feature - `type:chore` - Infrastructure, setup tasks, non-feature work - `type:fix` - Bug fix - `type:docs` - Documentation changes - `type:test` - Testing-related changes ### Priority Labels (Red/Orange/Yellow) Indicate urgency and importance: - `priority:high` (Red - `#d73a4a`) - Urgent, blocking issues - `priority:medium` (Orange - `#ff9800`) - Important, should be addressed soon - `priority:low` (Yellow - `#ffd700`) - Nice to have, can wait ### Size Labels (Purple - `#9c27b0`) Estimate the effort required: - `size:small` - Quick fix, minor change (< 2 hours) - `size:medium` - Moderate effort (2-8 hours) - `size:large` - Significant work (> 8 hours) ### Extra Labels Special purpose labels: - `good first issue` (Purple - `#7057ff`) - Suitable for newcomers - `help wanted` (Teal - `#008672`) - Community help is appreciated ### Applying Labels **On Issues**: - Check `[x]` the appropriate label boxes in the issue template - Labels are automatically applied by GitHub Action - Manual addition also supported if needed **On PRs**: - Labels are automatically propagated from linked issues - Manual addition also supported if needed **Required labels**: Type and Scope (at least one of each) ## Linking Issues and PRs ### Automatic Linking GitHub supports several methods to link PRs to issues: 1. **Using the GitHub UI**: Click "Create a branch" or "Development" section on an issue 2. **Using keywords in PR description**: `Closes #123`, `Fixes #456`, `Resolves #789` 3. **Using the PR sidebar**: Select linked issues in the "Development" section ### Closing Issues Automatically When you use keywords like `Closes #123` in your PR description, GitHub will: - Link the PR to issue #123 - Automatically close issue #123 when the PR is merged - Add a reference in the issue timeline ### Multiple Issues If your PR addresses multiple issues: ``` Closes #123 Closes #456 Fixes #789 ``` ## GitHub Actions ### Syncing Labels to GitHub Labels are defined in `.github/labels.yml`. To sync them with GitHub: 1. Navigate to **Actions** tab 2. Select **"Labels Sync Available on GitHub from Repo Config"** workflow 3. Click **"Run workflow"** → **"Run workflow"** 4. 
Wait for completion (usually < 30 seconds) **When to sync**: - After adding new labels to `.github/labels.yml` - After modifying label names, colors, or descriptions - When setting up a new repository ### Automatic Label Propagation The **"Labels Propagate to PR from Linked Issue"** workflow runs automatically when: - A new pull request is created - The PR description contains issue keywords (`Closes #123`, `Fixes #456`, etc.) This workflow: 1. Extracts the linked issue number from the PR description 2. Fetches labels from that issue 3. Applies the same labels to the PR No manual action required - it runs automatically! ### Automatic Issue Labeling The **"Labels Auto Apply to Issues from Template"** workflow runs automatically when: - A new issue is created - An existing issue is edited This workflow: 1. Scans the issue body for checked label boxes (e.g., `- [x] type:feat`) 2. Extracts all checked labels 3. Applies those labels to the issue automatically **How to use:** 1. When creating an issue, check `[x]` the appropriate label boxes in the template 2. Submit the issue 3. Labels are automatically applied within seconds 4. Verify labels appear on the issue **Supported format:** - `- [x] type:feat` ✅ - `- [X] scope:global` ✅ (case-insensitive) - With or without backticks: `` `type:feat` `` or `type:feat` **Note:** If labels don't auto-apply: - Check the Actions tab for workflow errors - Verify boxes are marked with `[x]` - Manually add labels if automation fails ## Templates ### Issue Template Located at `.github/ISSUE_TEMPLATE/feature_or_task.md` Single comprehensive template for all issues (features, bugs, tasks, documentation). **Required fields**: - Type (feat, chore, fix, docs, test) - Scope (game-client, backoffice-client, ui, storybook, e2e, global) - Description - Acceptance Criteria **Optional fields**: - Size estimate - Priority level - Extra tags (good first issue, help wanted) - Notes and references ### Pull Request Template Located at `.github/pull_request_template.md` Standard template for all pull requests. **Required sections**: - Description - Linked issues (using `Closes #` keywords) - Type selection - Scope selection **Optional sections**: - Screenshots - Preview links - Testing notes **Important**: PR title must follow conventional commit format. ## Git Hooks We use **Lefthook** to run fast, local checks that improve developer experience without replacing CI as the source of truth. Hooks are automatically installed when you run `pnpm install`. 
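One of the checks these hooks run is commit-message validation (see "Commit-msg" under "What Hooks Do" below). As an illustration only, since the real check is wired up through Lefthook and this exact script does not exist in the repo, a commit-msg check could look roughly like this:

```ts
// scripts/check-commit-msg.ts: hypothetical illustration of a commit-msg check
import { readFileSync } from 'node:fs';

const types = ['feat', 'fix', 'chore', 'docs', 'test', 'refactor', 'perf', 'style', 'ci'];

// Git writes the message to a file; a Lefthook commit-msg hook can forward its path
// as the first CLI argument (for example via Lefthook's `{1}` placeholder).
const messageFile = process.argv[2] ?? '.git/COMMIT_EDITMSG';
const firstLine = readFileSync(messageFile, 'utf8').split('\n')[0];

// <type>(<optional scope>): <description>
const pattern = new RegExp(`^(${types.join('|')})(\\([a-z0-9-]+\\))?: .+`);

if (!pattern.test(firstLine)) {
  console.error(`Invalid commit message: "${firstLine}"`);
  console.error('Expected: <type>(<scope>): <description>, e.g. "feat(game-client): add user authentication"');
  process.exit(1);
}
```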
### What Hooks Do

**Pre-commit** (target: <3s):
- **Format**: Auto-fixes code formatting using Biome
- **Lint**: Runs linting with safe auto-fixes, fails on remaining errors
- **Foot-gun guards**:
  - Blocks merge conflict markers (`<<<<<<<`, `=======`, `>>>>>>>`)
  - Prevents large files (>5MB) in `src/` directories
  - Warns about debug patterns (`console.log`, `debugger`, `alert()`)

**Commit-msg**:
- **Conventional Commits**: Enforces commit message format
- Format: `<type>(<scope>): <description>`
- Supported types: `feat`, `fix`, `chore`, `docs`, `test`, `refactor`, `perf`, `style`, `ci`

**Pre-push** (target: <60s):
- **Typecheck**: Runs TypeScript checks on affected packages only
- **Unit tests**: Runs tests on affected packages (excludes E2E tests)

### Time Budgets

- **Pre-commit**: <3 seconds on typical changes
- **Pre-push**: <60 seconds on typical changes
- Hooks only run on staged files (pre-commit) or affected workspaces (pre-push)

### Fixing Hook Failures

**Formatting issues**:
```bash
pnpm lint:fix # Auto-fix formatting and linting issues
```

**Commit message format**:
```bash
git commit --amend -m "feat(game-client): add user authentication"
```

**Large files**:
- Remove large files from `src/` directories
- Add them to `.gitignore` if they're needed elsewhere

**Debug patterns**:
- Remove `console.log`, `debugger`, `alert()` statements
- Or use `git commit --no-verify` for emergency commits

### Bypass Policy

**Emergency bypass**: Use `git commit --no-verify` or `git push --no-verify`

**Important**: Bypassing hooks doesn't bypass CI. All merges are still gated by required CI checks (typecheck, lint, tests, E2E).

### Setup

Hooks are automatically installed when you run:
```bash
pnpm install
```

The `prepare` script runs `lefthook install` after package installation.

## AI-Assisted Workflow

For complete guidance on using AI to assist with issue creation and implementation, see the **[AI-Assisted Workflow Guide](./tech/development/ai-assisted-workflow.md)**.

This includes:
- Using AI to refine ideas into well-structured issues
- Implementing issues with AI agents (Cursor, Copilot, etc.)
- Handling automated bot feedback
- Complete workflow from idea to merged PR

---

# Tech Stack & Providers

## Stack

Status: 🟢 Active | ⚫ Planned | 🟡 To be discussed

| Technology | Status | Layer | Notes |
| ---------- | ------ | ----- | ----- |
| **git** | 🟢 | Version Control | |
| **pnpm** | 🟢 | Package Manager | Workspace management |
| **turborepo** | 🟢 | Build System | Monorepo management |
| **TypeScript** | 🟢 | Language | |
| **Biome** | 🟢 | Code Quality | • Linting<br>• Formatting |
| **Tailwind** | 🟢 | Styling | v4 + token-driven design |
| **React** | 🟢 | UI Library | |
| **react-aria** | ⚫ | UI Components | • Accessibility<br>• WCAG compliance |
| **Next.js** | 🟢 | Framework | App Router |
| **Turbopack** | 🟢 | Bundler | Next.js applications |
| **Vite** | 🟢 | Bundler | • Storybook<br>• VitePress |
| **rxdb** | ⚫ | Storage (Client) | Sync management |
| **IndexedDB** | ⚫ | Storage (Client) | Offline support |
| **Cache Storage** | ⚫ | Storage (Client) | PWA asset caching |
| **CDN** | ⚫ | Infrastructure | Cloudflare |
| **Postgres** | ⚫ | Database (Server) | Supabase |
| **PWA** | ⚫ | Platform | Installable app |
| **vitest** | 🟢 | Testing | • Integration tests<br>• Unit tests |
| **Playwright** | 🟢 | Testing | E2E tests |
| **Lighthouse CI** | ⚫ | Quality | • Performance<br>• Accessibility monitoring<br>• Runs on GitHub Actions |
| **Fuse.js** | ⚫ | Search | • Client-side<br>• Offline-first |
| | 🟡 | Internationalization | Options:<br>• intlayer<br>• next-intl<br>• i18next |
| **Markdown** | 🟢 | Content | • Documentation<br>• Info pages (terms, privacy) |
| **Storybook** | 🟢 | Documentation & Development | • Component development<br>• Documentation<br>• Theme preview |
| **VitePress** | 🟢 | Documentation Site | • Project documentation<br>• Hosted on GitHub Pages<br>• LLM bundle generation<br>• Shared docs processor |

## Providers

One entry per service, even if several services share the same provider.

Status: 🟢 Active | ⚫ Planned

| Service | Provider | Status | Links | Notes |
| ------- | -------- | ------ | ----- | ----- |
| **Version Control** | GitHub | 🟢 | https://github.com/kartuli-app/ | |
| **CI/CD** | GitHub | 🟢 | https://github.com/kartuli-app/ | • Labels sync<br>• Label propagation from issue to PR<br>• Documentation deployment<br>• LLM bundle generation |
| **Dependency Updates** | Mend.io | ⚫ | https://github.com/marketplace/renovate | • Automated dependency bot for PRs<br>• GitHub integration |
| **Projects** | GitHub | 🟢 | https://github.com/kartuli-app/ | • Issue tracking<br>• Project boards |
| **Hosting (Documentation)** | GitHub Pages | 🟢 | https://pages.github.com | VitePress documentation site |
| **AI Code Review** | Qodo | 🟢 | https://qodo.ai | GitHub integration |
| **Hosting** | Vercel | ⚫ | https://vercel.com | Next.js optimized |
| **Serverless Functions** | Vercel | ⚫ | https://vercel.com | API endpoints |
| **Database** | Supabase | ⚫ | https://supabase.com | • Used for student activity and CMS<br>• Frankfurt region (closest to Georgia, good speed for continental Europe) |
| **Authentication** | Supabase | ⚫ | https://supabase.com | • Google social login<br>• Facebook social login |
| **File Storage** | Supabase | ⚫ | https://supabase.com | • Assets<br>• Content packs |
| **CDN** | Cloudflare | ⚫ | https://cloudflare.com | Serves assets and content packs from Supabase |
| **Domain** | Cloudflare | 🟢 | https://cloudflare.com | |
| **Email Services** | Cloudflare | ⚫ | https://cloudflare.com | Captures email sent to any domain address |
| **Analytics** | PostHog | ⚫ | https://posthog.com | • User behavior<br>• Consent-based |
| **Error Tracking** | Sentry | ⚫ | https://sentry.io | |
| **Performance Monitoring** | New Relic | ⚫ | https://newrelic.com | |
| **Uptime Monitoring** | BetterStack | ⚫ | https://betterstack.com | • Heartbeats<br>• Status pages |
| **Visual Testing** | Chromatic | ⚫ | https://chromatic.com | • Visual regression<br>• UI review |
| **Forms & Surveys** | Tally | ⚫ | https://tally.so | • Anonymous surveys<br>• User feedback |

---
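Fuse.js is still only planned, but the table above hints at why it fits the offline-first constraint: the index is built entirely in the browser from content that has already been downloaded, so search keeps working without a network connection. A minimal, hypothetical sketch (the vocabulary data shape is invented for illustration) could look like this:

```ts
// Hypothetical client-side search over already-downloaded vocabulary (Fuse.js is still planned).
import Fuse from 'fuse.js';

interface VocabularyEntry {
  georgian: string;
  transliteration: string;
  english: string;
}

// In the real app this data would come from a local content pack (IndexedDB / Cache Storage).
const entries: VocabularyEntry[] = [
  { georgian: 'გამარჯობა', transliteration: 'gamarjoba', english: 'hello' },
  { georgian: 'მადლობა', transliteration: 'madloba', english: 'thank you' },
];

const fuse = new Fuse(entries, {
  keys: ['georgian', 'transliteration', 'english'],
  threshold: 0.3, // tolerate small typos in learner input
});

// Works identically online and offline, since the index lives in memory on the client.
const results = fuse.search('gamarjoba').map((result) => result.item);
console.log(results); // → [{ georgian: 'გამარჯობა', ... }]
```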