
Simularity with local inference (Llama.cpp, Tauri), run as an app (AI Visual Novels with scenarios (https://github.com/vladfaust/simularity-scenarios))

vladfaust/simularity

Simularity

This repository contains mathematical proof of simulated reality (simularity).

The project can run locally (currently macOS on Apple M-series chips only). Alternatively, cloud inference is available at simularity.ai.

Demo 🚀

Click to play a YouTube video:

Simularity Chat Mode Demo

Simularity Visual Novel Demo

Deployment 🚢

Quick Start

Run development web server:

cd packages/client
npm run dev

Run the Tauri application:

cd packages/tauri
cargo tauri dev

Simularity depends on scenarios. Download example scenarios from the simularity-scenarios repository and place them into the /Users/user/Library/Application Support/ai.simularity.dev/scenarios directory.
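The scenario setup above can be sketched as a single shell snippet (a sketch, assuming git is installed; the directory path is the macOS one given in the text, and the clone URL is the simularity-scenarios repository linked above):

```shell
# Directory the Tauri app scans for scenarios (macOS path from the README)
SCENARIOS_DIR="$HOME/Library/Application Support/ai.simularity.dev/scenarios"
mkdir -p "$SCENARIOS_DIR"

# Fetch the example scenarios (shallow clone) and copy them in
TMP_DIR="$(mktemp -d)"
if git clone --depth=1 https://github.com/vladfaust/simularity-scenarios "$TMP_DIR/scenarios"; then
  cp -R "$TMP_DIR/scenarios/." "$SCENARIOS_DIR/"
fi
```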

Dokku

See packages/api/README.md and packages/web/README.md for Dokku deployment instructions.

CI

Windows Host on Hetzner

  1. Follow https://docs.hetzner.com/cloud/servers/windows-on-cloud/#example-instructions to install Windows Server 2022 Standard:

    1. Rent an Ubuntu server.

    2. Mount Windows Server 2022 English ISO, reboot the server, connect via Hetzner console.

    3. Proceed with the installation.

    4. When at the disks page, mount virtio-win-0.1.248.iso, click "Load driver", and install the following drivers:

      1. Balloon/2k22/amd64,
      2. NetKVM/2k22/amd64,
      3. vioscsi/2k22/amd64.
    5. Switch back to the Windows Server 2022 ISO.

    6. Remove all disk partitions, create a new one.

  2. Connect via Remote Desktop.

  3. These are the variables you'll need to set:

    $buildkiteAgentToken = "TOKEN"
    $sshKeyUser = "[email protected]"
    $userPassword = "ADMIN_PASSWORD"
  4. Download VS Build Tools and install the following:

    1. MSVC,
    2. Windows SDK,
    3. CMake,
    4. Windows Universal CRT SDK (from individual components).
    $ProgressPreference = 'SilentlyContinue'; Invoke-WebRequest https://aka.ms/vs/17/release.ltsc.17.10/vs_buildtools.exe -OutFile ~\Downloads\vs_BuildTools.exe; ~\Downloads\vs_buildtools.exe `
      --quiet --wait --includeRecommended `
      --add Microsoft.VisualStudio.Workload.VCTools `
      --add Microsoft.Component.VC.Runtime.UCRTSDK `
      --add Microsoft.VisualStudio.Component.VC.CMake.Project
    Get-Process -Name "vs_*", "setup*"
  5. Download and install CUDA with

    $ProgressPreference = 'SilentlyContinue'; Invoke-WebRequest -Uri "https://developer.download.nvidia.com/compute/cuda/12.6.1/network_installers/cuda_12.6.1_windows_network.exe" -OutFile ~\Downloads\cuda_12.6.1_windows_network.exe; ~\Downloads\cuda_12.6.1_windows_network.exe -s
    Get-Process -Name "cuda*", "setup*"

    See https://www.server-world.info/en/note?os=Windows_Server_2022&p=cuda.

  6. After CUDA is installed, copy the MSBuild extensions so that CMake can find the CUDA toolset (see https://stackoverflow.com/questions/56636714/cuda-compile-problems-on-windows-cmake-error-no-cuda-toolset-found):

    cp "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.6\extras\visual_studio_integration\MSBuildExtensions\*" "C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\BuildCustomizations\"

  7. Install Scoop with iex "& {$(irm get.scoop.sh)} -RunAsAdmin".

  8. Install required packages with scoop install git ninja ccache nano nssm nodejs-lts.

  9. Run git config --system core.longpaths true (https://stackoverflow.com/questions/22041752/github-clone-succeeded-but-checkout-failed) and git config --global core.sshCommand "C:/Windows/System32/OpenSSH/ssh.exe" (see https://stackoverflow.com/a/79075865).

  10. Download RustUp with $ProgressPreference = 'SilentlyContinue'; Invoke-WebRequest -Uri "https://win.rustup.rs/x86_64" -OutFile ~\Downloads\rustup.exe.

  11. Install Rust toolchain with ~\Downloads\rustup.exe default stable.

  12. Add Rust to Path (temporarily) with $env:Path += ";C:\Users\Administrator\.rustup\toolchains\stable-x86_64-pc-windows-msvc\bin;C:\Users\Administrator\.cargo\bin".

  13. Install Tauri CLI globally with cargo install tauri-cli (takes a long time).

  14. Install Buildkite agent with https://buildkite.com/docs/agent/v3/windows:

    $env:buildkiteAgentToken = $buildkiteAgentToken
    Set-ExecutionPolicy Bypass -Scope Process -Force
    iex ((New-Object System.Net.WebClient).DownloadString('https://raw.githubusercontent.com/buildkite/agent/main/install.ps1'))
  15. Generate an SSH key with ssh-keygen -t rsa -b 4096 -C $sshKeyUser; cat C:\Users\Administrator/.ssh/id_rsa.pub. Add this key to the Git repository.

  16. Edit Buildkite config with nano C:\buildkite-agent\buildkite-agent.cfg:

    1. Set tags to tags="queue=buildkite-agent-windows".
    2. Enable PowerShell by adding the line shell="C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe".
    3. Set git-clone-flags=-v --depth=1.
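Put together, the edited C:\buildkite-agent\buildkite-agent.cfg would contain lines roughly like the following (a sketch; the token value is a placeholder and other defaults are omitted):

```
token="TOKEN"
tags="queue=buildkite-agent-windows"
shell="C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe"
git-clone-flags=-v --depth=1
```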
  17. Register the Buildkite agent as a Windows service with NSSM:

    nssm install buildkite-agent "C:\buildkite-agent\bin\buildkite-agent.exe" "start"
    nssm set buildkite-agent AppStdout "C:\buildkite-agent\buildkite-agent.log"
    nssm set buildkite-agent AppStderr "C:\buildkite-agent\buildkite-agent.log"
    nssm set buildkite-agent ObjectName "$Env:ComputerName\$Env:UserName" "$userPassword"
    nssm start buildkite-agent

History 📜

My previous simulation project, aistories, depended on third-party AI providers. This project is a self-contained simulation engine that can run locally thanks to Llama.cpp.

Marketing Ideas 📈

Research 📚

Token healing

Streaming LLM

A.k.a. Context Shifting.

Paper and code: https://github.com/mit-han-lab/streaming-llm.

Llama.cpp

Roleplay

Training

Serving
