<?xml version="1.0" encoding="UTF-8"?><rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>sgolovin</title>
    <link>https://sgolovin.live/</link>
    <description>[github](https://github.com/daynin) - [mastodon](https://mastodon.social/@sgolovin) - [twitter](https://x.com/_sgolovin)</description>
    <pubDate>Thu, 07 May 2026 22:38:44 +0000</pubDate>
    <item>
      <title>Kagi: search that feels like the old web (without the junk)</title>
      <link>https://sgolovin.live/kagi-search-that-feels-like-the-old-web-without-the-junk?pk_campaign=rss-feed</link>
      <description>&lt;![CDATA[kagi logo&#xA;&#xA;I started using Kagi because I was tired of digging through ads and SEO spam just to find a straight answer. With Kagi I can actually search again. The interface is clean, the results load fast, and there’s far less noise. It reminds me of the early web - simple and straightforward - while still feeling modern. After a short time it mostly replaced AI chat tools for my search needs, because I’d rather read real sources than risk confident guesses.&#xA;&#xA;What makes Kagi click for me is the focus on the result, not the revenue. I’m not forced to scroll past a wall of sponsored links or affiliate “roundups.” I type what I want and I get useful pages: docs, forums, blogs, papers. That’s the whole point of a search engine, and Kagi gets out of the way so I can reach the information faster.&#xA;&#xA;Lenses: choose your slice of the web&#xA;&#xA;search example&#xA;&#xA;The feature that sold me is Lenses. A Lens lets me choose which slice of the internet I want to look through. If I’m debugging, I set the lens to documentation or forums and I immediately get official references and real-world threads instead of generic SEO content. When I want deeper takes, I switch to blogs and personal sites to find people who actually built or tested the thing. One small toggle turns a vague query into a focused search. It also reduces the mental filtering I used to do on other engines—less skimming, more learning.&#xA;&#xA;A simple, fast workflow&#xA;&#xA;My workflow is straightforward. I start broad, then narrow. I’ll write the plain question first, see what comes back, and then steer with a Lens. I still use basic operators because they’re fast and predictable: exact phrases with &#34;like this&#34;, excluding terms with -template, and pinning to a domain with site:docs.example.dev. Two or three promising tabs open in the background, I skim quickly, pick the winner, and close the rest.
The whole loop feels efficient because the junk is mostly gone.&#xA;&#xA;Why it replaced AI chat for me&#xA;&#xA;Kagi helps me avoid the “confident but wrong” problem. Instead of a polished paragraph that might not be true, I get links to sources I can verify myself. It’s faster to trust what I can read than to audit an answer that only sounds right. For research, troubleshooting, and learning new tools, that difference matters. That’s why Kagi replaced AI chat in my search routine.&#xA;&#xA;Trade-offs worth making&#xA;&#xA;Kagi is paid, and I’m fine with that; the cost replaces the time I used to waste digging through ads. Index size and coverage can vary compared to the big engines, so once in a while I’ll check a second source. But the net effect is positive: less distraction, more signal, and better use of my attention.&#xA;&#xA;Final thoughts&#xA;&#xA;Kagi brings back intentional search. I spend less time fighting ads and second-guessing slick summaries, and more time reading the sources that matter. If you want the best parts of the old web—clean, useful, direct—Kagi delivers.]]&gt;</description>
      <content:encoded><![CDATA[<p><img src="https://i.snap.as/GhsFxzDV.jpg" alt="kagi logo"/></p>

<p>I started using <strong>Kagi</strong> because I was tired of digging through ads and SEO spam just to find a straight answer. With Kagi I can actually search again. The interface is clean, the results load fast, and there’s far less noise. It reminds me of the early web – simple and straightforward – while still feeling modern. After a short time it mostly replaced AI chat tools for my search needs, because I’d rather read real sources than risk confident guesses.</p>

<p>What makes Kagi click for me is the focus on the result, not the revenue. I’m not forced to scroll past a wall of sponsored links or affiliate “roundups.” I type what I want and I get useful pages: docs, forums, blogs, papers. That’s the whole point of a search engine, and Kagi gets out of the way so I can reach the information faster.</p>



<h2 id="lenses-choose-your-slice-of-the-web">Lenses: choose your slice of the web</h2>

<p><img src="https://i.snap.as/wPC1Xp8v.png" alt="search example"/></p>

<p>The feature that sold me is <strong>Lenses</strong>. A Lens lets me choose which slice of the internet I want to look through. If I’m debugging, I set the lens to documentation or forums and I immediately get official references and real-world threads instead of generic SEO content. When I want deeper takes, I switch to blogs and personal sites to find people who actually built or tested the thing. One small toggle turns a vague query into a focused search. It also reduces the mental filtering I used to do on other engines—less skimming, more learning.</p>

<h2 id="a-simple-fast-workflow">A simple, fast workflow</h2>

<p>My workflow is straightforward. I start broad, then narrow. I’ll write the plain question first, see what comes back, and then steer with a Lens. I still use basic operators because they’re fast and predictable: exact phrases with <code>&#34;like this&#34;</code>, excluding terms with <code>-template</code>, and pinning to a domain with <code>site:docs.example.dev</code>. Two or three promising tabs open in the background, I skim quickly, pick the winner, and close the rest. The whole loop feels efficient because the junk is mostly gone.</p>

<h2 id="why-it-replaced-ai-chat-for-me">Why it replaced AI chat for me</h2>

<p>Kagi helps me avoid the “confident but wrong” problem. Instead of a polished paragraph that might not be true, I get links to sources I can verify myself. It’s faster to trust what I can read than to audit an answer that only sounds right. For research, troubleshooting, and learning new tools, that difference matters. That’s why Kagi replaced AI chat in my search routine.</p>

<h2 id="trade-offs-worth-making">Trade-offs worth making</h2>

<p>Kagi is paid, and I’m fine with that; the cost replaces the time I used to waste digging through ads. Index size and coverage can vary compared to the big engines, so once in a while I’ll check a second source. But the net effect is positive: less distraction, more signal, and better use of my attention.</p>

<h2 id="final-thoughts">Final thoughts</h2>

<p>Kagi brings back intentional search. I spend less time fighting ads and second-guessing slick summaries, and more time reading the sources that matter. If you want the best parts of the old web—clean, useful, direct—Kagi delivers.</p>
]]></content:encoded>
      <guid>https://sgolovin.live/kagi-search-that-feels-like-the-old-web-without-the-junk</guid>
      <pubDate>Sun, 10 Aug 2025 01:42:21 +0000</pubDate>
    </item>
    <item>
      <title>Managing Multi-Language Development Environments with Nix Flakes</title>
      <link>https://sgolovin.live/managing-multi-language-development-environments-with-nix-flakes?pk_campaign=rss-feed</link>
      <description>&lt;![CDATA[Introduction&#xA;&#xA;Modern software development frequently requires coordinating multiple programming languages within a single project ecosystem. This article describes the challenges of managing Rust-to-TypeScript workflows and presents a solution using Nix flakes, based on patterns observed in real-world development projects.&#xA;&#xA;The examples presented here are hypothetical but derived from actual implementation patterns used in production environments where high-performance Rust libraries are compiled to WebAssembly and consumed by TypeScript frontend applications.&#xA;&#xA;Common Challenges in Multi-Language Development&#xA;&#xA;Projects that combine Rust and TypeScript typically encounter several coordination challenges:&#xA;&#xA;Toolchain Management: Different components require specific versions of Rust, Node.js, and associated build tools&#xA;Build Reproducibility: Ensuring consistent compilation results across development, CI, and production environments&#xA;Dependency Coordination: Managing the interface between Rust compilation outputs and TypeScript build processes&#xA;Environment Consistency: Maintaining identical development setups across team members&#xA;Branch Synchronization: Coordinating different versions of core libraries with corresponding frontend applications&#xA;Manual Integration Steps: Copying build artifacts between repositories and maintaining build scripts&#xA;&#xA;Traditional approaches using Docker containers, shell scripts, or manual environment management often prove insufficient for addressing the full scope of these coordination challenges.&#xA;&#xA;Project Structure Analysis&#xA;&#xA;Consider a typical multi-repository project structure combining Rust and TypeScript components:&#xA;&#xA;project-ecosystem/&#xA;├── json-parser/              # Rust library for WebAssembly compilation&#xA;│   ├── Cargo.toml&#xA;│   ├── src/lib.rs&#xA;│   └── build.sh              #
Build coordination script&#xA;├── web-application/          # Primary TypeScript application&#xA;│   ├── package.json&#xA;│   ├── webpack.config.js&#xA;│   ├── src/&#xA;│   └── src/utils/jsoncore/  # WebAssembly integration directory&#xA;├── dashboard/                # Secondary React application&#xA;│   ├── package.json&#xA;│   └── src/&#xA;└── integrations/             # Platform-specific TypeScript components&#xA;    ├── widget-a/&#xA;    ├── widget-b/&#xA;    └── iframe-embed/&#xA;&#xA;This structure presents several operational challenges:&#xA;&#xA;Toolchain Synchronization: Each developer must maintain compatible versions of Rust, Node.js, wasm-pack, and related tools&#xA;Build Script Complexity: Coordination scripts become increasingly complex as they handle cross-compilation, file copying, and dependency management&#xA;Manual Artifact Management: WebAssembly files must be manually transferred from Rust build outputs to TypeScript project directories&#xA;Branch Coordination: Different development branches require corresponding versions of core libraries&#xA;CI/CD Complexity: Continuous integration systems require complex Docker configurations and dependency caching strategies&#xA;&#xA;Nix Flakes as a Solution Framework&#xA;&#xA;Nix flakes provide a declarative approach to managing multi-language development environments. 
The solution addresses the coordination challenges through:&#xA;&#xA;Unified Dependency Declaration: Single configuration files that specify exact tool versions and dependencies&#xA;Environment Reproducibility: Identical development environments across all team members and deployment targets&#xA;Automated Build Coordination: Programmatic integration between Rust compilation and TypeScript build processes&#xA;Branch-Based Dependency Management: Dynamic dependency resolution based on git branch references&#xA;&#xA;Foundational Flake Structure&#xA;&#xA;A basic Nix flake for multi-language development begins with input declarations:&#xA;&#xA;{&#xA;  description = &#34;Multi-language development environment&#34;;&#xA;&#xA;  inputs = {&#xA;    nixpkgs.url = &#34;github:NixOS/nixpkgs/nixos-unstable&#34;;&#xA;    rust-overlay.url = &#34;github:oxalica/rust-overlay&#34;;&#xA;    flake-utils.url = &#34;github:numtide/flake-utils&#34;;&#xA;  };&#xA;&#xA;  outputs = { self, nixpkgs, rust-overlay, flake-utils }:&#xA;    # Implementation details follow&#xA;}&#xA;&#xA;The inputs section declares external dependencies required for the project. 
The rust-overlay provides access to specific Rust toolchain versions, while flake-utils reduces boilerplate code.&#xA;&#xA;Overlays enable precise version control for language-specific tools:&#xA;&#xA;let&#xA;  overlays = [&#xA;    rust-overlay.overlays.default&#xA;    (final: prev: {&#xA;      # Specify exact Node.js version&#xA;      nodejs = prev.nodejs20;&#xA;      yarn = prev.yarn.override { nodejs = prev.nodejs20; };&#xA;    })&#xA;  ];&#xA;  &#xA;  pkgs = import nixpkgs {&#xA;    inherit system overlays;&#xA;  };&#xA;in&#xA;&#xA;This approach eliminates version conflicts and ensures consistent toolchain availability across development environments.&#xA;&#xA;Rust-to-WebAssembly Compilation Pipeline&#xA;&#xA;The core challenge in Rust-to-TypeScript integration involves establishing a reliable compilation pipeline that handles:&#xA;&#xA;Rust Library Development - Performance-critical code implementation&#xA;&#xA;WebAssembly Cross-Compilation - Targeting the wasm32-unknown-unknown platform&#xA;&#xA;TypeScript Binding Generation - Creating type-safe interfaces using wasm-bindgen&#xA;&#xA;Build System Integration - Automating artifact delivery to frontend applications&#xA;&#xA;Multi-Branch Coordination - Supporting different development environments&#xA;&#xA;The following example demonstrates a comprehensive flake implementation for WebAssembly compilation. 
First, let&#39;s establish the required project structure:&#xA;&#xA;Project Structure:&#xA;&#xA;json-parser/&#xA;├── flake.nix&#xA;├── Cargo.toml&#xA;├── Cargo.lock&#xA;└── src/&#xA;    └── lib.rs&#xA;&#xA;Cargo.toml:&#xA;&#xA;[package]&#xA;name = &#34;json-parser&#34;&#xA;version = &#34;0.1.0&#34;&#xA;edition = &#34;2021&#34;&#xA;&#xA;[lib]&#xA;crate-type = [&#34;cdylib&#34;]&#xA;&#xA;[dependencies]&#xA;wasm-bindgen = &#34;0.2&#34;&#xA;serde = { version = &#34;1.0&#34;, features = [&#34;derive&#34;] }&#xA;serde_json = &#34;1.0&#34;&#xA;&#xA;[dependencies.web-sys]&#xA;version = &#34;0.3&#34;&#xA;features = [&#xA;  &#34;console&#34;,&#xA;]&#xA;&#xA;src/lib.rs:&#xA;&#xA;use wasm_bindgen::prelude::*;&#xA;&#xA;#[wasm_bindgen]&#xA;extern &#34;C&#34; {&#xA;    #[wasm_bindgen(js_namespace = console)]&#xA;    fn log(s: &amp;str);&#xA;}&#xA;&#xA;#[wasm_bindgen]&#xA;pub struct ParseOptions {&#xA;    indent_size: u32,&#xA;    sort_keys: bool,&#xA;}&#xA;&#xA;#[wasm_bindgen]&#xA;impl ParseOptions {&#xA;    #[wasm_bindgen(constructor)]&#xA;    pub fn new(indent_size: u32, sort_keys: bool) -&gt; ParseOptions {&#xA;        ParseOptions { indent_size, sort_keys }&#xA;    }&#xA;&#xA;    #[wasm_bindgen(getter)]&#xA;    pub fn indent_size(&amp;self) -&gt; u32 {&#xA;        self.indent_size&#xA;    }&#xA;&#xA;    #[wasm_bindgen(getter)]&#xA;    pub fn sort_keys(&amp;self) -&gt; bool {&#xA;        self.sort_keys&#xA;    }&#xA;}&#xA;&#xA;#[wasm_bindgen]&#xA;pub fn parse_json(json_string: &amp;str) -&gt; Result&lt;String, JsValue&gt; {&#xA;    let value: serde_json::Value = serde_json::from_str(json_string)&#xA;        .map_err(|e| JsValue::from_str(&amp;e.to_string()))?;&#xA;    &#xA;    serde_json::to_string(&amp;value)&#xA;        .map_err(|e| JsValue::from_str(&amp;e.to_string()))&#xA;}&#xA;&#xA;#[wasm_bindgen]&#xA;pub fn stringify_json(json_string: &amp;str, options: &amp;ParseOptions) -&gt; Result&lt;String, JsValue&gt; {&#xA;    let value: serde_json::Value = serde_json::from_str(json_string)&#xA;        .map_err(|e| JsValue::from_str(&amp;e.to_string()))?;&#xA;    &#xA;    if options.sort_keys {&#xA;        // For simplicity, just return pretty-printed JSON&#xA;        serde_json::to_string_pretty(&amp;value)&#xA;            .map_err(|e| JsValue::from_str(&amp;e.to_string()))&#xA;    } else {&#xA;        serde_json::to_string(&amp;value)&#xA;            .map_err(|e| JsValue::from_str(&amp;e.to_string()))&#xA;    }&#xA;}&#xA;&#xA;#[wasm_bindgen]&#xA;pub fn validate_json(json_string: &amp;str) -&gt; bool {&#xA;    serde_json::from_str::&lt;serde_json::Value&gt;(json_string).is_ok()&#xA;}&#xA;&#xA;flake.nix:&#xA;&#xA;{&#xA;  description = &#34;JSON parser WebAssembly package&#34;;&#xA;&#xA;  inputs = {&#xA;    nixpkgs.url = &#34;github:NixOS/nixpkgs/nixos-unstable&#34;;&#xA;    rust-overlay.url = &#34;github:oxalica/rust-overlay&#34;;&#xA;    flake-utils.url = &#34;github:numtide/flake-utils&#34;;&#xA;  };&#xA;&#xA;  outputs = { self, nixpkgs, rust-overlay, flake-utils }:&#xA;    flake-utils.lib.eachDefaultSystem (system:&#xA;      let&#xA;        overlays = [ (import rust-overlay) ];&#xA;        pkgs = import nixpkgs {&#xA;          inherit system overlays;&#xA;        };&#xA;&#xA;        # Rust toolchain configuration with WebAssembly target&#xA;        rustToolchain = pkgs.rust-bin.stable.latest.default.override {&#xA;          extensions = [ &#34;rust-src&#34; &#34;clippy&#34; &#34;rustfmt&#34; ];&#xA;          targets = [ &#34;wasm32-unknown-unknown&#34; ];&#xA;        };&#xA;&#xA;        # WebAssembly build function&#xA;        buildJsonParserWasm = pkgs.rustPlatform.buildRustPackage rec {&#xA;          pname = &#34;json-parser&#34;;&#xA;          version = &#34;0.1.0&#34;;&#xA;          src = ./.;&#xA;&#xA;          cargoLock = {&#xA;            lockFile = ./Cargo.lock;&#xA;          };&#xA;&#xA;          nativeBuildInputs = with pkgs; [&#xA;            rustToolchain&#xA;            wasm-bindgen-cli&#xA;            pkg-config&#xA;          ];&#xA;&#xA;          buildInputs = with pkgs; [&#xA;            openssl&#xA;          ] ++ pkgs.lib.optionals
pkgs.stdenv.isDarwin [&#xA;            pkgs.darwin.apple_sdk.frameworks.Security&#xA;            pkgs.darwin.apple_sdk.frameworks.SystemConfiguration&#xA;          ];&#xA;&#xA;          # Custom build phase for WebAssembly compilation&#xA;          buildPhase = &#39;&#39;&#xA;            runHook preBuild&#xA;&#xA;            export HOME=$TMPDIR&#xA;            export CARGO_HOME=$TMPDIR/.cargo&#xA;&#xA;            # Build the WebAssembly binary&#xA;            cargo build --lib --release --target wasm32-unknown-unknown&#xA;&#xA;            # Generate TypeScript bindings&#xA;            mkdir -p pkg&#xA;            wasm-bindgen target/wasm32-unknown-unknown/release/json_parser.wasm \&#xA;              --out-dir pkg \&#xA;              --target web \&#xA;              --typescript&#xA;&#xA;            runHook postBuild&#xA;          &#39;&#39;;&#xA;&#xA;          # Install WebAssembly artifacts&#xA;          installPhase = &#39;&#39;&#xA;            runHook preInstall&#xA;            mkdir -p $out/pkg&#xA;            cp -r pkg/* $out/pkg/&#xA;            runHook postInstall&#xA;          &#39;&#39;;&#xA;&#xA;          doCheck = false;&#xA;        };&#xA;&#xA;      in&#xA;      {&#xA;        packages.default = buildJsonParserWasm;&#xA;&#xA;        # Development environment&#xA;        devShells.default = pkgs.mkShell {&#xA;          buildInputs = with pkgs; [&#xA;            rustToolchain&#xA;            wasm-bindgen-cli&#xA;            pkg-config&#xA;            openssl&#xA;          ];&#xA;&#xA;          shellHook = &#39;&#39;&#xA;            echo &#34;JSON Parser WebAssembly development environment&#34;&#xA;            echo &#34;Available commands:&#34;&#xA;            echo &#34;  cargo build --target wasm32-unknown-unknown&#34;&#xA;            echo &#34;  cargo test&#34;&#xA;            echo &#34;  nix build  # Build WASM package&#34;&#xA;          &#39;&#39;;&#xA;        };&#xA;      });&#xA;}&#xA;&#xA;To use this, create the directory structure, add the files above, then
run:&#xA;&#xA;# Generate Cargo.lock&#xA;nix develop&#xA;cargo generate-lockfile&#xA;&#xA;# Build the WebAssembly package&#xA;nix build&#xA;&#xA;Advanced WebAssembly Build Patterns&#xA;&#xA;Multi-Target Builds&#xA;&#xA;When building WebAssembly modules, you often need to support different JavaScript environments. Each target (web browsers, Node.js, bundlers like Webpack) has different requirements for how the WASM module is loaded and initialized. This pattern allows you to build all variants simultaneously, ensuring your library works across all deployment scenarios.&#xA;&#xA;buildMultiTargetWasm = { pname, src }:&#xA;  let&#xA;    targets = [&#xA;      { name = &#34;web&#34;; target = &#34;web&#34;; }&#xA;      { name = &#34;nodejs&#34;; target = &#34;nodejs&#34;; }&#xA;      { name = &#34;bundler&#34;; target = &#34;bundler&#34;; }&#xA;    ];&#xA;    &#xA;    buildTarget = { name, target }: pkgs.runCommand &#34;${pname}-${name}&#34; {&#xA;      nativeBuildInputs = [ rustToolchain wasm-pack ];&#xA;    } &#39;&#39;&#xA;      cp -r ${src} source&#xA;      cd source&#xA;      wasm-pack build --target ${target} --out-dir $out&#xA;    &#39;&#39;;&#xA;  in&#xA;  pkgs.symlinkJoin {&#xA;    name = &#34;${pname}-multi-target&#34;;&#xA;    paths = map buildTarget targets;&#xA;  };&#xA;&#xA;Feature-Based Builds&#xA;&#xA;Rust&#39;s feature flags allow you to compile different variants of your library with specific functionality enabled or disabled. This is particularly useful for WebAssembly where you might want a lightweight version for simple use cases and a full-featured version for complex applications. This pattern automates building multiple feature combinations.&#xA;&#xA;buildWasmVariants = { pname, src, variants }:&#xA;  let&#xA;    buildVariant = { name, features ? [], optimizeSize ?
true }:&#xA;      buildWasmLibrary {&#xA;        inherit pname src features optimizeSize;&#xA;        version = &#34;${name}-variant&#34;;&#xA;      };&#xA;  in&#xA;  pkgs.linkFarm &#34;${pname}-variants&#34; &#xA;    (map (variant: {&#xA;      name = variant.name;&#xA;      path = buildVariant variant;&#xA;    }) variants);&#xA;&#xA;Frontend Integration Architecture&#xA;&#xA;The frontend integration component addresses the coordination challenges between Rust compilation outputs and TypeScript build processes. This implementation demonstrates automated WebAssembly artifact management:&#xA;&#xA;For a practical MVP, here&#39;s a working frontend integration. First, the project structure:&#xA;&#xA;Frontend Project Structure:&#xA;&#xA;web-app/&#xA;├── flake.nix&#xA;├── package.json&#xA;├── webpack.config.js&#xA;├── tsconfig.json&#xA;├── src/&#xA;│   ├── index.html&#xA;│   ├── index.tsx&#xA;│   ├── components/&#xA;│   │   └── JsonEditor.tsx&#xA;│   ├── hooks/&#xA;│   │   └── useJsonService.ts&#xA;│   └── utils/&#xA;│       └── jsonparser/&#xA;│           └── (WASM files will be placed here)&#xA;└── json-parser-flake/  # Local reference to the Rust project&#xA;&#xA;package.json:&#xA;&#xA;{&#xA;  &#34;name&#34;: &#34;json-editor-app&#34;,&#xA;  &#34;version&#34;: &#34;1.0.0&#34;,&#xA;  &#34;scripts&#34;: {&#xA;    &#34;dev&#34;: &#34;webpack serve --mode development&#34;,&#xA;    &#34;build&#34;: &#34;webpack --mode production&#34;,&#xA;    &#34;parser:build&#34;: &#34;nix run .#build-parser&#34;,&#xA;    &#34;parser:clean&#34;: &#34;nix run .#clean-parser&#34;&#xA;  },&#xA;  &#34;devDependencies&#34;: {&#xA;    &#34;webpack&#34;: &#34;^5.88.0&#34;,&#xA;    &#34;webpack-cli&#34;: &#34;^5.1.0&#34;,&#xA;    &#34;webpack-dev-server&#34;: &#34;^4.15.0&#34;,&#xA;    &#34;typescript&#34;: &#34;^5.1.0&#34;,&#xA;    &#34;ts-loader&#34;: &#34;^9.4.0&#34;,&#xA;    &#34;html-webpack-plugin&#34;: &#34;^5.5.0&#34;,&#xA;    &#34;@types/react&#34;: &#34;^18.2.0&#34;&#xA;  },&#xA;  
&#34;dependencies&#34;: {&#xA;    &#34;preact&#34;: &#34;^10.19.0&#34;&#xA;  }&#xA;}&#xA;&#xA;webpack.config.js:&#xA;&#xA;const path = require(&#39;path&#39;);&#xA;const HtmlWebpackPlugin = require(&#39;html-webpack-plugin&#39;);&#xA;&#xA;module.exports = {&#xA;  entry: &#39;./src/index.tsx&#39;,&#xA;  module: {&#xA;    rules: [&#xA;      {&#xA;        test: /\.tsx?$/,&#xA;        use: &#39;ts-loader&#39;,&#xA;        exclude: /node_modules/,&#xA;      },&#xA;    ],&#xA;  },&#xA;  resolve: {&#xA;    extensions: [&#39;.tsx&#39;, &#39;.ts&#39;, &#39;.js&#39;],&#xA;    alias: {&#xA;      &#39;react&#39;: &#39;preact/compat&#39;,&#xA;      &#39;react-dom&#39;: &#39;preact/compat&#39;&#xA;    },&#xA;  },&#xA;  output: {&#xA;    filename: &#39;bundle.js&#39;,&#xA;    path: path.resolve(__dirname, &#39;dist&#39;),&#xA;    clean: true,&#xA;  },&#xA;  plugins: [&#xA;    new HtmlWebpackPlugin({&#xA;      template: &#39;./src/index.html&#39;,&#xA;    }),&#xA;  ],&#xA;  devServer: {&#xA;    static: &#39;./dist&#39;,&#xA;    port: 8080,&#xA;  },&#xA;  experiments: {&#xA;    asyncWebAssembly: true,&#xA;  },&#xA;};&#xA;&#xA;flake.nix:&#xA;&#xA;{&#xA;  description = &#34;Frontend application with WebAssembly integration&#34;;&#xA;&#xA;  inputs = {&#xA;    nixpkgs.url = &#34;github:NixOS/nixpkgs/nixos-unstable&#34;;&#xA;    flake-utils.url = &#34;github:numtide/flake-utils&#34;;&#xA;  };&#xA;&#xA;  outputs = { self, nixpkgs, flake-utils }:&#xA;    flake-utils.lib.eachDefaultSystem (system:&#xA;      let&#xA;        pkgs = import nixpkgs { inherit system; };&#xA;&#xA;        # Build JSON parser from local flake&#xA;        buildJsonParser = pkgs.writeShellApplication {&#xA;          name = &#34;build-json-parser&#34;;&#xA;          runtimeInputs = [ pkgs.nix ];&#xA;          text = &#39;&#39;&#xA;            echo &#34;Building JSON parser from local flake...&#34;&#xA;            &#xA;            # Create the directory where WASM files will live&#xA;            mkdir -p
src/utils/jsonparser&#xA;            rm -rf src/utils/jsonparser/*&#xA;            &#xA;            # Build from local json-parser-flake directory&#xA;            if [ -d &#34;./json-parser-flake&#34; ]; then&#xA;              result=$(nix build ./json-parser-flake --no-link --print-out-paths)&#xA;              if [ -d &#34;$result/pkg&#34; ]; then&#xA;                cp -r &#34;$result&#34;/pkg/* src/utils/jsonparser/&#xA;                echo &#34;✅ JSON parser built successfully&#34;&#xA;              else&#xA;                echo &#34;❌ No pkg directory found in build result&#34;&#xA;                exit 1&#xA;              fi&#xA;            else&#xA;              echo &#34;❌ json-parser-flake directory not found&#34;&#xA;              echo &#34;Please create a symlink: ln -s ../json-parser json-parser-flake&#34;&#xA;              exit 1&#xA;            fi&#xA;          &#39;&#39;;&#xA;        };&#xA;&#xA;        # Clean up function&#xA;        cleanJsonParser = pkgs.writeShellApplication {&#xA;          name = &#34;clean-json-parser&#34;;&#xA;          text = &#39;&#39;&#xA;            rm -rf src/utils/jsonparser/*&#xA;            echo &#34;✅ Cleaned JSON parser artifacts&#34;&#xA;          &#39;&#39;;&#xA;        };&#xA;&#xA;      in&#xA;      {&#xA;        # Apps for easy access&#xA;        apps = {&#xA;          build-parser = flake-utils.lib.mkApp {&#xA;            drv = buildJsonParser;&#xA;          };&#xA;          clean-parser = flake-utils.lib.mkApp { &#xA;            drv = cleanJsonParser; &#xA;          };&#xA;        };&#xA;&#xA;        # Packages&#xA;        packages = {&#xA;          build-parser = buildJsonParser;&#xA;          clean-parser = cleanJsonParser;&#xA;        };&#xA;&#xA;        # Development environment&#xA;        devShells.default = pkgs.mkShell {&#xA;          buildInputs = with pkgs; [ nodejs20 yarn ];&#xA;          shellHook = &#39;&#39;&#xA;            echo &#34;Frontend Development Environment&#34;&#xA;            echo &#34;Available
commands:&#34;&#xA;            echo &#34;  yarn parser:build  - Build JSON parser WASM&#34;&#xA;            echo &#34;  yarn parser:clean  - Clean WASM artifacts&#34;&#xA;            echo &#34;  yarn dev          - Start development server&#34;&#xA;            echo &#34;&#34;&#xA;            echo &#34;First time setup:&#34;&#xA;            echo &#34;  ln -s ../json-parser json-parser-flake&#34;&#xA;            echo &#34;  yarn install&#34;&#xA;            echo &#34;  yarn parser:build&#34;&#xA;          &#39;&#39;;&#xA;        };&#xA;      });&#xA;}&#xA;&#xA;Setup Instructions:&#xA;&#xA;Create both json-parser/ and web-app/ directories&#xA;&#xA;In web-app/, create a symlink: ln -s ../json-parser json-parser-flake&#xA;&#xA;Run nix develop in web-app/&#xA;&#xA;Run yarn install&#xA;&#xA;Run yarn parser:build to build the WASM module&#xA;&#xA;Run yarn dev to start the development server&#xA;&#xA;TypeScript Integration Patterns&#xA;&#xA;The WebAssembly integration generates TypeScript definitions through wasm-bindgen, enabling type-safe consumption of Rust functionality. 
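Before looking at the full service, the initialize-once guard at its core can be sketched in isolation. This is plain JavaScript with a stubbed init, so nothing here depends on the generated bindings; lazyInit, ensureReady, and demo are illustrative names, not part of wasm-bindgen output.

```javascript
// Initialize-once guard, the same idea JsonService applies before each call.
// The stubbed init() just counts invocations so the behavior is observable.
function lazyInit(init) {
  let pending = null;
  // Cache and reuse the promise so init() runs at most once, even under races.
  return () => (pending ??= init());
}

let calls = 0;
const ensureReady = lazyInit(async () => { calls += 1; });

async function demo() {
  // Overlapping and repeated calls all await the single init() run.
  await Promise.all([ensureReady(), ensureReady()]);
  await ensureReady();
  return calls;
}
```

Caching the promise, rather than flipping a boolean after the await, also avoids the double-init race a plain flag has when two calls interleave before the first init resolves.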
The following example demonstrates typical usage patterns:&#xA;&#xA;src/utils/jsonparser/index.ts:&#xA;&#xA;// Wrapper service for the WebAssembly JSON parser&#xA;import init, { &#xA;  parse_json, &#xA;  stringify_json, &#xA;  validate_json,&#xA;  ParseOptions &#xA;} from &#39;./json_parser&#39;;&#xA;&#xA;export class JsonService {&#xA;  private initialized = false;&#xA;  &#xA;  async initialize(): Promise&lt;void&gt; {&#xA;    if (!this.initialized) {&#xA;      await init();&#xA;      this.initialized = true;&#xA;    }&#xA;  }&#xA;  &#xA;  async parseJson(jsonString: string): Promise&lt;string&gt; {&#xA;    await this.initialize();&#xA;    return parse_json(jsonString);&#xA;  }&#xA;  &#xA;  async stringifyJson(jsonString: string, options: { indent_size: number, sort_keys: boolean }): Promise&lt;string&gt; {&#xA;    await this.initialize();&#xA;    const wasmOptions = new ParseOptions(options.indent_size, options.sort_keys);&#xA;    return stringify_json(jsonString, wasmOptions);&#xA;  }&#xA;  &#xA;  async validateJson(jsonString: string): Promise&lt;boolean&gt; {&#xA;    await this.initialize();&#xA;    return validate_json(jsonString);&#xA;  }&#xA;}&#xA;&#xA;src/index.html:&#xA;&#xA;&lt;!DOCTYPE html&gt;&#xA;&lt;html&gt;&#xA;&lt;head&gt;&#xA;    &lt;title&gt;JSON Parser Demo&lt;/title&gt;&#xA;    &lt;style&gt;&#xA;        body { &#xA;            font-family: Arial, sans-serif; &#xA;            margin: 20px; &#xA;            background-color: #f9f9f9;&#xA;        }&#xA;        .container {&#xA;            max-width: 800px;&#xA;            margin: 0 auto;&#xA;            background: white;&#xA;            padding: 20px;&#xA;            border-radius: 8px;&#xA;            box-shadow: 0 2px 4px rgba(0,0,0,0.1);&#xA;        }&#xA;        textarea { &#xA;            width: 100%; &#xA;            height: 200px; &#xA;            margin: 10px 0; &#xA;            padding: 10px;&#xA;            border: 2px solid #ddd;&#xA;            border-radius: 4px;&#xA;            font-family: &#39;Courier New&#39;, monospace;&#xA;        }&#xA;        button { &#xA;
   padding: 10px 20px; &#xA;            margin: 5px; &#xA;            border: none;&#xA;            border-radius: 4px;&#xA;            background: #007bff;&#xA;            color: white;&#xA;            cursor: pointer;&#xA;        }&#xA;        button:hover {&#xA;            background: #0056b3;&#xA;        }&#xA;        button:disabled {&#xA;            background: #6c757d;&#xA;            cursor: not-allowed;&#xA;        }&#xA;        pre { &#xA;            background: #f5f5f5; &#xA;            padding: 15px; &#xA;            border-radius: 4px; &#xA;            border-left: 4px solid #007bff;&#xA;            font-family: &#39;Courier New&#39;, monospace;&#xA;            overflow-x: auto;&#xA;        }&#xA;        .error { &#xA;            border-color: #dc3545 !important; &#xA;            background-color: #fff5f5;&#xA;        }&#xA;        .valid { &#xA;            border-color: #28a745 !important; &#xA;            background-color: #f5fff5;&#xA;        }&#xA;        .status {&#xA;            padding: 10px;&#xA;            margin: 10px 0;&#xA;            border-radius: 4px;&#xA;        }&#xA;        .status.error {&#xA;            background: #f8d7da;&#xA;            color: #721c24;&#xA;            border: 1px solid #f5c6cb;&#xA;        }&#xA;        .status.valid {&#xA;            background: #d4edda;&#xA;            color: #155724;&#xA;            border: 1px solid #c3e6cb;&#xA;        }&#xA;    /style&#xA;/head&#xA;body&#xA;    div id=&#34;app&#34;/div&#xA;/body&#xA;/html&#xA;&#xA;src/hooks/useJsonService.ts:&#xA;&#xA;import { useState, useEffect } from &#39;preact/hooks&#39;;&#xA;import { JsonService } from &#39;../utils/jsonparser&#39;;&#xA;&#xA;export const useJsonService = () =  {&#xA;  const [jsonService] = useState(() =  new JsonService());&#xA;  const [isInitialized, setIsInitialized] = useState(false);&#xA;&#xA;  useEffect(() =  {&#xA;    const initializeService = async () =  {&#xA;      try {&#xA;        await jsonService.initialize();&#xA;        
setIsInitialized(true);&#xA;      } catch (error) {&#xA;        console.error(&#39;Failed to initialize JSON service:&#39;, error);&#xA;      }&#xA;    };&#xA;&#xA;    initializeService();&#xA;  }, [jsonService]);&#xA;&#xA;  return { jsonService, isInitialized };&#xA;};&#xA;&#xA;src/components/JsonEditor.tsx:&#xA;&#xA;import { useState } from &#39;preact/hooks&#39;;&#xA;import { useJsonService } from &#39;../hooks/useJsonService&#39;;&#xA;&#xA;interface JsonEditorProps {}&#xA;&#xA;export const JsonEditor = ({}: JsonEditorProps) => {&#xA;  const { jsonService, isInitialized } = useJsonService();&#xA;  const [jsonText, setJsonText] = useState(&#39;{&#34;name&#34;: &#34;test&#34;, &#34;value&#34;: 123}&#39;);&#xA;  const [output, setOutput] = useState(&#39;&#39;);&#xA;  const [status, setStatus] = useState&lt;{ message: string; type: &#39;error&#39; | &#39;valid&#39; | &#39;&#39; }>({&#xA;    message: &#39;&#39;,&#xA;    type: &#39;&#39;&#xA;  });&#xA;  const [inputClass, setInputClass] = useState(&#39;&#39;);&#xA;&#xA;  const handleFormat = async () => {&#xA;    if (!isInitialized) {&#xA;      setStatus({ message: &#39;⏳ WebAssembly module is still loading...&#39;, type: &#39;error&#39; });&#xA;      return;&#xA;    }&#xA;&#xA;    try {&#xA;      const formatted = await jsonService.stringifyJson(jsonText, {&#xA;        indent_size: 2,&#xA;        sort_keys: true&#xA;      });&#xA;      &#xA;      setOutput(formatted);&#xA;      setStatus({ message: &#39;✅ JSON formatted successfully&#39;, type: &#39;valid&#39; });&#xA;      setInputClass(&#39;valid&#39;);&#xA;    } catch (error) {&#xA;      setStatus({ message: `❌ Format error: ${error}`, type: &#39;error&#39; });&#xA;      setInputClass(&#39;error&#39;);&#xA;      setOutput(&#39;&#39;);&#xA;    }&#xA;  };&#xA;&#xA;  const handleValidate = async () => {&#xA;    if (!isInitialized) {&#xA;      setStatus({ message: &#39;⏳ WebAssembly module is still loading...&#39;, type: &#39;error&#39; });&#xA;      return;&#xA;    
}&#xA;&#xA;    try {&#xA;      const isValid = await jsonService.validateJson(jsonText);&#xA;      &#xA;      if (isValid) {&#xA;        setStatus({ message: &#39;✅ JSON is valid&#39;, type: &#39;valid&#39; });&#xA;        setInputClass(&#39;valid&#39;);&#xA;      } else {&#xA;        setStatus({ message: &#39;❌ JSON is invalid&#39;, type: &#39;error&#39; });&#xA;        setInputClass(&#39;error&#39;);&#xA;      }&#xA;    } catch (error) {&#xA;      setStatus({ message: `❌ Validation error: ${error}`, type: &#39;error&#39; });&#xA;      setInputClass(&#39;error&#39;);&#xA;    }&#xA;  };&#xA;&#xA;  const handleInputChange = (event: Event) => {&#xA;    const target = event.target as HTMLTextAreaElement;&#xA;    setJsonText(target.value);&#xA;    setInputClass(&#39;&#39;);&#xA;    setStatus({ message: &#39;&#39;, type: &#39;&#39; });&#xA;  };&#xA;&#xA;  return (&#xA;    &lt;div className=&#34;container&#34;>&#xA;      &lt;h1>JSON Parser Demo&lt;/h1>&#xA;      &lt;p>This demo uses a Rust library compiled to WebAssembly for JSON processing.&lt;/p>&#xA;      &#xA;      &lt;div>&#xA;        &lt;h3>Input JSON:&lt;/h3>&#xA;        &lt;textarea&#xA;          value={jsonText}&#xA;          onInput={handleInputChange}&#xA;          className={inputClass}&#xA;          placeholder=&#34;Enter JSON here...&#34;&#xA;          disabled={!isInitialized}&#xA;        />&#xA;&#xA;        &lt;div>&#xA;          &lt;button onClick={handleFormat} disabled={!isInitialized}>&#xA;            Format JSON&#xA;          &lt;/button>&#xA;          &lt;button onClick={handleValidate} disabled={!isInitialized}>&#xA;            Validate JSON&#xA;          &lt;/button>&#xA;        &lt;/div>&#xA;        &#xA;        {status.message &amp;&amp; (&#xA;          &lt;div className={`status ${status.type}`}>&#xA;            {status.message}&#xA;          &lt;/div>&#xA;        )}&#xA;        &#xA;        &lt;h3>Output:&lt;/h3>&#xA;        &lt;pre>{output || &#39;No output yet...&#39;}&lt;/pre>&#xA;        &#xA;        {!isInitialized &amp;&amp; (&#xA;          &lt;div className=&#34;status error&#34;>&#xA;            ⏳ Loading 
WebAssembly module...&#xA;          &lt;/div>&#xA;        )}&#xA;      &lt;/div>&#xA;    &lt;/div>&#xA;  );&#xA;};&#xA;&#xA;src/index.tsx:&#xA;&#xA;import { render } from &#39;preact&#39;;&#xA;import { JsonEditor } from &#39;./components/JsonEditor&#39;;&#xA;&#xA;const App = () => {&#xA;  return &lt;JsonEditor />;&#xA;};&#xA;&#xA;render(&lt;App />, document.getElementById(&#39;app&#39;)!);&#xA;&#xA;tsconfig.json:&#xA;&#xA;{&#xA;  &#34;compilerOptions&#34;: {&#xA;    &#34;target&#34;: &#34;ES2020&#34;,&#xA;    &#34;module&#34;: &#34;ES2020&#34;,&#xA;    &#34;moduleResolution&#34;: &#34;node&#34;,&#xA;    &#34;strict&#34;: true,&#xA;    &#34;esModuleInterop&#34;: true,&#xA;    &#34;skipLibCheck&#34;: true,&#xA;    &#34;forceConsistentCasingInFileNames&#34;: true,&#xA;    &#34;jsx&#34;: &#34;react-jsx&#34;,&#xA;    &#34;jsxImportSource&#34;: &#34;preact&#34;,&#xA;    &#34;outDir&#34;: &#34;./dist&#34;,&#xA;    &#34;rootDir&#34;: &#34;./src&#34;&#xA;  },&#xA;  &#34;include&#34;: [&#34;src/**/*&#34;],&#xA;  &#34;exclude&#34;: [&#34;node_modules&#34;, &#34;dist&#34;]&#xA;}&#xA;&#xA;The generated TypeScript definitions (created automatically by wasm-bindgen) will be in src/utils/jsonparser/jsonparser.d.ts and provide complete type information for the WebAssembly functions.&#xA;&#xA;This integration pattern combines Rust&#39;s performance characteristics with TypeScript&#39;s type safety, eliminating manual interface management between the two language ecosystems.&#xA;&#xA;Multi-Repository Coordination&#xA;&#xA;Complex projects often involve multiple repositories that must coordinate shared dependencies. 
The following example demonstrates how Nix flakes can manage multiple TypeScript applications that consume the same WebAssembly library while maintaining branch-specific version alignment.&#xA;&#xA;A coordination flake can manage the entire project ecosystem:&#xA;&#xA;{&#xA;  description = &#34;All my TypeScript/Rust projects working together&#34;;&#xA;&#xA;  inputs = {&#xA;    nixpkgs.url = &#34;github:NixOS/nixpkgs/nixos-unstable&#34;;&#xA;    &#xA;    # The Rust JSON parser&#xA;    json-parser.url = &#34;github:my-org/json-parser&#34;;&#xA;    &#xA;    # TypeScript applications that use it&#xA;    web-app.url = &#34;github:my-org/web-app&#34;;&#xA;    dashboard.url = &#34;github:my-org/dashboard&#34;;&#xA;    &#xA;    # Various integration widgets&#xA;    widget-a.url = &#34;github:my-org/widget-a&#34;;&#xA;    widget-b.url = &#34;github:my-org/widget-b&#34;;&#xA;  };&#xA;&#xA;  outputs = inputs@{ self, nixpkgs, ... }:&#xA;    let&#xA;      system = &#34;x86_64-linux&#34;;&#xA;      pkgs = import nixpkgs { inherit system; };&#xA;      &#xA;      # Collect all the packages I can build&#xA;      allPackages = pkgs.lib.foldl&#39; (acc: input: &#xA;        acc // (input.packages.${system} or {})&#xA;      ) {} (builtins.attrValues inputs);&#xA;      &#xA;    in&#xA;    {&#xA;      # One development environment to rule them all&#xA;      devShells.${system}.default = pkgs.mkShell {&#xA;        buildInputs = with pkgs; [&#xA;          # Basic tools&#xA;          git&#xA;          &#xA;          # Language toolchains - exact versions&#xA;          nodejs_20&#xA;          yarn&#xA;          rustc&#xA;          wasm-pack&#xA;          &#xA;          # Development tools&#xA;          typescript&#xA;          eslint&#xA;          prettier&#xA;        ] ++ (builtins.attrValues allPackages);&#xA;        &#xA;        shellHook = &#39;&#39;&#xA;          echo &#34;🚀 Full-stack development environment&#34;&#xA;          echo &#34;Rust: json-parser&#34;&#xA;          echo 
&#34;TypeScript: web-app, dashboard, widget-a, widget-b&#34;&#xA;          echo &#34;&#34;&#xA;          echo &#34;Quick commands:&#34;&#xA;          echo &#34;  nix run .#build-all-parsers  - Build all WASM modules&#34;&#xA;          echo &#34;  nix run .#dev-all           - Start all dev servers&#34;&#xA;        &#39;&#39;;&#xA;      };&#xA;      &#xA;      # Convenience scripts&#xA;      packages.${system} = {&#xA;        # Build all the WebAssembly modules at once&#xA;        build-all-parsers = pkgs.writeShellScriptBin &#34;build-all-parsers&#34; &#39;&#39;&#xA;          echo &#34;🦀 Building all Rust WebAssembly modules...&#34;&#xA;          ${inputs.json-parser.packages.${system}.default}/bin/build-parser&#xA;          echo &#34;✅ All WASM modules built&#34;&#xA;        &#39;&#39;;&#xA;        &#xA;        # Start all development servers&#xA;        dev-all = pkgs.writeShellScriptBin &#34;dev-all&#34; &#39;&#39;&#xA;          echo &#34;🌐 Starting all development servers...&#34;&#xA;          &#xA;          # Start each app in the background&#xA;          cd ${inputs.web-app} &amp;&amp; yarn dev &amp;&#xA;          WEB_PID=$!&#xA;          &#xA;          cd ${inputs.dashboard} &amp;&amp; yarn dev &amp;&#xA;          DASHBOARD_PID=$!&#xA;          &#xA;          cd ${inputs.widget-a} &amp;&amp; yarn dev &amp;&#xA;          WIDGET_A_PID=$!&#xA;          &#xA;          # Clean up when we exit&#xA;          trap &#34;kill $WEB_PID $DASHBOARD_PID $WIDGET_A_PID&#34; EXIT&#xA;          wait&#xA;        &#39;&#39;;&#xA;      };&#xA;    };&#xA;}&#xA;&#xA;This coordination approach provides a unified development environment that encompasses the entire project ecosystem, eliminating version management concerns and ensuring consistent toolchain availability across all components.&#xA;&#xA;Branch-Based Dependency Management&#xA;&#xA;Multi-repository projects require coordination between corresponding development branches. 
For example, a web application&#39;s development branch should consume the development version of its WebAssembly dependencies, while production branches should use stable releases.&#xA;&#xA;This coordination can be achieved through package.json script integration:&#xA;&#xA;{&#xA;  &#34;scripts&#34;: {&#xA;    &#34;parser:build&#34;: &#34;nix run .#build-parser&#34;,&#xA;    &#34;parser:clean&#34;: &#34;nix run .#clean-parser&#34;,&#xA;    &#34;dev&#34;: &#34;yarn parser:build &amp;&amp; webpack serve --mode development&#34;,&#xA;    &#34;build&#34;: &#34;yarn parser:build &amp;&amp; webpack --mode production&#34;,&#xA;    &#34;install-deps&#34;: &#34;yarn install&#34;&#xA;  }&#xA;}&#xA;&#xA;Environment-specific development shells can automate branch coordination:&#xA;&#xA;# Environment-specific development shells&#xA;devShells.${system} = {&#xA;  # Default development environment&#xA;  default = pkgs.mkShell {&#xA;    buildInputs = with pkgs; [ nodejs_20 yarn ];&#xA;    shellHook = &#39;&#39;&#xA;      echo &#34;Development environment&#34;&#xA;      echo &#34;Run &#39;yarn dev&#39; to start with current branch dependencies&#34;&#xA;    &#39;&#39;;&#xA;  };&#xA;  &#xA;  # Staging environment with beta branch dependencies&#xA;  staging = pkgs.mkShell {&#xA;    buildInputs = with pkgs; [ nodejs_20 yarn ];&#xA;    shellHook = &#39;&#39;&#xA;      echo &#34;Staging environment (dependencies: beta branch)&#34;&#xA;      nix run .#build-parser-beta&#xA;    &#39;&#39;;&#xA;  };&#xA;  &#xA;  # Production environment with stable dependencies&#xA;  production = pkgs.mkShell {&#xA;    buildInputs = with pkgs; [ nodejs_20 yarn ];&#xA;    shellHook = &#39;&#39;&#xA;      echo &#34;Production environment (dependencies: prod branch)&#34;&#xA;      nix run .#build-parser-prod&#xA;    &#39;&#39;;&#xA;  };&#xA;};&#xA;&#xA;This approach enables developers to enter environment-specific shells using nix develop .#staging, automatically configuring the appropriate dependency versions for 
the target environment.&#xA;&#xA;Advanced Patterns and Best Practices&#xA;&#xA;Caching and Performance Optimization&#xA;&#xA;Binary Caches&#xA;&#xA;Binary caches are one of Nix&#39;s most powerful features for development teams. Instead of rebuilding everything from source, Nix can download pre-built packages from cache servers. This dramatically reduces build times, especially for large Rust projects with many dependencies. Setting up your own cache (like Cachix) means your team shares build artifacts, so if one developer builds a package, everyone else gets it instantly.&#xA;&#xA;{&#xA;  nixConfig = {&#xA;    extra-substituters = [&#xA;      &#34;https://cache.nixos.org&#34;&#xA;      &#34;https://your-org.cachix.org&#34;&#xA;    ];&#xA;    extra-trusted-public-keys = [&#xA;      &#34;cache.nixos.org-1:6NCHdD59X431o0gWypbMrAURkbJ16ZPMQFGspcDShjY=&#34;&#xA;      &#34;your-org.cachix.org-1:YOUR_PUBLIC_KEY_HERE&#34;&#xA;    ];&#xA;  };&#xA;}&#xA;&#xA;Incremental Builds&#xA;&#xA;Large Rust projects can take a long time to compile, especially when dependencies change. This pattern separates dependency compilation from your source code compilation. Dependencies are built once and cached, then your source code builds much faster since it reuses the pre-built dependencies. This is particularly valuable in CI/CD pipelines where you want fast feedback loops.&#xA;&#xA;buildRustPackageIncremental = { pname, src, ... 
}@args:&#xA;  let&#xA;    # Separate dependency building from source building&#xA;    deps = pkgs.rustPlatform.buildRustPackage (args // {&#xA;      pname = &#34;${pname}-deps&#34;;&#xA;      src = pkgs.runCommand &#34;deps-src&#34; {} &#39;&#39;&#xA;        mkdir -p $out&#xA;        cp ${src}/Cargo.{toml,lock} $out/&#xA;        mkdir $out/src&#xA;        echo &#34;fn main() {}&#34; > $out/src/main.rs&#xA;      &#39;&#39;;&#xA;      doCheck = false;&#xA;    });&#xA;  in&#xA;  pkgs.rustPlatform.buildRustPackage (args // {&#xA;    preBuild = &#39;&#39;&#xA;      cp -r ${deps}/target .&#xA;      chmod -R +w target&#xA;    &#39;&#39;;&#xA;  });&#xA;&#xA;Testing Across Languages&#xA;&#xA;Integration Testing&#xA;&#xA;When you have Rust code compiled to WebAssembly being consumed by TypeScript applications, you need tests that verify the entire pipeline works correctly. This isn&#39;t just about testing your Rust code or your TypeScript code in isolation - you need to test that the WebAssembly bindings work correctly, that data serialization/deserialization works as expected, and that the integration points behave properly under various conditions.&#xA;&#xA;integrationTests = pkgs.runCommand &#34;integration-tests&#34; {&#xA;  buildInputs = with pkgs; [&#xA;    nodejs_20&#xA;    yarn&#xA;    rustc&#xA;    wasm-pack&#xA;    # Your compiled packages&#xA;    wasmLib&#xA;    frontendApp&#xA;  ];&#xA;} &#39;&#39;&#xA;  # Set up test environment&#xA;  export WASM_LIB_PATH=${wasmLib}/pkg&#xA;  export NODE_PATH=${frontendApp}/node_modules&#xA;  &#xA;  # Run integration test suite&#xA;  cd ${./tests}&#xA;  yarn install&#xA;  yarn test:integration&#xA;  &#xA;  touch $out&#xA;&#39;&#39;;&#xA;&#xA;Cross-Language Testing&#xA;&#xA;This pattern ensures that your WebAssembly modules actually work when called from JavaScript/TypeScript. 
It&#39;s common for Rust code to work perfectly in isolation but fail when compiled to WebAssembly due to differences in memory management, string handling, or async behavior. These tests catch those issues early in the development process.&#xA;&#xA;wasmIntegrationTests = pkgs.runCommand &#34;wasm-integration-tests&#34; {&#xA;  buildInputs = with pkgs; [ nodejs_20 rustLib wasmLib ];&#xA;} &#39;&#39;&#xA;  # Test Rust library directly&#xA;  cd ${rustLib.src}&#xA;  cargo test --release&#xA;  &#xA;  # Test WebAssembly bindings&#xA;  cd ${./tests/wasm}&#xA;  cat > test-wasm.js &lt;&lt; &#39;EOF&#39;&#xA;  const { parse_sql, format_sql } = require(&#39;${wasmLib}/pkg/core.js&#39;);&#xA;  &#xA;  // Test basic functionality&#xA;  const sql = &#34;SELECT * FROM users WHERE id = 1&#34;;&#xA;  const parsed = parse_sql(sql);&#xA;  const formatted = format_sql(sql, { indent_size: 2 });&#xA;  &#xA;  console.log(&#39;Parsed:&#39;, parsed);&#xA;  console.log(&#39;Formatted:&#39;, formatted);&#xA;  &#xA;  if (!parsed || !formatted) {&#xA;    process.exit(1);&#xA;  }&#xA;  EOF&#xA;  &#xA;  ${pkgs.nodejs_20}/bin/node test-wasm.js&#xA;  &#xA;  touch $out&#xA;&#39;&#39;;&#xA;&#xA;Security and Reproducibility&#xA;&#xA;Pinning Dependencies&#xA;&#xA;One of the biggest challenges in software development is ensuring that your project builds the same way today as it will in six months. Dependencies get updated, APIs change, and suddenly your build breaks. Nix flakes solve this by pinning exact versions of all dependencies, including the Nix packages themselves. 
This ensures that your build is completely reproducible across time and different machines.&#xA;&#xA;{&#xA;  inputs = {&#xA;    nixpkgs.url = &#34;github:NixOS/nixpkgs/nixos-23.11&#34;;&#xA;    rust-overlay = {&#xA;      url = &#34;github:oxalica/rust-overlay&#34;;&#xA;      inputs.nixpkgs.follows = &#34;nixpkgs&#34;;&#xA;    };&#xA;  };&#xA;}&#xA;&#xA;Security Scanning&#xA;&#xA;Security vulnerabilities in dependencies are a major concern for production applications. This pattern integrates security scanning directly into your build process, automatically checking both Rust crates and npm packages for known vulnerabilities. By making security scanning part of your Nix flake, you ensure it runs consistently across all environments and can&#39;t be forgotten or skipped.&#xA;&#xA;securityScan = pkgs.runCommand &#34;security-scan&#34; {&#xA;  buildInputs = with pkgs; [ cargo-audit yarn nodejs_20 ];&#xA;} &#39;&#39;&#xA;  mkdir -p $out&#xA;  &#xA;  # Rust security audit&#xA;  cd ${./rust-core}&#xA;  cargo audit --json > $out/rust-audit.json&#xA;  &#xA;  # TypeScript/JavaScript security audit  &#xA;  cd ${./frontend}&#xA;  yarn audit --json > $out/yarn-audit.json || true&#xA;  &#xA;  # Check for known vulnerabilities in dependencies&#xA;  yarn audit --level moderate --json > $out/vulnerability-report.json || true&#xA;&#39;&#39;;&#xA;&#xA;Implementation Example&#xA;&#xA;This section presents a practical implementation based on patterns observed in production environments. 
The example demonstrates a JSON parsing library ecosystem that illustrates common multi-language coordination challenges and their solutions.&#xA;&#xA;Project Architecture&#xA;&#xA;Consider a JSON processing system consisting of:&#xA;&#xA;- A high-performance Rust library compiled to WebAssembly&#xA;- A primary web application for JSON editing and validation&#xA;- An analytics dashboard for JSON data processing&#xA;- Various integration widgets for platform embedding&#xA;&#xA;The following implementation demonstrates how Nix flakes address the coordination requirements:&#xA;&#xA;{&#xA;  description = &#34;My JSON tools ecosystem&#34;;&#xA;&#xA;  inputs = {&#xA;    nixpkgs.url = &#34;github:NixOS/nixpkgs/nixos-unstable&#34;;&#xA;    rust-overlay.url = &#34;github:oxalica/rust-overlay&#34;;&#xA;    &#xA;    # My Rust JSON parser&#xA;    json-parser.url = &#34;github:my-org/json-parser&#34;;&#xA;  };&#xA;&#xA;  outputs = { self, nixpkgs, rust-overlay, json-parser }:&#xA;    let&#xA;      system = &#34;x86_64-linux&#34;;&#xA;      pkgs = import nixpkgs {&#xA;        inherit system;&#xA;        overlays = [ rust-overlay.overlays.default ];&#xA;      };&#xA;&#xA;      # The WebAssembly version of my JSON parser&#xA;      jsonParserWasm = json-parser.packages.${system}.default;&#xA;&#xA;      # Main web application&#xA;      jsonEditor = pkgs.buildNpmPackage {&#xA;        pname = &#34;json-editor&#34;;&#xA;        version = &#34;1.0.0&#34;;&#xA;        src = ./editor;&#xA;        npmDepsHash = &#34;sha256-...&#34;;&#xA;        &#xA;        # This is where the magic happens - automatic WASM integration&#xA;        preBuild = &#39;&#39;&#xA;          mkdir -p src/utils/jsonparser&#xA;          cp -r ${jsonParserWasm}/pkg/* src/utils/jsonparser/&#xA;        &#39;&#39;;&#xA;        &#xA;        buildPhase = &#39;&#39;&#xA;          yarn build&#xA;        &#39;&#39;;&#xA;      };&#xA;&#xA;      # Analytics dashboard&#xA;      dashboard = pkgs.buildNpmPackage {&#xA;        pname = 
&#34;json-dashboard&#34;;&#xA;        version = &#34;1.0.0&#34;;&#xA;        src = ./dashboard;&#xA;        npmDepsHash = &#34;sha256-...&#34;;&#xA;        &#xA;        preBuild = &#39;&#39;&#xA;          mkdir -p src/utils/jsonparser&#xA;          cp -r ${jsonParserWasm}/pkg/* src/utils/jsonparser/&#xA;        &#39;&#39;;&#xA;      };&#xA;&#xA;      # Embeddable widget&#xA;      widget = pkgs.buildNpmPackage {&#xA;        pname = &#34;json-widget&#34;;&#xA;        version = &#34;1.0.0&#34;;&#xA;        src = ./widget;&#xA;        npmDepsHash = &#34;sha256-...&#34;;&#xA;        &#xA;        preBuild = &#39;&#39;&#xA;          mkdir -p src/utils/jsonparser&#xA;          cp -r ${jsonParserWasm}/pkg/* src/utils/jsonparser/&#xA;        &#39;&#39;;&#xA;      };&#xA;&#xA;    in&#xA;    {&#xA;      packages.${system} = {&#xA;        inherit jsonEditor dashboard widget;&#xA;        &#xA;        # Everything bundled together for deployment&#xA;        all-apps = pkgs.runCommand &#34;json-tools-deployment&#34; {} &#39;&#39;&#xA;          mkdir -p $out/{editor,dashboard,widget}&#xA;          cp -r ${jsonEditor}/* $out/editor/&#xA;          cp -r ${dashboard}/* $out/dashboard/&#xA;          cp -r ${widget}/* $out/widget/&#xA;        &#39;&#39;;&#xA;      };&#xA;&#xA;      # My daily development environment&#xA;      devShells.${system}.default = pkgs.mkShell {&#xA;        buildInputs = with pkgs; [&#xA;          nodejs_20&#xA;          yarn&#xA;          typescript&#xA;          eslint&#xA;          prettier&#xA;        ];&#xA;        &#xA;        shellHook = &#39;&#39;&#xA;          echo &#34;📝 JSON Tools Development Environment&#34;&#xA;          echo &#34;Available commands:&#34;&#xA;          echo &#34;  yarn editor:dev     - JSON editor dev server&#34;&#xA;          echo &#34;  yarn dashboard:dev  - Dashboard dev server&#34;  &#xA;          echo &#34;  yarn widget:dev     - Widget dev server&#34;&#xA;          echo &#34;  nix run json-parser#build-parser - Rebuild 
WASM&#34;&#xA;        &#39;&#39;;&#xA;      };&#xA;    };&#xA;}&#xA;&#xA;This architecture enables automatic propagation of changes from the Rust library to all consuming TypeScript applications, eliminating manual artifact management and reducing the risk of version inconsistencies.&#xA;&#xA;Development Workflow Analysis&#xA;&#xA;The Nix flakes approach transforms the typical development workflow through several key improvements:&#xA;&#xA;Environment Initialization: Developers begin work by executing nix develop, which provides a shell environment with precisely specified tool versions, eliminating version-related configuration issues.&#xA;&#xA;Core Library Development: Changes to the Rust library follow standard development practices, with testing via cargo test and version control through git branches.&#xA;&#xA;Frontend Integration: Frontend applications integrate updated WebAssembly modules through commands like yarn parser:build:dev, which automatically fetch, compile, and position artifacts according to the build configuration.&#xA;&#xA;Multi-Application Testing: Development server orchestration enables simultaneous testing across multiple applications:&#xA;&#xA;# Concurrent development server execution&#xA;nix run .#dev-all&#xA;&#xA;Environment-Specific Deployment: Deployment processes utilize environment-specific configurations:&#xA;&#xA;# Staging deployment&#xA;nix develop .#staging&#xA;yarn build:staging&#xA;&#xA;# Production deployment&#xA;nix develop .#production&#xA;yarn build:prod&#xA;&#xA;This workflow ensures consistent behavior across development machines, continuous integration systems, and deployment environments, effectively eliminating environment-specific build failures.&#xA;&#xA;Performance Considerations and Optimization&#xA;&#xA;Build Performance&#xA;&#xA;Parallel Builds&#xA;&#xA;Modern development machines have multiple CPU cores, but many build systems don&#39;t take full advantage of them. 
This configuration ensures that both Nix builds and Rust compilation use all available CPU cores, significantly reducing build times. This is especially important for large Rust projects where compilation can be the bottleneck in your development workflow.&#xA;&#xA;{&#xA;  # Enable parallel building&#xA;  nixConfig = {&#xA;    max-jobs = &#34;auto&#34;;&#xA;    cores = 0;  # Use all available cores&#xA;  };&#xA;}&#xA;&#xA;# In package definitions&#xA;buildRustPackage {&#xA;  # ... other attributes&#xA;  &#xA;  # Parallel cargo builds&#xA;  cargoBuildFlags = [ &#34;--jobs&#34; &#34;$NIX_BUILD_CORES&#34; ];&#xA;  &#xA;  # Parallel tests&#xA;  cargoTestFlags = [ &#34;--jobs&#34; &#34;$NIX_BUILD_CORES&#34; ];&#xA;}&#xA;&#xA;Build Caching Strategies&#xA;&#xA;This advanced pattern addresses one of the most frustrating aspects of Rust development: waiting for dependencies to recompile when they haven&#39;t actually changed. By separating dependency builds from source builds, you can cache the expensive dependency compilation step and only rebuild your actual source code when it changes.&#xA;&#xA;# Separate dependency builds from source builds&#xA;rustDependencies = pkgs.rustPlatform.buildRustPackage {&#xA;  pname = &#34;${pname}-deps&#34;;&#xA;  inherit version src cargoLock;&#xA;  &#xA;  # Build only dependencies&#xA;  buildPhase = &#39;&#39;&#xA;    cargo build --release --frozen&#xA;  &#39;&#39;;&#xA;  &#xA;  # Don&#39;t include source files in hash&#xA;  outputHashMode = &#34;recursive&#34;;&#xA;  outputHash = &#34;sha256-...&#34;;&#xA;};&#xA;&#xA;# Main package reuses dependency build&#xA;mainPackage = pkgs.rustPlatform.buildRustPackage {&#xA;  inherit pname version src cargoLock;&#xA;  &#xA;  preBuild = &#39;&#39;&#xA;    cp -r ${rustDependencies}/target .&#xA;    chmod -R +w target&#xA;  &#39;&#39;;&#xA;};&#xA;&#xA;Runtime Performance&#xA;&#xA;WebAssembly Optimization&#xA;&#xA;WebAssembly modules can be optimized for different goals: smaller file size for faster downloads, 
or faster execution speed. The Binaryen toolkit provides powerful optimization tools that can significantly improve your WebAssembly performance. This pattern lets you build different optimized versions for different deployment scenarios.&#xA;&#xA;optimizedWasm = pkgs.runCommand &#34;optimized-wasm&#34; {&#xA;  buildInputs = [ pkgs.binaryen ];&#xA;} &#39;&#39;&#xA;  mkdir -p $out&#xA;  &#xA;  # Size optimization&#xA;  wasm-opt -Oz ${inputWasm} -o $out/optimized.wasm&#xA;  &#xA;  # Speed optimization  &#xA;  wasm-opt -O3 ${inputWasm} -o $out/fast.wasm&#xA;  &#xA;  # Debug build&#xA;  cp ${inputWasm} $out/debug.wasm&#xA;&#39;&#39;;&#xA;&#xA;Memory Management&#xA;&#xA;WebAssembly has different memory constraints than native applications. This configuration helps you optimize memory usage during both compilation and runtime, which is crucial for WebAssembly modules that will run in memory-constrained environments like browsers or edge computing platforms.&#xA;&#xA;# Configure memory limits for builds&#xA;buildRustPackage {&#xA;  # ... other attributes&#xA;  &#xA;  # Limit memory usage during compilation&#xA;  NIX_CFLAGS_COMPILE = &#34;-Wl,--max-memory=2147483648&#34;;  # 2GB limit&#xA;  &#xA;  # Optimize for memory usage&#xA;  RUSTFLAGS = &#34;-C opt-level=s -C panic=abort&#34;;&#xA;}&#xA;&#xA;Debugging and Development Workflow&#xA;&#xA;Development Tools Integration&#xA;&#xA;IDE Support&#xA;&#xA;A good development environment includes all the tools you need for productive coding: language servers for intelligent code completion, debuggers for troubleshooting, and profiling tools for performance optimization. 
This configuration ensures that every developer on your team has access to the same high-quality development tools, regardless of their operating system or local setup.&#xA;&#xA;devShells.default = pkgs.mkShell {&#xA;  buildInputs = with pkgs; [&#xA;    # Language servers&#xA;    rust-analyzer&#xA;    typescript-language-server&#xA;    eslint_d&#xA;    &#xA;    # Debugging tools&#xA;    gdb&#xA;    lldb&#xA;    nodejs_20  # for Chrome DevTools and Node.js debugging&#xA;    &#xA;    # Development tools&#xA;    wasm-pack&#xA;    wasm-bindgen-cli&#xA;    &#xA;    # Profiling tools&#xA;    valgrind&#xA;    perf-tools&#xA;  ];&#xA;  &#xA;  shellHook = &#39;&#39;&#xA;    # Configure rust-analyzer&#xA;    export RUST_SRC_PATH=&#34;${pkgs.rust.packages.stable.rustPlatform.rustLibSrc}&#34;&#xA;    &#xA;    # Set up debugging symbols for Rust&#xA;    export RUSTFLAGS=&#34;-C debuginfo=2&#34;&#xA;    &#xA;    # Configure TypeScript development&#xA;    export NODE_OPTIONS=&#34;--openssl-legacy-provider&#34;&#xA;    &#xA;    echo &#34;🔧 Development environment ready&#34;&#xA;    echo &#34;Rust tools: rust-analyzer, cargo, wasm-pack&#34;&#xA;    echo &#34;TypeScript tools: typescript-language-server, eslint&#34;&#xA;    echo &#34;Debugging: gdb, lldb, Chrome DevTools&#34;&#xA;  &#39;&#39;;&#xA;};&#xA;&#xA;Hot Reloading&#xA;&#xA;During development, you want to see changes immediately without manually rebuilding everything. This setup watches your Rust files for changes and automatically recompiles the WebAssembly module, while simultaneously running a frontend development server that hot-reloads when the WebAssembly module updates. 
This creates a smooth development experience where changes in Rust code are immediately visible in your web application.&#xA;&#xA;# Development server with hot reloading&#xA;devServer = pkgs.writeShellScriptBin &#34;dev-server&#34; &#39;&#39;&#xA;  # Start Rust file watcher&#xA;  ${pkgs.cargo-watch}/bin/cargo-watch -x &#39;build --target wasm32-unknown-unknown&#39; &amp;&#xA;  RUST_PID=$!&#xA;  &#xA;  # Start frontend dev server&#xA;  cd frontend&#xA;  ${pkgs.nodejs}/bin/npm run dev &amp;&#xA;  FRONTEND_PID=$!&#xA;  &#xA;  # Cleanup on exit&#xA;  trap &#34;kill $RUST_PID $FRONTEND_PID&#34; EXIT&#xA;  wait&#xA;&#39;&#39;;&#xA;&#xA;Testing Strategies&#xA;&#xA;Cross-Language Testing&#xA;&#xA;This comprehensive testing approach ensures that your entire multi-language stack works correctly. It tests your Rust code natively, verifies that the WebAssembly compilation works properly, checks that the TypeScript integration layer functions correctly, and runs end-to-end tests that simulate real user interactions. This catches issues that might only appear when all the pieces work together.&#xA;&#xA;crossLanguageTests = pkgs.runCommand &#34;cross-language-tests&#34; {&#xA;  buildInputs = with pkgs; [&#xA;    rustPackage&#xA;    nodePackage&#xA;    typescriptPackage&#xA;    yarn&#xA;  ];&#xA;} &#39;&#39;&#xA;  # Test Rust library&#xA;  cd ${rustSrc}&#xA;  cargo test --release&#xA;  &#xA;  # Test WebAssembly bindings&#xA;  cd ${wasmSrc}&#xA;  ${pkgs.nodejs_20}/bin/node test-wasm.js&#xA;  &#xA;  # Test TypeScript integration&#xA;  cd ${typescriptSrc}&#xA;  yarn test&#xA;  yarn test:integration&#xA;  &#xA;  # Run end-to-end tests&#xA;  yarn test:e2e&#xA;  &#xA;  touch $out&#xA;&#39;&#39;;&#xA;&#xA;Deployment and Production Considerations&#xA;&#xA;Container Images&#xA;&#xA;Multi-Stage Builds&#xA;&#xA;Production container images should be as small and secure as possible. 
This pattern creates minimal container images that include only the runtime dependencies needed to run your application, without any of the build tools or development dependencies. This reduces attack surface, improves startup times, and reduces bandwidth costs for deployment.&#xA;&#xA;productionImage = pkgs.dockerTools.buildLayeredImage {&#xA;  name = &#34;production-app&#34;;&#xA;  tag = &#34;latest&#34;;&#xA;  &#xA;  contents = [&#xA;    # Runtime dependencies only&#xA;    pkgs.cacert&#xA;    pkgs.tzdata&#xA;  ];&#xA;  &#xA;  config = {&#xA;    Cmd = [ &#34;${finalPackage}/bin/app&#34; ];&#xA;    Env = [&#xA;      &#34;SSL_CERT_FILE=${pkgs.cacert}/etc/ssl/certs/ca-bundle.crt&#34;&#xA;    ];&#xA;    ExposedPorts = {&#xA;      &#34;8080/tcp&#34; = {};&#xA;    };&#xA;  };&#xA;};&#xA;&#xA;Security Hardening&#xA;&#xA;Security is crucial for production deployments. This configuration creates container images that follow security best practices: running as a non-root user, using minimal base images that reduce attack surface, and including proper metadata for security scanning and compliance. These practices help protect your application from common container-based attacks.&#xA;&#xA;hardenedImage = pkgs.dockerTools.buildImage {&#xA;  name = &#34;hardened-app&#34;;&#xA;  &#xA;  # Minimal base with security updates&#xA;  contents = [ pkgs.distroless ];&#xA;  &#xA;  config = {&#xA;    User = &#34;65534:65534&#34;;  # nobody user&#xA;    Cmd = [ &#34;${securePackage}/bin/app&#34; ];&#xA;    &#xA;    # Security labels&#xA;    Labels = {&#xA;      &#34;org.opencontainers.image.source&#34; = &#34;https://github.com/org/repo&#34;;&#xA;      &#34;org.opencontainers.image.licenses&#34; = &#34;MIT&#34;;&#xA;    };&#xA;  };&#xA;};&#xA;&#xA;Continuous Integration&#xA;&#xA;GitHub Actions Integration&#xA;&#xA;Continuous integration ensures that your code works correctly across different environments and catches problems before they reach production. 
This GitHub Actions configuration leverages Nix&#39;s reproducibility to create CI builds that are identical to your local development environment, while using caching to keep build times fast.&#xA;&#xA;.github/workflows/build.yml&#xA;name: Build and Test&#xA;&#xA;on: [push, pull_request]&#xA;&#xA;jobs:&#xA;  build:&#xA;    runs-on: ubuntu-latest&#xA;    steps:&#xA;      - uses: actions/checkout@v3&#xA;      - uses: cachix/install-nix-action@v20&#xA;      - uses: cachix/cachix-action@v12&#xA;        with:&#xA;          name: your-cache&#xA;          authToken: &#39;${{ secrets.CACHIX_AUTH_TOKEN }}&#39;&#xA;      &#xA;      - name: Build all packages&#xA;        run: nix build .#all&#xA;      &#xA;      - name: Run tests&#xA;        run: nix flake check&#xA;      &#xA;      - name: Build Docker images&#xA;        run: nix build .#dockerImages&#xA;&#xA;Future Directions and Advanced Topics&#xA;&#xA;Emerging Patterns&#xA;&#xA;WebAssembly System Interface (WASI)&#xA;&#xA;WASI represents the future of WebAssembly beyond the browser. It provides a standardized system interface that allows WebAssembly modules to interact with the operating system in a secure, sandboxed way. This enables WebAssembly applications to run on servers, edge computing platforms, and other environments where you need system access but want the security and portability benefits of WebAssembly.&#xA;&#xA;wasiPackage = pkgs.rustPlatform.buildRustPackage {&#xA;  # ... standard attributes&#xA;  &#xA;  # WASI target&#xA;  CARGO_BUILD_TARGET = &#34;wasm32-wasi&#34;;&#xA;  &#xA;  # WASI runtime for testing&#xA;  nativeCheckInputs = [ pkgs.wasmtime ];&#xA;  &#xA;  checkPhase = &#39;&#39;&#xA;    wasmtime target/wasm32-wasi/release/${pname}.wasm&#xA;  &#39;&#39;;&#xA;};&#xA;&#xA;Edge Computing Deployment&#xA;&#xA;Edge computing brings computation closer to users for better performance and reduced latency. 
WebAssembly is particularly well-suited for edge deployment because of its small size, fast startup times, and security sandbox. This pattern packages your WebAssembly modules and static assets for deployment to edge computing platforms like Cloudflare Workers, Fastly Compute@Edge, or AWS Lambda@Edge.&#xA;&#xA;edgeDeployment = pkgs.runCommand &#34;edge-deployment&#34; {} &#39;&#39;&#xA;  mkdir -p $out/{functions,assets}&#xA;  &#xA;  # WebAssembly functions for edge runtime&#xA;  cp ${wasmPackage}/pkg/*.wasm $out/functions/&#xA;  &#xA;  # Static assets&#xA;  cp -r ${frontendPackage}/* $out/assets/&#xA;  &#xA;  # Edge configuration&#xA;  cat &gt; $out/edge-config.json &lt;&lt; EOF&#xA;  {&#xA;    &#34;functions&#34;: {&#xA;      &#34;api/&#34;: &#34;functions/api.wasm&#34;,&#xA;      &#34;compute/&#34;: &#34;functions/compute.wasm&#34;&#xA;    },&#xA;    &#34;assets&#34;: &#34;assets/&#34;&#xA;  }&#xA;  EOF&#xA;&#39;&#39;;&#xA;&#xA;Integration with Other Ecosystems&#xA;&#xA;Kubernetes Operators&#xA;&#xA;Kubernetes operators extend Kubernetes with custom application-specific logic. By embedding WebAssembly modules directly into your operators, you can create powerful, efficient controllers that leverage the performance and security benefits of WebAssembly while maintaining the operational benefits of Kubernetes. 
This is particularly useful for complex data processing or policy enforcement scenarios.&#xA;&#xA;k8sOperator = pkgs.buildGoModule {&#xA;  pname = &#34;app-operator&#34;;&#xA;  version = &#34;1.0.0&#34;;&#xA;  src = ./operator;&#xA;  &#xA;  # Include WebAssembly modules as embedded resources&#xA;  preBuild = &#39;&#39;&#xA;    mkdir -p assets&#xA;    cp ${wasmPackage}/pkg/*.wasm assets/&#xA;  &#39;&#39;;&#xA;  &#xA;  ldflags = [&#xA;    &#34;-X main.version=1.0.0&#34;&#xA;    &#34;-X main.wasmAssets=assets&#34;&#xA;  ];&#xA;};&#xA;&#xA;Serverless Functions&#xA;&#xA;Serverless platforms are increasingly supporting WebAssembly as a more efficient alternative to traditional JavaScript functions. WebAssembly functions start faster, use less memory, and can provide better performance for compute-intensive tasks. This pattern shows how to package your Rust-based WebAssembly modules for deployment to serverless platforms like AWS Lambda, Vercel Functions, or Netlify Functions.&#xA;&#xA;serverlessFunction = pkgs.runCommand &#34;serverless-function&#34; {} &#39;&#39;&#xA;  mkdir -p $out&#xA;  &#xA;  # Package WebAssembly for serverless runtime&#xA;  cat &gt; $out/function.js &lt;&lt; EOF&#xA;  const wasm = require(&#39;./module.wasm&#39;);&#xA;  &#xA;  exports.handler = async (event) =&gt; {&#xA;    const result = await wasm.process(event.data);&#xA;    return { statusCode: 200, body: result };&#xA;  };&#xA;  EOF&#xA;  &#xA;  cp ${wasmPackage}/pkg/module.wasm $out/&#xA;&#39;&#39;;&#xA;&#xA;Analysis and Conclusions&#xA;&#xA;The implementation patterns presented demonstrate how Nix flakes address fundamental challenges in multi-language development environments. 
This analysis is based on observations from production systems where these approaches have been successfully deployed.&#xA;&#xA;Technical Benefits&#xA;&#xA;The Nix flakes approach provides several measurable improvements over traditional multi-language development setups:&#xA;&#xA;Reproducible Builds: Declarative dependency specification ensures identical build environments across development, testing, and production systems, eliminating environment-specific failures.&#xA;&#xA;Automated Coordination: Programmatic integration between Rust compilation and TypeScript build processes reduces manual intervention and associated error rates.&#xA;&#xA;Version Consistency: Centralized toolchain management prevents version conflicts and ensures all team members use identical development environments.&#xA;&#xA;Branch Synchronization: Git-based dependency resolution automatically coordinates library versions with consuming applications across different development branches.&#xA;&#xA;Implementation Considerations&#xA;&#xA;Learning Curve: Nix flakes require initial investment in understanding declarative configuration concepts and Nix expression language syntax.&#xA;&#xA;Incremental Adoption: Organizations can implement Nix flakes gradually, beginning with critical compilation pipelines and expanding to encompass broader development workflows.&#xA;&#xA;Toolchain Integration: The approach integrates effectively with existing development tools while providing superior coordination capabilities.&#xA;&#xA;Scalability: The pattern scales effectively from single-developer projects to large multi-repository systems with complex dependency relationships.&#xA;&#xA;Operational Impact&#xA;&#xA;Teams implementing these patterns typically observe:&#xA;&#xA;Reduced onboarding time for new developers&#xA;Decreased environment-related build failures&#xA;Improved consistency across development and deployment environments&#xA;Enhanced collaboration efficiency through standardized 
toolchains&#xA;&#xA;Applicability Assessment&#xA;&#xA;Nix flakes provide particular value for projects that:&#xA;&#xA;Combine multiple programming languages with complex build requirements&#xA;Require WebAssembly compilation and integration workflows&#xA;Involve multiple repositories with interdependent components&#xA;Need consistent environments across diverse development and deployment contexts&#xA;&#xA;The approach represents a significant improvement over traditional multi-language development coordination methods, particularly for teams working with Rust-to-TypeScript integration patterns. While the initial learning investment is substantial, the long-term benefits in terms of reliability, reproducibility, and development efficiency justify adoption for suitable project contexts.&#xA;&#xA;Building Your First Multi-Language Project: A Practical Guide&#xA;&#xA;Let&#39;s walk through creating a real working example that demonstrates these concepts in action. We&#39;ll build a JSON processing application where the heavy lifting is done by a Rust library compiled to WebAssembly, consumed by a TypeScript web application.&#xA;&#xA;What You&#39;ll Need&#xA;&#xA;Before we start, make sure you have:&#xA;&#xA;Nix package manager installed with flakes enabled (installation guide)&#xA;Basic familiarity with Rust and TypeScript (don&#39;t worry, we&#39;ll explain the tricky parts)&#xA;About 30 minutes to work through this example&#xA;&#xA;Step 1: Building the Rust Core&#xA;&#xA;Let&#39;s start by creating the performance-critical part of our application - a Rust library that can parse and manipulate JSON data.&#xA;&#xA;Create a workspace for our project&#xA;mkdir json-tools-project&#xA;cd json-tools-project&#xA;&#xA;Create the Rust library&#xA;mkdir json-parser&#xA;cd json-parser&#xA;mkdir src&#xA;&#xA;Now we&#39;ll create the core files. 
The beauty of this approach is that once you set up the Nix flake, anyone on your team can build this project identically, regardless of what Rust version they have installed (or even if they have Rust installed at all).&#xA;&#xA;Create the files as detailed in the Rust WebAssembly section above. The key insight here is that our flake.nix file is doing something powerful - it&#39;s not just managing dependencies, it&#39;s creating a completely reproducible build environment.&#xA;&#xA;Let&#39;s test our Rust library:&#xA;&#xA;This command creates a development shell with all the right tools&#xA;nix develop&#xA;&#xA;Generate the lock file (this pins our Rust dependencies)&#xA;cargo generate-lockfile&#xA;&#xA;Exit the development shell and build the WebAssembly module&#xA;exit&#xA;nix build&#xA;&#xA;Check that we got our WebAssembly files&#xA;ls result/pkg/  # You should see .wasm, .js, and .d.ts files&#xA;&#xA;What just happened? Nix downloaded the exact version of Rust we specified, compiled our library to WebAssembly, generated TypeScript bindings, and packaged everything up. If you share this code with a teammate, they&#39;ll get exactly the same result.&#xA;&#xA;Step 2: Creating the Web Application&#xA;&#xA;Now let&#39;s build a web application that uses our Rust library. The magic here is that our web app will automatically get the latest version of our WebAssembly module whenever we rebuild it.&#xA;&#xA;Go back to our project root&#xA;cd ..&#xA;mkdir web-app&#xA;cd web-app&#xA;&#xA;Create the directory structure&#xA;mkdir -p src/{components,hooks,utils/json_parser}&#xA;&#xA;Create all the frontend files as shown in the Frontend Integration section. The key file here is our flake.nix - it contains a script that automatically builds our Rust library and copies the WebAssembly files to the right place in our web app.&#xA;&#xA;Step 3: Connecting Everything Together&#xA;&#xA;This is where the magic happens. 
Instead of manually copying files around, we&#39;ll create a symbolic link that tells our web app where to find our Rust library:&#xA;&#xA;Create a link to our Rust project&#xA;ln -s ../json-parser json-parser-flake&#xA;&#xA;This might seem simple, but it&#39;s solving a major problem in multi-language development: keeping different parts of your project in sync.&#xA;&#xA;Step 4: Running Your Application&#xA;&#xA;Now for the moment of truth - let&#39;s see our multi-language application in action:&#xA;&#xA;Enter the development environment (this gives us Node.js, Yarn, etc.)&#xA;nix develop&#xA;&#xA;Install our JavaScript dependencies&#xA;yarn install&#xA;&#xA;Build the WebAssembly module and copy it to our web app&#xA;yarn parser:build&#xA;&#xA;Start the development server&#xA;yarn dev&#xA;&#xA;Open your browser to http://localhost:8080. You should see a JSON editor that&#39;s powered by Rust running in WebAssembly!&#xA;&#xA;Step 5: Testing the Integration&#xA;&#xA;Let&#39;s verify that everything is working correctly:&#xA;&#xA;Try valid JSON: Paste {&#34;name&#34;: &#34;test&#34;, &#34;value&#34;: 123} and click &#34;Validate JSON&#34; - you should see a green success message&#xA;&#xA;Try invalid JSON: Enter {name: &#34;test&#34;} (missing quotes) and click &#34;Validate JSON&#34; - you should see a red error message&#xA;&#xA;Test formatting: Enter some messy JSON and click &#34;Format JSON&#34; - it should be nicely formatted&#xA;&#xA;Check the console: Open your browser&#39;s developer tools - you shouldn&#39;t see any errors&#xA;&#xA;What You&#39;ve Accomplished&#xA;&#xA;Congratulations! 
You&#39;ve just built a multi-language application with some pretty sophisticated features:&#xA;&#xA;Reproducible builds: Anyone can clone your project and get exactly the same development environment&#xA;Automatic integration: Changes to your Rust code automatically flow through to your web application&#xA;Type safety: Your TypeScript code has full type information about your Rust functions&#xA;Performance: Critical JSON processing happens at near-native speed thanks to WebAssembly&#xA;&#xA;Next Steps&#xA;&#xA;This foundation can be extended in many directions:&#xA;&#xA;Add more complex Rust functionality (maybe a SQL parser or image processing)&#xA;Create multiple web applications that share the same Rust core&#xA;Set up automatic testing that verifies the Rust-TypeScript integration&#xA;Deploy to production with the same reproducible build process&#xA;&#xA;The patterns you&#39;ve learned here scale from simple projects like this one all the way up to complex multi-repository systems with dozens of interconnected components.&#xA;&#xA;Beyond WebAssembly: Rust FFI and Native Libraries&#xA;&#xA;While this article focuses on WebAssembly, it&#39;s worth noting that Rust&#39;s capabilities extend far beyond the browser. Rust can compile to shared libraries (.so files on Linux, .dylib on macOS, .dll on Windows) that can be used from virtually any programming language through Foreign Function Interface (FFI).&#xA;&#xA;This means you can write performance-critical code in Rust and use it from Python, Node.js, Ruby, Go, or any other language that supports FFI. Nix makes this even more powerful by ensuring that your shared libraries are built consistently and can be easily distributed to different target platforms.&#xA;&#xA;For example, you might have a Rust library that handles cryptographic operations, compiled as both a WebAssembly module for browser use and as native shared libraries for server-side applications. 
Nix flakes can manage both build targets simultaneously, ensuring consistency across your entire technology stack.&#xA;&#xA;Useful Resources and Further Reading&#xA;&#xA;Getting Started with Nix&#xA;&#xA;Zero to Nix - An excellent beginner-friendly guide to learning Nix from scratch. This is the best place to start if you&#39;re new to Nix and want to understand the fundamentals before diving into flakes.&#xA;Lix - A community-driven alternative implementation of Nix that aims to improve upon the original with better performance, clearer error messages, and enhanced developer experience.&#xA;Nix Flakes Documentation - Official documentation for Nix flakes, covering all the technical details.&#xA;&#xA;Rust and WebAssembly&#xA;&#xA;The Rust and WebAssembly Book - Comprehensive guide to using Rust with WebAssembly, including best practices and optimization techniques.&#xA;Rust FFI Guide - Learn how to create shared libraries from Rust code that can be used by other programming languages.&#xA;&#xA;Multi-Language Development&#xA;&#xA;Nix Pills - Deep dive into Nix concepts and philosophy, helpful for understanding the underlying principles.&#xA;devenv - A higher-level tool built on Nix that simplifies creating development environments for multi-language projects.&#xA;&#xA;Advanced Topics&#xA;&#xA;Nix Cross Compilation - Learn how to use Nix to build software for different target platforms.&#xA;NixOS Modules - For when you want to extend beyond development environments into full system configuration.&#xA;Flake Parts - A framework for organizing complex Nix flakes in a modular way.&#xA;&#xA;These resources will help you deepen your understanding of the concepts presented in this article and explore more advanced use cases for multi-language development with Nix.]]&gt;</description>
<content:encoded><![CDATA[<h2 id="introduction">Introduction</h2>

<p>Modern software development frequently requires coordinating multiple programming languages within a single project ecosystem. In this article I describe the challenges of managing Rust-to-TypeScript workflows and present a solution using Nix flakes, based on patterns observed in real-world development projects.</p>

<p>The examples presented here are hypothetical but derived from actual implementation patterns used in production environments where high-performance Rust libraries are compiled to WebAssembly and consumed by TypeScript frontend applications.</p>

<h2 id="common-challenges-in-multi-language-development">Common Challenges in Multi-Language Development</h2>

<p>Projects that combine Rust and TypeScript typically encounter several coordination challenges:</p>
<ul><li><strong>Toolchain Management</strong>: Different components require specific versions of Rust, Node.js, and associated build tools</li>
<li><strong>Build Reproducibility</strong>: Ensuring consistent compilation results across development, CI, and production environments</li>
<li><strong>Dependency Coordination</strong>: Managing the interface between Rust compilation outputs and TypeScript build processes</li>
<li><strong>Environment Consistency</strong>: Maintaining identical development setups across team members</li>
<li><strong>Branch Synchronization</strong>: Coordinating different versions of core libraries with corresponding frontend applications</li>
<li><strong>Manual Integration Steps</strong>: Copying build artifacts between repositories and maintaining build scripts</li></ul>

<p>Traditional approaches using Docker containers, shell scripts, or manual environment management often prove insufficient for addressing the full scope of these coordination challenges.</p>



<h2 id="project-structure-analysis">Project Structure Analysis</h2>

<p>Consider a typical multi-repository project structure combining Rust and TypeScript components:</p>

<pre><code>project-ecosystem/
├── json-parser/              # Rust library for WebAssembly compilation
│   ├── Cargo.toml
│   ├── src/lib.rs
│   └── build.sh              # Build coordination script
├── web-application/          # Primary TypeScript application
│   ├── package.json
│   ├── webpack.config.js
│   ├── src/
│   └── src/utils/json_core/  # WebAssembly integration directory
├── dashboard/                # Secondary React application
│   ├── package.json
│   └── src/
└── integrations/             # Platform-specific TypeScript components
    ├── widget-a/
    ├── widget-b/
    └── iframe-embed/
</code></pre>

<p>This structure presents several operational challenges:</p>
<ul><li><strong>Toolchain Synchronization</strong>: Each developer must maintain compatible versions of Rust, Node.js, wasm-pack, and related tools</li>
<li><strong>Build Script Complexity</strong>: Coordination scripts become increasingly complex as they handle cross-compilation, file copying, and dependency management</li>
<li><strong>Manual Artifact Management</strong>: WebAssembly files must be manually transferred from Rust build outputs to TypeScript project directories</li>
<li><strong>Branch Coordination</strong>: Different development branches require corresponding versions of core libraries</li>
<li><strong>CI/CD Complexity</strong>: Continuous integration systems require complex Docker configurations and dependency caching strategies</li></ul>

<h2 id="nix-flakes-as-a-solution-framework">Nix Flakes as a Solution Framework</h2>

<p>Nix flakes provide a declarative approach to managing multi-language development environments. The solution addresses the coordination challenges through:</p>
<ul><li><strong>Unified Dependency Declaration</strong>: Single configuration files that specify exact tool versions and dependencies</li>
<li><strong>Environment Reproducibility</strong>: Identical development environments across all team members and deployment targets</li>
<li><strong>Automated Build Coordination</strong>: Programmatic integration between Rust compilation and TypeScript build processes</li>
<li><strong>Branch-Based Dependency Management</strong>: Dynamic dependency resolution based on git branch references</li></ul>

<h2 id="foundational-flake-structure">Foundational Flake Structure</h2>

<p>A basic Nix flake for multi-language development begins with input declarations:</p>

<pre><code class="language-nix">{
  description = &#34;Multi-language development environment&#34;;

  inputs = {
    nixpkgs.url = &#34;github:NixOS/nixpkgs/nixos-unstable&#34;;
    rust-overlay.url = &#34;github:oxalica/rust-overlay&#34;;
    flake-utils.url = &#34;github:numtide/flake-utils&#34;;
  };

  outputs = { self, nixpkgs, rust-overlay, flake-utils }:
    { }; # Implementation details follow
}
</code></pre>

<p>The inputs section declares external dependencies required for the project. The rust-overlay provides access to specific Rust toolchain versions, while flake-utils reduces boilerplate code.</p>

<p>Overlays enable precise version control for language-specific tools:</p>

<pre><code class="language-nix">let
  overlays = [
    rust-overlay.overlays.default
    (final: prev: {
      # Specify exact Node.js version
      nodejs = prev.nodejs_20;
      yarn = prev.yarn.override { nodejs = prev.nodejs_20; };
    })
  ];
  
  pkgs = import nixpkgs {
    inherit system overlays;
  };
in
</code></pre>

<p>This approach eliminates version conflicts and ensures consistent toolchain availability across development environments.</p>
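<p>To illustrate how the pinned versions propagate, a development shell defined against this <code>pkgs</code> automatically picks up the overlay&#39;s choices. The sketch below assumes the standard flake output conventions used later in this article:</p>

<pre><code class="language-nix">devShells.default = pkgs.mkShell {
  packages = [
    pkgs.nodejs  # resolved to nodejs_20 by the overlay above
    pkgs.yarn    # built against the same pinned Node.js
  ];
};
</code></pre>

<p>Every developer entering this shell with <code>nix develop</code> receives the same Node.js 20 and a matching Yarn, regardless of what is installed on the host system.</p>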

<h2 id="rust-to-webassembly-compilation-pipeline">Rust-to-WebAssembly Compilation Pipeline</h2>

<p>The core challenge in Rust-to-TypeScript integration involves establishing a reliable compilation pipeline that handles:</p>
<ol><li><p><strong>Rust Library Development</strong> – Performance-critical code implementation</p></li>

<li><p><strong>WebAssembly Cross-Compilation</strong> – Targeting the <code>wasm32-unknown-unknown</code> platform</p></li>

<li><p><strong>TypeScript Binding Generation</strong> – Creating type-safe interfaces using wasm-bindgen</p></li>

<li><p><strong>Build System Integration</strong> – Automating artifact delivery to frontend applications</p></li>

<li><p><strong>Multi-Branch Coordination</strong> – Supporting different development environments</p></li></ol>

<p>The following example demonstrates a comprehensive flake implementation for WebAssembly compilation. First, let&#39;s establish the required project structure:</p>

<p><strong>Project Structure:</strong></p>

<pre><code>json-parser/
├── flake.nix
├── Cargo.toml
├── Cargo.lock
└── src/
    └── lib.rs
</code></pre>

<p><strong>Cargo.toml:</strong></p>

<pre><code class="language-toml">[package]
name = &#34;json-parser&#34;
version = &#34;0.1.0&#34;
edition = &#34;2021&#34;

[lib]
crate-type = [&#34;cdylib&#34;]

[dependencies]
wasm-bindgen = &#34;0.2&#34;
serde = { version = &#34;1.0&#34;, features = [&#34;derive&#34;] }
serde_json = &#34;1.0&#34;

[dependencies.web-sys]
version = &#34;0.3&#34;
features = [
  &#34;console&#34;,
]
</code></pre>

<p><strong>src/lib.rs:</strong></p>

<pre><code class="language-rust">use wasm_bindgen::prelude::*;

#[wasm_bindgen]
extern &#34;C&#34; {
    #[wasm_bindgen(js_namespace = console)]
    fn log(s: &amp;str);
}

#[wasm_bindgen]
pub struct ParseOptions {
    indent_size: u32,
    sort_keys: bool,
}

#[wasm_bindgen]
impl ParseOptions {
    #[wasm_bindgen(constructor)]
    pub fn new(indent_size: u32, sort_keys: bool) -&gt; ParseOptions {
        ParseOptions { indent_size, sort_keys }
    }

    #[wasm_bindgen(getter)]
    pub fn indent_size(&amp;self) -&gt; u32 {
        self.indent_size
    }

    #[wasm_bindgen(getter)]
    pub fn sort_keys(&amp;self) -&gt; bool {
        self.sort_keys
    }
}

#[wasm_bindgen]
pub fn parse_json(json_string: &amp;str) -&gt; Result&lt;String, JsValue&gt; {
    let value: serde_json::Value = serde_json::from_str(json_string)
        .map_err(|e| JsValue::from_str(&amp;e.to_string()))?;
    
    serde_json::to_string(&amp;value)
        .map_err(|e| JsValue::from_str(&amp;e.to_string()))
}

#[wasm_bindgen]
pub fn stringify_json(json_string: &amp;str, options: &amp;ParseOptions) -&gt; Result&lt;String, JsValue&gt; {
    let value: serde_json::Value = serde_json::from_str(json_string)
        .map_err(|e| JsValue::from_str(&amp;e.to_string()))?;
    
    if options.sort_keys {
        // For simplicity, just return pretty-printed JSON
        serde_json::to_string_pretty(&amp;value)
            .map_err(|e| JsValue::from_str(&amp;e.to_string()))
    } else {
        serde_json::to_string(&amp;value)
            .map_err(|e| JsValue::from_str(&amp;e.to_string()))
    }
}

#[wasm_bindgen]
pub fn validate_json(json_string: &amp;str) -&gt; bool {
    serde_json::from_str::&lt;serde_json::Value&gt;(json_string).is_ok()
}
</code></pre>

<p><strong>flake.nix:</strong></p>

<pre><code class="language-nix">{
  description = &#34;JSON parser WebAssembly package&#34;;

  inputs = {
    nixpkgs.url = &#34;github:NixOS/nixpkgs/nixos-unstable&#34;;
    rust-overlay.url = &#34;github:oxalica/rust-overlay&#34;;
    flake-utils.url = &#34;github:numtide/flake-utils&#34;;
  };

  outputs = { self, nixpkgs, rust-overlay, flake-utils }:
    flake-utils.lib.eachDefaultSystem (system:
      let
        overlays = [ (import rust-overlay) ];
        pkgs = import nixpkgs {
          inherit system overlays;
        };

        # Rust toolchain configuration with WebAssembly target
        rustToolchain = pkgs.rust-bin.stable.latest.default.override {
          extensions = [ &#34;rust-src&#34; &#34;clippy&#34; &#34;rustfmt&#34; ];
          targets = [ &#34;wasm32-unknown-unknown&#34; ];
        };

        # WebAssembly build function
        buildJsonParserWasm = pkgs.rustPlatform.buildRustPackage rec {
          pname = &#34;json-parser&#34;;
          version = &#34;0.1.0&#34;;
          src = ./.;

          cargoLock = {
            lockFile = ./Cargo.lock;
          };

          nativeBuildInputs = with pkgs; [
            rustToolchain
            wasm-bindgen-cli
            pkg-config
          ];

          buildInputs = with pkgs; [
            openssl
          ] ++ pkgs.lib.optionals pkgs.stdenv.isDarwin [
            pkgs.darwin.apple_sdk.frameworks.Security
            pkgs.darwin.apple_sdk.frameworks.SystemConfiguration
          ];

          # Custom build phase for WebAssembly compilation
          buildPhase = &#39;&#39;
            runHook preBuild

            export HOME=$TMPDIR
            export CARGO_HOME=$TMPDIR/.cargo

            # Build the WebAssembly binary
            cargo build --lib --release --target wasm32-unknown-unknown

            # Generate TypeScript bindings
            mkdir -p pkg
            wasm-bindgen target/wasm32-unknown-unknown/release/json_parser.wasm \
              --out-dir pkg \
              --target web \
              --typescript

            runHook postBuild
          &#39;&#39;;

          # Install WebAssembly artifacts
          installPhase = &#39;&#39;
            runHook preInstall
            mkdir -p $out/pkg
            cp -r pkg/* $out/pkg/
            runHook postInstall
          &#39;&#39;;

          doCheck = false;
        };

      in
      {
        packages.default = buildJsonParserWasm;

        # Development environment
        devShells.default = pkgs.mkShell {
          buildInputs = with pkgs; [
            rustToolchain
            wasm-bindgen-cli
            pkg-config
            openssl
          ];

          shellHook = &#39;&#39;
            echo &#34;JSON Parser WebAssembly development environment&#34;
            echo &#34;Available commands:&#34;
            echo &#34;  cargo build --target wasm32-unknown-unknown&#34;
            echo &#34;  cargo test&#34;
            echo &#34;  nix build  # Build WASM package&#34;
          &#39;&#39;;
        };
      });
}
</code></pre>

<p>To use this, create the directory structure, add the files above, then run:</p>

<pre><code class="language-bash"># Generate Cargo.lock
nix develop
cargo generate-lockfile

# Build the WebAssembly package
nix build
</code></pre>
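<p>Once the package builds, other flakes can consume it directly by git reference, which is one way to implement the branch coordination discussed earlier. The repository URL and branch below are illustrative:</p>

<pre><code class="language-nix">{
  inputs = {
    nixpkgs.url = &#34;github:NixOS/nixpkgs/nixos-unstable&#34;;
    # Switching the branch here switches which WebAssembly
    # artifacts downstream applications receive
    json-parser.url = &#34;github:example-org/json-parser/main&#34;;
  };

  outputs = { self, nixpkgs, json-parser }: {
    # json-parser.packages.&lt;system&gt;.default exposes the pkg/
    # directory with the generated .wasm, .js, and .d.ts files
  };
}
</code></pre>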

<h3 id="advanced-webassembly-build-patterns">Advanced WebAssembly Build Patterns</h3>

<h4 id="multi-target-builds">Multi-Target Builds</h4>

<p>When building WebAssembly modules, you often need to support different JavaScript environments. Each target (web browsers, Node.js, bundlers like Webpack) has different requirements for how the WASM module is loaded and initialized. This pattern allows you to build all variants simultaneously, ensuring your library works across all deployment scenarios.</p>

<pre><code class="language-nix">buildMultiTargetWasm = { pname, src }:
  let
    targets = [
      { name = &#34;web&#34;; target = &#34;web&#34;; }
      { name = &#34;nodejs&#34;; target = &#34;nodejs&#34;; }
      { name = &#34;bundler&#34;; target = &#34;bundler&#34;; }
    ];
    
    buildTarget = { name, target }: pkgs.runCommand &#34;${pname}-${name}&#34; {
      nativeBuildInputs = [ rustToolchain pkgs.wasm-pack ];
    } &#39;&#39;
      cp -r ${src} source
      chmod -R u+w source
      cd source
      export HOME=$TMPDIR
      # Write each target into its own subdirectory so the
      # symlinkJoin below does not merge colliding file names
      wasm-pack build --target ${target} --out-dir $out/${name}
    &#39;&#39;;
  in
  pkgs.symlinkJoin {
    name = &#34;${pname}-multi-target&#34;;
    paths = map buildTarget targets;
  };
</code></pre>
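<p>A hypothetical invocation, assuming the helper is defined in the flake&#39;s <code>let</code> block alongside the toolchain:</p>

<pre><code class="language-nix">packages.multi-target = buildMultiTargetWasm {
  pname = &#34;json-parser&#34;;
  src = ./.;
};
</code></pre>

<p>Running <code>nix build .#multi-target</code> then produces a single store path combining the web, Node.js, and bundler builds.</p>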

<h4 id="feature-based-builds">Feature-Based Builds</h4>

<p>Rust&#39;s feature flags allow you to compile different variants of your library with specific functionality enabled or disabled. This is particularly useful for WebAssembly where you might want a lightweight version for simple use cases and a full-featured version for complex applications. This pattern automates building multiple feature combinations.</p>

<pre><code class="language-nix">buildWasmVariants = { pname, src, variants }:
  let
    buildVariant = { name, features ? [], optimizeSize ? true }:
      buildWasmLibrary {
        inherit pname src features optimizeSize;
        version = &#34;${name}-variant&#34;;
      };
  in
  pkgs.linkFarm &#34;${pname}-variants&#34; 
    (map (variant: {
      name = variant.name;
      path = buildVariant variant;
    }) variants);
</code></pre>
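<p>Usage might look like the following. The feature names are hypothetical, and <code>buildWasmLibrary</code> is assumed to be a wasm build helper defined elsewhere in the flake (it is not shown in this article):</p>

<pre><code class="language-nix">packages.variants = buildWasmVariants {
  pname = &#34;json-parser&#34;;
  src = ./.;
  variants = [
    { name = &#34;minimal&#34;; }  # default features, size-optimized
    { name = &#34;full&#34;; features = [ &#34;serde-support&#34; ]; optimizeSize = false; }
  ];
};
</code></pre>

<p><code>linkFarm</code> exposes each variant under its own name within a single build output.</p>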

<h2 id="frontend-integration-architecture">Frontend Integration Architecture</h2>

<p>The frontend integration component addresses the coordination challenges between Rust compilation outputs and TypeScript build processes. This implementation demonstrates automated WebAssembly artifact management:</p>

<p>For a practical MVP, here&#39;s a working frontend integration. First, the project structure:</p>

<p><strong>Frontend Project Structure:</strong></p>

<pre><code>web-app/
├── flake.nix
├── package.json
├── webpack.config.js
├── tsconfig.json
├── src/
│   ├── index.html
│   ├── index.tsx
│   ├── components/
│   │   └── JsonEditor.tsx
│   ├── hooks/
│   │   └── useJsonService.ts
│   └── utils/
│       └── json_parser/
│           └── (WASM files will be placed here)
└── json-parser-flake/  # Local reference to the Rust project
</code></pre>

<p><strong>package.json:</strong></p>

<pre><code class="language-json">{
  &#34;name&#34;: &#34;json-editor-app&#34;,
  &#34;version&#34;: &#34;1.0.0&#34;,
  &#34;scripts&#34;: {
    &#34;dev&#34;: &#34;webpack serve --mode development&#34;,
    &#34;build&#34;: &#34;webpack --mode production&#34;,
    &#34;parser:build&#34;: &#34;nix run .#build-parser&#34;,
    &#34;parser:clean&#34;: &#34;nix run .#clean-parser&#34;
  },
  &#34;devDependencies&#34;: {
    &#34;webpack&#34;: &#34;^5.88.0&#34;,
    &#34;webpack-cli&#34;: &#34;^5.1.0&#34;,
    &#34;webpack-dev-server&#34;: &#34;^4.15.0&#34;,
    &#34;typescript&#34;: &#34;^5.1.0&#34;,
    &#34;ts-loader&#34;: &#34;^9.4.0&#34;,
    &#34;html-webpack-plugin&#34;: &#34;^5.5.0&#34;,
    &#34;@types/react&#34;: &#34;^18.2.0&#34;
  },
  &#34;dependencies&#34;: {
    &#34;preact&#34;: &#34;^10.19.0&#34;
  }
}
</code></pre>

<p><strong>webpack.config.js:</strong></p>

<pre><code class="language-javascript">const path = require(&#39;path&#39;);
const HtmlWebpackPlugin = require(&#39;html-webpack-plugin&#39;);

module.exports = {
  entry: &#39;./src/index.tsx&#39;,
  module: {
    rules: [
      {
        test: /\.tsx?$/,
        use: &#39;ts-loader&#39;,
        exclude: /node_modules/,
      },
    ],
  },
  resolve: {
    extensions: [&#39;.tsx&#39;, &#39;.ts&#39;, &#39;.js&#39;],
    alias: {
      &#39;react&#39;: &#39;preact/compat&#39;,
      &#39;react-dom&#39;: &#39;preact/compat&#39;
    },
  },
  output: {
    filename: &#39;bundle.js&#39;,
    path: path.resolve(__dirname, &#39;dist&#39;),
    clean: true,
  },
  plugins: [
    new HtmlWebpackPlugin({
      template: &#39;./src/index.html&#39;,
    }),
  ],
  devServer: {
    static: &#39;./dist&#39;,
    port: 8080,
  },
  experiments: {
    asyncWebAssembly: true,
  },
};
</code></pre>

<p><strong>flake.nix:</strong></p>

<pre><code class="language-nix">{
  description = &#34;Frontend application with WebAssembly integration&#34;;

  inputs = {
    nixpkgs.url = &#34;github:NixOS/nixpkgs/nixos-unstable&#34;;
    flake-utils.url = &#34;github:numtide/flake-utils&#34;;
  };

  outputs = { self, nixpkgs, flake-utils }:
    flake-utils.lib.eachDefaultSystem (system:
      let
        pkgs = import nixpkgs { inherit system; };

        # Build JSON parser from local flake
        buildJsonParser = pkgs.writeShellApplication {
          name = &#34;build-json-parser&#34;;
          runtimeInputs = [ pkgs.nix ];
          text = &#39;&#39;
            echo &#34;Building JSON parser from local flake...&#34;
            
            # Create the directory where WASM files will live
            mkdir -p src/utils/json_parser
            # Remove old artifacts but keep the hand-written wrapper (index.ts)
            find src/utils/json_parser -type f ! -name &#39;index.ts&#39; -delete
            
            # Build from local json-parser-flake directory
            if [ -d &#34;./json-parser-flake&#34; ]; then
              result=$(nix build ./json-parser-flake --no-link --print-out-paths)
              if [ -d &#34;$result/pkg&#34; ]; then
                cp &#34;$result&#34;/pkg/* src/utils/json_parser/
                echo &#34;✅ JSON parser built successfully&#34;
              else
                echo &#34;❌ No pkg directory found in build result&#34;
                exit 1
              fi
            else
              echo &#34;❌ json-parser-flake directory not found&#34;
              echo &#34;Please create a symlink: ln -s ../json-parser json-parser-flake&#34;
              exit 1
            fi
          &#39;&#39;;
        };

        # Clean up function
        cleanJsonParser = pkgs.writeShellApplication {
          name = &#34;clean-json-parser&#34;;
          text = &#39;&#39;
            if [ -d src/utils/json_parser ]; then
              # Keep the hand-written wrapper (index.ts), drop generated artifacts
              find src/utils/json_parser -type f ! -name &#39;index.ts&#39; -delete
            fi
            echo &#34;✅ Cleaned JSON parser artifacts&#34;
          &#39;&#39;;
        };

      in
      {
        # Apps for easy access
        apps = {
          build-parser = flake-utils.lib.mkApp {
            drv = buildJsonParser;
          };
          clean-parser = flake-utils.lib.mkApp { 
            drv = cleanJsonParser; 
          };
        };

        # Packages
        packages = {
          build-parser = buildJsonParser;
          clean-parser = cleanJsonParser;
        };

        # Development environment
        devShells.default = pkgs.mkShell {
          buildInputs = with pkgs; [ nodejs_20 yarn ];
          shellHook = &#39;&#39;
            echo &#34;Frontend Development Environment&#34;
            echo &#34;Available commands:&#34;
            echo &#34;  yarn parser:build  - Build JSON parser WASM&#34;
            echo &#34;  yarn parser:clean  - Clean WASM artifacts&#34;
            echo &#34;  yarn dev          - Start development server&#34;
            echo &#34;&#34;
            echo &#34;First time setup:&#34;
            echo &#34;  ln -s ../json-parser json-parser-flake&#34;
            echo &#34;  yarn install&#34;
            echo &#34;  yarn parser:build&#34;
          &#39;&#39;;
        };
      });
}
</code></pre>

<p><strong>Setup Instructions:</strong></p>
<ol><li><p>Create both <code>json-parser/</code> and <code>web-app/</code> directories</p></li>

<li><p>In <code>web-app/</code>, create a symlink: <code>ln -s ../json-parser json-parser-flake</code></p></li>

<li><p>Run <code>nix develop</code> in <code>web-app/</code></p></li>

<li><p>Run <code>yarn install</code></p></li>

<li><p>Run <code>yarn parser:build</code> to build the WASM module</p></li>

<li><p>Run <code>yarn dev</code> to start the development server</p></li></ol>

<h2 id="typescript-integration-patterns">TypeScript Integration Patterns</h2>

<p>The WebAssembly integration generates TypeScript definitions through wasm-bindgen, enabling type-safe consumption of Rust functionality. The following example demonstrates typical usage patterns:</p>

<p><strong>src/utils/json_parser/index.ts:</strong></p>

<pre><code class="language-typescript">// Wrapper service for the WebAssembly JSON parser
import init, { 
  parse_json, 
  stringify_json, 
  validate_json,
  ParseOptions 
} from &#39;./json_parser&#39;;

export class JsonService {
  private initialized = false;
  
  async initialize(): Promise&lt;void&gt; {
    if (!this.initialized) {
      await init();
      this.initialized = true;
    }
  }
  
  async parseJson(jsonString: string): Promise&lt;string&gt; {
    await this.initialize();
    return parse_json(jsonString);
  }
  
  async stringifyJson(jsonString: string, options: { indent_size: number, sort_keys: boolean }): Promise&lt;string&gt; {
    await this.initialize();
    const wasmOptions = new ParseOptions(options.indent_size, options.sort_keys);
    return stringify_json(jsonString, wasmOptions);
  }
  
  async validateJson(jsonString: string): Promise&lt;boolean&gt; {
    await this.initialize();
    return validate_json(jsonString);
  }
}
</code></pre>
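
<p>One detail worth flagging: the boolean <code>initialized</code> flag above has a small race. Two concurrent calls can both observe <code>false</code> and invoke <code>init()</code> twice. A common refinement is to cache the in-flight promise instead. The sketch below is illustrative only; <code>fakeInit</code> is a stand-in for the wasm-bindgen loader, not part of the generated bindings:</p>

<pre><code class="language-typescript">// Illustrative sketch: promise-caching variant of the initialize-once
// pattern used by JsonService. `fakeInit` stands in for the
// wasm-bindgen `init()` loader.
let initCalls = 0;
const fakeInit = async (): Promise&lt;void&gt; =&gt; {
  initCalls += 1;
};

class LazyService {
  private initPromise: Promise&lt;void&gt; | null = null;

  // Cache the in-flight promise so concurrent callers
  // never trigger the loader twice.
  private initialize(): Promise&lt;void&gt; {
    if (!this.initPromise) {
      this.initPromise = fakeInit();
    }
    return this.initPromise;
  }

  async echo(value: string): Promise&lt;string&gt; {
    await this.initialize();
    return value;
  }
}
</code></pre>

<p>With the promise cached, <code>Promise.all([svc.echo(a), svc.echo(b)])</code> runs the loader exactly once per service instance.</p>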

<p><strong>src/index.html:</strong></p>

<pre><code class="language-html">&lt;!DOCTYPE html&gt;
&lt;html&gt;
&lt;head&gt;
    &lt;title&gt;JSON Parser Demo&lt;/title&gt;
    &lt;style&gt;
        body { 
            font-family: Arial, sans-serif; 
            margin: 20px; 
            background-color: #f9f9f9;
        }
        .container {
            max-width: 800px;
            margin: 0 auto;
            background: white;
            padding: 20px;
            border-radius: 8px;
            box-shadow: 0 2px 4px rgba(0,0,0,0.1);
        }
        textarea { 
            width: 100%; 
            height: 200px; 
            margin: 10px 0; 
            padding: 10px;
            border: 2px solid #ddd;
            border-radius: 4px;
            font-family: &#39;Courier New&#39;, monospace;
        }
        button { 
            padding: 10px 20px; 
            margin: 5px; 
            border: none;
            border-radius: 4px;
            background: #007bff;
            color: white;
            cursor: pointer;
        }
        button:hover {
            background: #0056b3;
        }
        button:disabled {
            background: #6c757d;
            cursor: not-allowed;
        }
        pre { 
            background: #f5f5f5; 
            padding: 15px; 
            border-radius: 4px; 
            border-left: 4px solid #007bff;
            font-family: &#39;Courier New&#39;, monospace;
            overflow-x: auto;
        }
        .error { 
            border-color: #dc3545 !important; 
            background-color: #fff5f5;
        }
        .valid { 
            border-color: #28a745 !important; 
            background-color: #f5fff5;
        }
        .status {
            padding: 10px;
            margin: 10px 0;
            border-radius: 4px;
        }
        .status.error {
            background: #f8d7da;
            color: #721c24;
            border: 1px solid #f5c6cb;
        }
        .status.valid {
            background: #d4edda;
            color: #155724;
            border: 1px solid #c3e6cb;
        }
    &lt;/style&gt;
&lt;/head&gt;
&lt;body&gt;
    &lt;div id=&#34;app&#34;&gt;&lt;/div&gt;
&lt;/body&gt;
&lt;/html&gt;
</code></pre>

<p><strong>src/hooks/useJsonService.ts:</strong></p>

<pre><code class="language-typescript">import { useState, useEffect } from &#39;preact/hooks&#39;;
import { JsonService } from &#39;../utils/json_parser&#39;;

export const useJsonService = () =&gt; {
  const [jsonService] = useState(() =&gt; new JsonService());
  const [isInitialized, setIsInitialized] = useState(false);

  useEffect(() =&gt; {
    const initializeService = async () =&gt; {
      try {
        await jsonService.initialize();
        setIsInitialized(true);
      } catch (error) {
        console.error(&#39;Failed to initialize JSON service:&#39;, error);
      }
    };

    initializeService();
  }, [jsonService]);

  return { jsonService, isInitialized };
};
</code></pre>

<p><strong>src/components/JsonEditor.tsx:</strong></p>

<pre><code class="language-typescript">import { useState } from &#39;preact/hooks&#39;;
import { useJsonService } from &#39;../hooks/useJsonService&#39;;

export const JsonEditor = () =&gt; {
  const { jsonService, isInitialized } = useJsonService();
  const [jsonText, setJsonText] = useState(&#39;{&#34;name&#34;: &#34;test&#34;, &#34;value&#34;: 123}&#39;);
  const [output, setOutput] = useState(&#39;&#39;);
  const [status, setStatus] = useState&lt;{ message: string; type: &#39;error&#39; | &#39;valid&#39; | &#39;&#39; }&gt;({
    message: &#39;&#39;,
    type: &#39;&#39;
  });
  const [inputClass, setInputClass] = useState(&#39;&#39;);

  const handleFormat = async () =&gt; {
    if (!isInitialized) {
      setStatus({ message: &#39;⏳ WebAssembly module is still loading...&#39;, type: &#39;error&#39; });
      return;
    }

    try {
      const formatted = await jsonService.stringifyJson(jsonText, {
        indent_size: 2,
        sort_keys: true
      });
      
      setOutput(formatted);
      setStatus({ message: &#39;✅ JSON formatted successfully&#39;, type: &#39;valid&#39; });
      setInputClass(&#39;valid&#39;);
    } catch (error) {
      setStatus({ message: `❌ Format error: ${error}`, type: &#39;error&#39; });
      setInputClass(&#39;error&#39;);
      setOutput(&#39;&#39;);
    }
  };

  const handleValidate = async () =&gt; {
    if (!isInitialized) {
      setStatus({ message: &#39;⏳ WebAssembly module is still loading...&#39;, type: &#39;error&#39; });
      return;
    }

    try {
      const isValid = await jsonService.validateJson(jsonText);
      
      if (isValid) {
        setStatus({ message: &#39;✅ JSON is valid&#39;, type: &#39;valid&#39; });
        setInputClass(&#39;valid&#39;);
      } else {
        setStatus({ message: &#39;❌ JSON is invalid&#39;, type: &#39;error&#39; });
        setInputClass(&#39;error&#39;);
      }
    } catch (error) {
      setStatus({ message: `❌ Validation error: ${error}`, type: &#39;error&#39; });
      setInputClass(&#39;error&#39;);
    }
  };

  const handleInputChange = (event: Event) =&gt; {
    const target = event.target as HTMLTextAreaElement;
    setJsonText(target.value);
    setInputClass(&#39;&#39;);
    setStatus({ message: &#39;&#39;, type: &#39;&#39; });
  };

  return (
    &lt;div className=&#34;container&#34;&gt;
      &lt;h1&gt;JSON Parser Demo&lt;/h1&gt;
      &lt;p&gt;This demo uses a Rust library compiled to WebAssembly for JSON processing.&lt;/p&gt;
      
      &lt;div&gt;
        &lt;h3&gt;Input JSON:&lt;/h3&gt;
        &lt;textarea
          value={jsonText}
          onInput={handleInputChange}
          className={inputClass}
          placeholder=&#34;Enter JSON here...&#34;
          disabled={!isInitialized}
        /&gt;
        
        &lt;div&gt;
          &lt;button onClick={handleFormat} disabled={!isInitialized}&gt;
            Format JSON
          &lt;/button&gt;
          &lt;button onClick={handleValidate} disabled={!isInitialized}&gt;
            Validate JSON
          &lt;/button&gt;
        &lt;/div&gt;
        
        {status.message &amp;&amp; (
          &lt;div className={`status ${status.type}`}&gt;
            {status.message}
          &lt;/div&gt;
        )}
        
        &lt;h3&gt;Output:&lt;/h3&gt;
        &lt;pre&gt;{output || &#39;No output yet...&#39;}&lt;/pre&gt;
        
        {!isInitialized &amp;&amp; (
          &lt;div className=&#34;status error&#34;&gt;
            ⏳ Loading WebAssembly module...
          &lt;/div&gt;
        )}
      &lt;/div&gt;
    &lt;/div&gt;
  );
};
</code></pre>

<p><strong>src/index.tsx:</strong></p>

<pre><code class="language-typescript">import { render } from &#39;preact&#39;;
import { JsonEditor } from &#39;./components/JsonEditor&#39;;

const App = () =&gt; {
  return &lt;JsonEditor /&gt;;
};

render(&lt;App /&gt;, document.getElementById(&#39;app&#39;)!);
</code></pre>

<p><strong>tsconfig.json:</strong></p>

<pre><code class="language-json">{
  &#34;compilerOptions&#34;: {
    &#34;target&#34;: &#34;ES2020&#34;,
    &#34;module&#34;: &#34;ES2020&#34;,
    &#34;moduleResolution&#34;: &#34;node&#34;,
    &#34;strict&#34;: true,
    &#34;esModuleInterop&#34;: true,
    &#34;skipLibCheck&#34;: true,
    &#34;forceConsistentCasingInFileNames&#34;: true,
    &#34;jsx&#34;: &#34;react-jsx&#34;,
    &#34;jsxImportSource&#34;: &#34;preact&#34;,
    &#34;outDir&#34;: &#34;./dist&#34;,
    &#34;rootDir&#34;: &#34;./src&#34;
  },
  &#34;include&#34;: [&#34;src/**/*&#34;],
  &#34;exclude&#34;: [&#34;node_modules&#34;, &#34;dist&#34;]
}
</code></pre>

<p>The generated TypeScript definitions (created automatically by wasm-bindgen) will be in <code>src/utils/json_parser/json_parser.d.ts</code> and provide complete type information for the WebAssembly functions.</p>
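
<p>As a rough sketch (hand-written here, not actual generated output; exact signatures vary with your Rust types and wasm-bindgen version), the declarations for the functions used above might look like:</p>

<pre><code class="language-typescript">// Hypothetical sketch of src/utils/json_parser/json_parser.d.ts;
// wasm-bindgen generates the real file from the Rust exports.
export function parse_json(json_string: string): string;
export function stringify_json(json_string: string, options: ParseOptions): string;
export function validate_json(json_string: string): boolean;

export class ParseOptions {
  constructor(indent_size: number, sort_keys: boolean);
  free(): void;
}

// Default export: async initializer that fetches and instantiates the module
export default function init(input?: RequestInfo | URL | BufferSource): Promise&lt;unknown&gt;;
</code></pre>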

<p>This integration pattern combines Rust&#39;s performance characteristics with TypeScript&#39;s type safety, eliminating manual interface management between the two language ecosystems.</p>

<h2 id="multi-repository-coordination">Multi-Repository Coordination</h2>

<p>Complex projects often involve multiple repositories that must coordinate shared dependencies. The following example demonstrates how Nix flakes can manage multiple TypeScript applications that consume the same WebAssembly library while maintaining branch-specific version alignment.</p>

<p>A coordination flake can manage the entire project ecosystem:</p>

<pre><code class="language-nix">{
  description = &#34;All my TypeScript/Rust projects working together&#34;;

  inputs = {
    nixpkgs.url = &#34;github:NixOS/nixpkgs/nixos-unstable&#34;;
    
    # The Rust JSON parser
    json-parser.url = &#34;github:my-org/json-parser&#34;;
    
    # TypeScript applications that use it
    web-app.url = &#34;github:my-org/web-app&#34;;
    dashboard.url = &#34;github:my-org/dashboard&#34;;
    
    # Various integration widgets
    widget-a.url = &#34;github:my-org/widget-a&#34;;
    widget-b.url = &#34;github:my-org/widget-b&#34;;
  };

  outputs = inputs@{ self, nixpkgs, ... }:
    let
      system = &#34;x86_64-linux&#34;;
      pkgs = import nixpkgs { inherit system; };
      
      # Collect all the packages I can build
      allPackages = pkgs.lib.foldl&#39; (acc: input: 
        acc // (input.packages.${system} or {})
      ) {} (builtins.attrValues inputs);
      
    in
    {
      # One development environment to rule them all
      devShells.${system}.default = pkgs.mkShell {
        buildInputs = with pkgs; [
          # Basic tools
          git
          
          # Language toolchains - exact versions
          nodejs_20
          yarn
          rustc
          wasm-pack
          
          # Development tools
          typescript
          eslint
          prettier
        ] ++ (builtins.attrValues allPackages);
        
        shellHook = &#39;&#39;
          echo &#34;🚀 Full-stack development environment&#34;
          echo &#34;Rust: json-parser&#34;
          echo &#34;TypeScript: web-app, dashboard, widget-a, widget-b&#34;
          echo &#34;&#34;
          echo &#34;Quick commands:&#34;
          echo &#34;  nix run .#build-all-parsers  - Build all WASM modules&#34;
          echo &#34;  nix run .#dev-all           - Start all dev servers&#34;
        &#39;&#39;;
      };
      
      # Convenience scripts
      packages.${system} = {
        # Build all the WebAssembly modules at once
        build-all-parsers = pkgs.writeShellScriptBin &#34;build-all-parsers&#34; &#39;&#39;
          echo &#34;🦀 Building all Rust WebAssembly modules...&#34;
          ${inputs.json-parser.packages.${system}.default}/bin/build-parser
          echo &#34;✅ All WASM modules built&#34;
        &#39;&#39;;
        
        # Start all development servers
        dev-all = pkgs.writeShellScriptBin &#34;dev-all&#34; &#39;&#39;
          echo &#34;🌐 Starting all development servers...&#34;
          
          # Start each app in the background
          cd ${inputs.web-app} &amp;&amp; yarn dev &amp;
          WEB_PID=$!
          
          cd ${inputs.dashboard} &amp;&amp; yarn dev &amp;
          DASHBOARD_PID=$!
          
          cd ${inputs.widget-a} &amp;&amp; yarn dev &amp;
          WIDGET_A_PID=$!
          
          # Clean up when we exit
          trap &#34;kill $WEB_PID $DASHBOARD_PID $WIDGET_A_PID&#34; EXIT
          wait
        &#39;&#39;;
      };
    };
}
</code></pre>

<p>This coordination approach provides a unified development environment that encompasses the entire project ecosystem, eliminating version management concerns and ensuring consistent toolchain availability across all components.</p>

<h2 id="branch-based-dependency-management">Branch-Based Dependency Management</h2>

<p>Multi-repository projects require coordination between corresponding development branches. For example, a web application&#39;s development branch should consume the development version of its WebAssembly dependencies, while production branches should use stable releases.</p>

<p>This coordination can be achieved through package.json script integration:</p>

<pre><code class="language-json">{
  &#34;scripts&#34;: {
    &#34;parser:build&#34;: &#34;nix run .#build-parser&#34;,
    &#34;parser:clean&#34;: &#34;nix run .#clean-parser&#34;,
    &#34;dev&#34;: &#34;yarn parser:build &amp;&amp; webpack serve --mode development&#34;,
    &#34;build&#34;: &#34;yarn parser:build &amp;&amp; webpack --mode production&#34;,
    &#34;install-deps&#34;: &#34;yarn install&#34;
  }
}
</code></pre>

<p>Environment-specific development shells can automate branch coordination:</p>

<pre><code class="language-nix"># Environment-specific development shells
devShells.${system} = {
  # Default development environment
  default = pkgs.mkShell {
    buildInputs = with pkgs; [ nodejs_20 yarn ];
    shellHook = &#39;&#39;
      echo &#34;Development environment&#34;
      echo &#34;Run &#39;yarn dev&#39; to start with current branch dependencies&#34;
    &#39;&#39;;
  };
  
  # Staging environment with beta branch dependencies
  staging = pkgs.mkShell {
    buildInputs = with pkgs; [ nodejs_20 yarn ];
    shellHook = &#39;&#39;
      echo &#34;Staging environment (dependencies: beta branch)&#34;
      nix run .#build-parser-beta
    &#39;&#39;;
  };
  
  # Production environment with stable dependencies
  production = pkgs.mkShell {
    buildInputs = with pkgs; [ nodejs_20 yarn ];
    shellHook = &#39;&#39;
      echo &#34;Production environment (dependencies: prod branch)&#34;
      nix run .#build-parser-prod
    &#39;&#39;;
  };
};
</code></pre>

<p>This approach enables developers to enter environment-specific shells using <code>nix develop .#staging</code>, automatically configuring the appropriate dependency versions for the target environment.</p>
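
<p>The <code>build-parser-beta</code> and <code>build-parser-prod</code> apps referenced in those shell hooks would be backed by branch-pinned flake inputs. A sketch of how those inputs might be declared (branch names and the <code>my-org/json-parser</code> URL are placeholders carried over from earlier examples):</p>

<pre><code class="language-nix">{
  inputs = {
    # Same repository, pinned to different branches
    json-parser.url = &#34;github:my-org/json-parser&#34;;           # default branch
    json-parser-beta.url = &#34;github:my-org/json-parser/beta&#34;; # staging
    json-parser-prod.url = &#34;github:my-org/json-parser/prod&#34;; # production
  };

  # Inside outputs, each build-parser-* app copies artifacts from its
  # own input, e.g. inputs.json-parser-beta.packages.${system}.default
}
</code></pre>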

<h2 id="advanced-patterns-and-best-practices">Advanced Patterns and Best Practices</h2>

<h3 id="caching-and-performance-optimization">Caching and Performance Optimization</h3>

<h4 id="binary-caches">Binary Caches</h4>

<p>Binary caches are one of Nix&#39;s most powerful features for development teams. Instead of rebuilding everything from source, Nix can download pre-built packages from cache servers. This dramatically reduces build times, especially for large Rust projects with many dependencies. Setting up your own cache (like Cachix) means your team shares build artifacts, so if one developer builds a package, everyone else gets it instantly.</p>

<pre><code class="language-nix">{
  nixConfig = {
    extra-substituters = [
      &#34;https://cache.nixos.org&#34;
      &#34;https://your-org.cachix.org&#34;
    ];
    extra-trusted-public-keys = [
      &#34;cache.nixos.org-1:6NCHdD59X431o0gWypbMrAURkbJ16ZPMQFGspcDShjY=&#34;
      &#34;your-org.cachix.org-1:YOUR_PUBLIC_KEY_HERE&#34;
    ];
  };
}
</code></pre>

<h4 id="incremental-builds">Incremental Builds</h4>

<p>Large Rust projects can take a long time to compile, especially when dependencies change. This pattern separates dependency compilation from your source code compilation. Dependencies are built once and cached, then your source code builds much faster since it reuses the pre-built dependencies. This is particularly valuable in CI/CD pipelines where you want fast feedback loops.</p>

<pre><code class="language-nix">buildRustPackageIncremental = { pname, src, ... }@args:
  let
    # Separate dependency building from source building
    deps = pkgs.rustPlatform.buildRustPackage (args // {
      pname = &#34;${pname}-deps&#34;;
      src = pkgs.runCommand &#34;deps-src&#34; {} &#39;&#39;
        mkdir -p $out
        cp ${src}/Cargo.{toml,lock} $out/
        mkdir $out/src
        echo &#34;fn main() {}&#34; &gt; $out/src/main.rs
      &#39;&#39;;
      doCheck = false;
    });
  in
  pkgs.rustPlatform.buildRustPackage (args // {
    preBuild = &#39;&#39;
      cp -r ${deps}/target .
      chmod -R +w target
    &#39;&#39;;
  });
</code></pre>

<h3 id="testing-across-languages">Testing Across Languages</h3>

<h4 id="integration-testing">Integration Testing</h4>

<p>When you have Rust code compiled to WebAssembly being consumed by TypeScript applications, you need tests that verify the entire pipeline works correctly. This isn&#39;t just about testing your Rust code or your TypeScript code in isolation – you need to test that the WebAssembly bindings work correctly, that data serialization/deserialization works as expected, and that the integration points behave properly under various conditions.</p>

<pre><code class="language-nix">integrationTests = pkgs.runCommand &#34;integration-tests&#34; {
  buildInputs = with pkgs; [
    nodejs_20
    yarn
    rustc
    wasm-pack
    # Your compiled packages
    wasmLib
    frontendApp
  ];
} &#39;&#39;
  # Set up test environment
  export WASM_LIB_PATH=${wasmLib}/pkg
  export NODE_PATH=${frontendApp}/node_modules
  
  # Run integration test suite
  cd ${./tests}
  yarn install
  yarn test:integration
  
  touch $out
&#39;&#39;;
</code></pre>

<h4 id="cross-language-testing">Cross-Language Testing</h4>

<p>This pattern ensures that your WebAssembly modules actually work when called from JavaScript/TypeScript. It&#39;s common for Rust code to work perfectly in isolation but fail when compiled to WebAssembly due to differences in memory management, string handling, or async behavior. These tests catch those issues early in the development process.</p>

<pre><code class="language-nix">wasmIntegrationTests = pkgs.runCommand &#34;wasm-integration-tests&#34; {
  buildInputs = with pkgs; [ nodejs_20 rustLib wasmLib ];
} &#39;&#39;
  # Test Rust library directly
  cd ${rustLib.src}
  cargo test --release
  
  # Test WebAssembly bindings
  cd ${./tests/wasm}
  cat &gt; test-wasm.js &lt;&lt; &#39;EOF&#39;
  const { parse_sql, format_sql } = require(&#39;${wasmLib}/pkg/core.js&#39;);
  
  // Test basic functionality
  const sql = &#34;SELECT * FROM users WHERE id = 1&#34;;
  const parsed = parse_sql(sql);
  const formatted = format_sql(sql, { indent_size: 2 });
  
  console.log(&#39;Parsed:&#39;, parsed);
  console.log(&#39;Formatted:&#39;, formatted);
  
  if (!parsed || !formatted) {
    process.exit(1);
  }
  EOF
  
  ${pkgs.nodejs_20}/bin/node test-wasm.js
  
  touch $out
&#39;&#39;;
</code></pre>

<h3 id="security-and-reproducibility">Security and Reproducibility</h3>

<h4 id="pinning-dependencies">Pinning Dependencies</h4>

<p>One of the biggest challenges in software development is ensuring that your project builds the same way today as it will in six months. Dependencies get updated, APIs change, and suddenly your build breaks. Nix flakes solve this by pinning exact versions of all dependencies, including the Nix packages themselves. This ensures that your build is completely reproducible across time and different machines.</p>

<pre><code class="language-nix">{
  inputs = {
    nixpkgs.url = &#34;github:NixOS/nixpkgs/nixos-23.11&#34;;
    rust-overlay = {
      url = &#34;github:oxalica/rust-overlay&#34;;
      inputs.nixpkgs.follows = &#34;nixpkgs&#34;;
    };
  };
}
</code></pre>
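
<p>The pin itself lives in <code>flake.lock</code>, which records the exact revision of every input and should be committed alongside the flake. A few commands for managing it (flag names assume a recent Nix release):</p>

<pre><code class="language-bash"># Create or refresh flake.lock without building anything
nix flake lock

# Update every input to its latest revision
nix flake update

# Update a single input, leaving the rest pinned
# (older releases: nix flake lock --update-input nixpkgs)
nix flake update nixpkgs

# Show what is currently pinned
nix flake metadata
</code></pre>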

<h4 id="security-scanning">Security Scanning</h4>

<p>Security vulnerabilities in dependencies are a major concern for production applications. This pattern integrates security scanning directly into your build process, automatically checking both Rust crates and npm packages for known vulnerabilities. By making security scanning part of your Nix flake, you ensure it runs consistently across all environments and can&#39;t be forgotten or skipped.</p>

<pre><code class="language-nix">securityScan = pkgs.runCommand &#34;security-scan&#34; {
  buildInputs = with pkgs; [ cargo-audit yarn nodejs_20 ];
} &#39;&#39;
  mkdir -p $out
  
  # Rust security audit
  cd ${./rust-core}
  cargo audit --json &gt; $out/rust-audit.json
  
  # TypeScript/JavaScript security audit  
  cd ${./frontend}
  yarn audit --json &gt; $out/yarn-audit.json || true
  
  # Check for known vulnerabilities in dependencies
  yarn audit --level moderate --json &gt; $out/vulnerability-report.json || true
&#39;&#39;;
</code></pre>

<h2 id="implementation-example">Implementation Example</h2>

<p>This section presents a practical implementation based on patterns observed in production environments. The example demonstrates a JSON parsing library ecosystem that illustrates common multi-language coordination challenges and their solutions.</p>

<h3 id="project-architecture">Project Architecture</h3>

<p>Consider a JSON processing system consisting of:</p>
<ul><li>A high-performance Rust library compiled to WebAssembly</li>
<li>A primary web application for JSON editing and validation</li>
<li>An analytics dashboard for JSON data processing</li>
<li>Various integration widgets for platform embedding</li></ul>

<p>The following implementation demonstrates how Nix flakes address the coordination requirements:</p>

<pre><code class="language-nix">{
  description = &#34;My JSON tools ecosystem&#34;;

  inputs = {
    nixpkgs.url = &#34;github:NixOS/nixpkgs/nixos-unstable&#34;;
    rust-overlay.url = &#34;github:oxalica/rust-overlay&#34;;
    
    # My Rust JSON parser
    json-parser.url = &#34;github:my-org/json-parser&#34;;
  };

  outputs = { self, nixpkgs, rust-overlay, json-parser }:
    let
      system = &#34;x86_64-linux&#34;;
      pkgs = import nixpkgs {
        inherit system;
        overlays = [ rust-overlay.overlays.default ];
      };

      # The WebAssembly version of my JSON parser
      jsonParserWasm = json-parser.packages.${system}.default;

      # Main web application
      jsonEditor = pkgs.buildNpmPackage {
        pname = &#34;json-editor&#34;;
        version = &#34;1.0.0&#34;;
        src = ./editor;
        npmDepsHash = &#34;sha256-...&#34;;
        
        # This is where the magic happens - automatic WASM integration
        preBuild = &#39;&#39;
          mkdir -p src/utils/json_parser
          cp -r ${jsonParserWasm}/pkg/* src/utils/json_parser/
        &#39;&#39;;
        
        buildPhase = &#39;&#39;
          yarn build
        &#39;&#39;;
      };

      # Analytics dashboard
      dashboard = pkgs.buildNpmPackage {
        pname = &#34;json-dashboard&#34;;
        version = &#34;1.0.0&#34;;
        src = ./dashboard;
        npmDepsHash = &#34;sha256-...&#34;;
        
        preBuild = &#39;&#39;
          mkdir -p src/utils/json_parser
          cp -r ${jsonParserWasm}/pkg/* src/utils/json_parser/
        &#39;&#39;;
      };

      # Embeddable widget
      widget = pkgs.buildNpmPackage {
        pname = &#34;json-widget&#34;;
        version = &#34;1.0.0&#34;;
        src = ./widget;
        npmDepsHash = &#34;sha256-...&#34;;
        
        preBuild = &#39;&#39;
          mkdir -p src/utils/json_parser
          cp -r ${jsonParserWasm}/pkg/* src/utils/json_parser/
        &#39;&#39;;
      };

    in
    {
      packages.${system} = {
        inherit jsonEditor dashboard widget;
        
        # Everything bundled together for deployment
        all-apps = pkgs.runCommand &#34;json-tools-deployment&#34; {} &#39;&#39;
          mkdir -p $out/{editor,dashboard,widget}
          cp -r ${jsonEditor}/* $out/editor/
          cp -r ${dashboard}/* $out/dashboard/
          cp -r ${widget}/* $out/widget/
        &#39;&#39;;
      };

      # My daily development environment
      devShells.${system}.default = pkgs.mkShell {
        buildInputs = with pkgs; [
          nodejs_20
          yarn
          typescript
          eslint
          prettier
        ];
        
        shellHook = &#39;&#39;
          echo &#34;📝 JSON Tools Development Environment&#34;
          echo &#34;Available commands:&#34;
          echo &#34;  yarn editor:dev     - JSON editor dev server&#34;
          echo &#34;  yarn dashboard:dev  - Dashboard dev server&#34;  
          echo &#34;  yarn widget:dev     - Widget dev server&#34;
          echo &#34;  nix run json-parser#build-parser - Rebuild WASM&#34;
        &#39;&#39;;
      };
    };
}
</code></pre>

<p>This architecture enables automatic propagation of changes from the Rust library to all consuming TypeScript applications, eliminating manual artifact management and reducing the risk of version inconsistencies.</p>

<h3 id="development-workflow-analysis">Development Workflow Analysis</h3>

<p>The Nix flakes approach transforms the typical development workflow through several key improvements:</p>

<p><strong>Environment Initialization</strong>: Developers begin work by executing <code>nix develop</code>, which provides a shell environment with precisely specified tool versions, eliminating version-related configuration issues.</p>

<p><strong>Core Library Development</strong>: Changes to the Rust library follow standard development practices, with testing via <code>cargo test</code> and version control through git branches.</p>

<p><strong>Frontend Integration</strong>: Frontend applications integrate updated WebAssembly modules through commands like <code>yarn parser:build</code>, which automatically fetch, compile, and position artifacts according to the build configuration.</p>

<p><strong>Multi-Application Testing</strong>: Development server orchestration enables simultaneous testing across multiple applications:</p>

<pre><code class="language-bash"># Concurrent development server execution
nix run .#dev-all
</code></pre>
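
<p>The <code>dev-all</code> runner used above is not a built-in Nix command; it has to be defined as a flake app. A minimal sketch (the yarn script names match the shell hook shown earlier; everything else is illustrative):</p>

<pre><code class="language-nix"># Sketch: expose a combined dev runner as `nix run .#dev-all`
apps.${system}.dev-all = {
  type = &#34;app&#34;;
  program = toString (pkgs.writeShellScript &#34;dev-all&#34; &#39;&#39;
    # Start all three dev servers; kill the whole group on exit
    yarn editor:dev &amp;
    yarn dashboard:dev &amp;
    yarn widget:dev &amp;
    trap &#34;kill 0&#34; EXIT
    wait
  &#39;&#39;);
};
</code></pre>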

<p><strong>Environment-Specific Deployment</strong>: Deployment processes utilize environment-specific configurations:</p>

<pre><code class="language-bash"># Staging deployment
nix develop .#staging
yarn build:staging

# Production deployment
nix develop .#production
yarn build:prod
</code></pre>
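
<p>The <code>staging</code> and <code>production</code> shells referenced above are simply extra <code>devShells</code> outputs. A minimal sketch (the environment variable name and URLs are illustrative):</p>

<pre><code class="language-nix"># Sketch: per-environment shells selected with `nix develop .#&lt;name&gt;`
devShells.${system} = {
  staging = pkgs.mkShell {
    buildInputs = [ pkgs.nodejs_20 pkgs.yarn ];
    API_BASE_URL = &#34;https://staging.example.com&#34;;  # illustrative
  };
  production = pkgs.mkShell {
    buildInputs = [ pkgs.nodejs_20 pkgs.yarn ];
    API_BASE_URL = &#34;https://api.example.com&#34;;  # illustrative
  };
};
</code></pre>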

<p>This workflow ensures consistent behavior across development machines, continuous integration systems, and deployment environments, effectively eliminating environment-specific build failures.</p>

<h2 id="performance-considerations-and-optimization">Performance Considerations and Optimization</h2>

<h3 id="build-performance">Build Performance</h3>

<h4 id="parallel-builds">Parallel Builds</h4>

<p>Modern development machines have multiple CPU cores, but many build systems don&#39;t take full advantage of them. This configuration ensures that both Nix builds and Rust compilation use all available CPU cores, significantly reducing build times. This is especially important for large Rust projects where compilation can be the bottleneck in your development workflow.</p>

<pre><code class="language-nix">{
  # Enable parallel building
  nixConfig = {
    max-jobs = &#34;auto&#34;;
    cores = 0;  # Use all available cores
  };
}

# In package definitions
buildRustPackage {
  # ... other attributes
  
  # Parallel cargo builds
  cargoBuildFlags = [ &#34;--jobs&#34; &#34;$NIX_BUILD_CORES&#34; ];
  
  # Parallel tests
  cargoTestFlags = [ &#34;--jobs&#34; &#34;$NIX_BUILD_CORES&#34; ];
}
</code></pre>

<h4 id="build-caching-strategies">Build Caching Strategies</h4>

<p>This advanced pattern addresses one of the most frustrating aspects of Rust development: waiting for dependencies to recompile when they haven&#39;t actually changed. By separating dependency builds from source builds, you can cache the expensive dependency compilation step and only rebuild your actual source code when it changes.</p>

<pre><code class="language-nix"># Separate dependency builds from source builds
rustDependencies = pkgs.rustPlatform.buildRustPackage {
  pname = &#34;${pname}-deps&#34;;
  inherit version src cargoLock;
  
  # Build only dependencies
  buildPhase = &#39;&#39;
    cargo build --release --frozen
  &#39;&#39;;
  
  # Keep the compiled target directory so the main build can reuse it
  installPhase = &#39;&#39;
    mkdir -p $out
    cp -r target $out/
  &#39;&#39;;
  
  # Don&#39;t include source files in hash
  outputHashMode = &#34;recursive&#34;;
  outputHash = &#34;sha256-...&#34;;
};

# Main package reuses dependency build
mainPackage = pkgs.rustPlatform.buildRustPackage {
  inherit pname version src cargoLock;
  
  preBuild = &#39;&#39;
    cp -r ${rustDependencies}/target .
    chmod -R +w target
  &#39;&#39;;
};
</code></pre>

<h3 id="runtime-performance">Runtime Performance</h3>

<h4 id="webassembly-optimization">WebAssembly Optimization</h4>

<p>WebAssembly modules can be optimized for different goals: smaller file size for faster downloads, or faster execution speed. The Binaryen toolkit provides powerful optimization tools that can significantly improve your WebAssembly performance. This pattern lets you build different optimized versions for different deployment scenarios.</p>

<pre><code class="language-nix">optimizedWasm = pkgs.runCommand &#34;optimized-wasm&#34; {
  buildInputs = [ pkgs.binaryen ];
} &#39;&#39;
  # runCommand does not create $out for us
  mkdir -p $out
  
  # Size optimization
  wasm-opt -Oz ${inputWasm} -o $out/optimized.wasm
  
  # Speed optimization  
  wasm-opt -O3 ${inputWasm} -o $out/fast.wasm
  
  # Debug build
  cp ${inputWasm} $out/debug.wasm
&#39;&#39;;
</code></pre>

<h4 id="memory-management">Memory Management</h4>

<p>WebAssembly has different memory constraints than native applications. This configuration helps you optimize memory usage during both compilation and runtime, which is crucial for WebAssembly modules that will run in memory-constrained environments like browsers or edge computing platforms.</p>

<pre><code class="language-nix"># Configure memory limits for builds
buildRustPackage {
  # ... other attributes
  
  # Cap the WebAssembly linear memory the linker will allow (2 GB)
  NIX_CFLAGS_COMPILE = &#34;-Wl,--max-memory=2147483648&#34;;
  
  # Optimize for memory usage
  RUSTFLAGS = &#34;-C opt-level=s -C panic=abort&#34;;
}
</code></pre>

<h2 id="debugging-and-development-workflow">Debugging and Development Workflow</h2>

<h3 id="development-tools-integration">Development Tools Integration</h3>

<h4 id="ide-support">IDE Support</h4>

<p>A good development environment includes all the tools you need for productive coding: language servers for intelligent code completion, debuggers for troubleshooting, and profiling tools for performance optimization. This configuration ensures that every developer on your team has access to the same high-quality development tools, regardless of their operating system or local setup.</p>

<pre><code class="language-nix">devShells.default = pkgs.mkShell {
  buildInputs = with pkgs; [
    # Language servers
    rust-analyzer
    typescript-language-server
    eslint_d
    
    # Debugging tools
    gdb
    lldb
    nodejs_20  # for Chrome DevTools and Node.js debugging
    
    # Development tools
    wasm-pack
    wasm-bindgen-cli
    
    # Profiling tools
    valgrind
    perf-tools
  ];
  
  shellHook = &#39;&#39;
    # Configure rust-analyzer
    export RUST_SRC_PATH=&#34;${pkgs.rust.packages.stable.rustPlatform.rustLibSrc}&#34;
    
    # Set up debugging symbols for Rust
    export RUSTFLAGS=&#34;-C debuginfo=2&#34;
    
    # Configure TypeScript development
    export NODE_OPTIONS=&#34;--openssl-legacy-provider&#34;
    
    echo &#34;🔧 Development environment ready&#34;
    echo &#34;Rust tools: rust-analyzer, cargo, wasm-pack&#34;
    echo &#34;TypeScript tools: typescript-language-server, eslint&#34;
    echo &#34;Debugging: gdb, lldb, Chrome DevTools&#34;
  &#39;&#39;;
};
</code></pre>

<h4 id="hot-reloading">Hot Reloading</h4>

<p>During development, you want to see changes immediately without manually rebuilding everything. This setup watches your Rust files for changes and automatically recompiles the WebAssembly module, while simultaneously running a frontend development server that hot-reloads when the WebAssembly module updates. This creates a smooth development experience where changes in Rust code are immediately visible in your web application.</p>

<pre><code class="language-nix"># Development server with hot reloading
devServer = pkgs.writeShellScriptBin &#34;dev-server&#34; &#39;&#39;
  # Start Rust file watcher
  ${pkgs.cargo-watch}/bin/cargo-watch -x &#39;build --target wasm32-unknown-unknown&#39; &amp;
  RUST_PID=$!
  
  # Start frontend dev server
  cd frontend
  ${pkgs.nodejs}/bin/npm run dev &amp;
  FRONTEND_PID=$!
  
  # Cleanup on exit
  trap &#34;kill $RUST_PID $FRONTEND_PID&#34; EXIT
  wait
&#39;&#39;;
</code></pre>

<h3 id="testing-strategies">Testing Strategies</h3>

<h4 id="cross-language-testing-1">Cross-Language Testing</h4>

<p>This comprehensive testing approach ensures that your entire multi-language stack works correctly. It tests your Rust code natively, verifies that the WebAssembly compilation works properly, checks that the TypeScript integration layer functions correctly, and runs end-to-end tests that simulate real user interactions. This catches issues that might only appear when all the pieces work together.</p>

<pre><code class="language-nix">crossLanguageTests = pkgs.runCommand &#34;cross-language-tests&#34; {
  buildInputs = with pkgs; [
    rustPackage
    nodePackage
    typescriptPackage
    yarn
  ];
} &#39;&#39;
  # Test Rust library (copy out of the read-only Nix store so cargo can write)
  cp -r ${rustSrc} rust-src
  chmod -R +w rust-src
  cd rust-src
  cargo test --release
  
  # Test WebAssembly bindings
  cd ${wasmSrc}
  ${pkgs.nodejs_20}/bin/node test-wasm.js
  
  # Test TypeScript integration
  cd ${typescriptSrc}
  yarn test
  yarn test:integration
  
  # Run end-to-end tests
  yarn test:e2e
  
  touch $out
&#39;&#39;;
</code></pre>

<h2 id="deployment-and-production-considerations">Deployment and Production Considerations</h2>

<h3 id="container-images">Container Images</h3>

<h4 id="multi-stage-builds">Multi-Stage Builds</h4>

<p>Production container images should be as small and secure as possible. This pattern creates minimal container images that include only the runtime dependencies needed to run your application, without any of the build tools or development dependencies. This reduces attack surface, improves startup times, and reduces bandwidth costs for deployment.</p>

<pre><code class="language-nix">productionImage = pkgs.dockerTools.buildLayeredImage {
  name = &#34;production-app&#34;;
  tag = &#34;latest&#34;;
  
  contents = [
    # Runtime dependencies only
    pkgs.cacert
    pkgs.tzdata
  ];
  
  config = {
    Cmd = [ &#34;${finalPackage}/bin/app&#34; ];
    Env = [
      &#34;SSL_CERT_FILE=${pkgs.cacert}/etc/ssl/certs/ca-bundle.crt&#34;
    ];
    ExposedPorts = {
      &#34;8080/tcp&#34; = {};
    };
  };
};
</code></pre>

<h4 id="security-hardening">Security Hardening</h4>

<p>Security is crucial for production deployments. This configuration creates container images that follow security best practices: running as a non-root user, using minimal base images that reduce attack surface, and including proper metadata for security scanning and compliance. These practices help protect your application from common container-based attacks.</p>

<pre><code class="language-nix">hardenedImage = pkgs.dockerTools.buildImage {
  name = &#34;hardened-app&#34;;
  
  # Minimal, distroless-style contents: nixpkgs has no &#34;distroless&#34;
  # attribute, so include only the runtime closure the binary needs
  contents = [ pkgs.cacert ];
  
  config = {
    User = &#34;65534:65534&#34;;  # nobody user
    Cmd = [ &#34;${securePackage}/bin/app&#34; ];
    
    # Security labels
    Labels = {
      &#34;org.opencontainers.image.source&#34; = &#34;https://github.com/org/repo&#34;;
      &#34;org.opencontainers.image.licenses&#34; = &#34;MIT&#34;;
    };
  };
};
</code></pre>

<h3 id="continuous-integration">Continuous Integration</h3>

<h4 id="github-actions-integration">GitHub Actions Integration</h4>

<p>Continuous integration ensures that your code works correctly across different environments and catches problems before they reach production. This GitHub Actions configuration leverages Nix&#39;s reproducibility to create CI builds that are identical to your local development environment, while using caching to keep build times fast.</p>

<pre><code class="language-yaml"># .github/workflows/build.yml
name: Build and Test

on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: cachix/install-nix-action@v20
      - uses: cachix/cachix-action@v12
        with:
          name: your-cache
          authToken: &#39;${{ secrets.CACHIX_AUTH_TOKEN }}&#39;
      
      - name: Build all packages
        run: nix build .#all
      
      - name: Run tests
        run: nix flake check
      
      - name: Build Docker images
        run: nix build .#dockerImages
</code></pre>

<h2 id="future-directions-and-advanced-topics">Future Directions and Advanced Topics</h2>

<h3 id="emerging-patterns">Emerging Patterns</h3>

<h4 id="webassembly-system-interface-wasi">WebAssembly System Interface (WASI)</h4>

<p>WASI represents the future of WebAssembly beyond the browser. It provides a standardized system interface that allows WebAssembly modules to interact with the operating system in a secure, sandboxed way. This enables WebAssembly applications to run on servers, edge computing platforms, and other environments where you need system access but want the security and portability benefits of WebAssembly.</p>

<pre><code class="language-nix">wasiPackage = pkgs.rustPlatform.buildRustPackage {
  # ... standard attributes
  
  # WASI target
  CARGO_BUILD_TARGET = &#34;wasm32-wasi&#34;;
  
  # WASI runtime for testing
  nativeCheckInputs = [ pkgs.wasmtime ];
  
  checkPhase = &#39;&#39;
    wasmtime target/wasm32-wasi/release/${pname}.wasm
  &#39;&#39;;
};
</code></pre>

<h4 id="edge-computing-deployment">Edge Computing Deployment</h4>

<p>Edge computing brings computation closer to users for better performance and reduced latency. WebAssembly is particularly well-suited for edge deployment because of its small size, fast startup times, and security sandbox. This pattern packages your WebAssembly modules and static assets for deployment to edge computing platforms like Cloudflare Workers, Fastly Compute@Edge, or AWS Lambda@Edge.</p>

<pre><code class="language-nix">edgeDeployment = pkgs.runCommand &#34;edge-deployment&#34; {} &#39;&#39;
  mkdir -p $out/{functions,assets}
  
  # WebAssembly functions for edge runtime
  cp ${wasmPackage}/pkg/*.wasm $out/functions/
  
  # Static assets
  cp -r ${frontendPackage}/* $out/assets/
  
  # Edge configuration
  cat &gt; $out/edge-config.json &lt;&lt; EOF
  {
    &#34;functions&#34;: {
      &#34;api/*&#34;: &#34;functions/api.wasm&#34;,
      &#34;compute/*&#34;: &#34;functions/compute.wasm&#34;
    },
    &#34;assets&#34;: &#34;assets/*&#34;
  }
  EOF
&#39;&#39;;
</code></pre>

<h3 id="integration-with-other-ecosystems">Integration with Other Ecosystems</h3>

<h4 id="kubernetes-operators">Kubernetes Operators</h4>

<p>Kubernetes operators extend Kubernetes with custom application-specific logic. By embedding WebAssembly modules directly into your operators, you can create powerful, efficient controllers that leverage the performance and security benefits of WebAssembly while maintaining the operational benefits of Kubernetes. This is particularly useful for complex data processing or policy enforcement scenarios.</p>

<pre><code class="language-nix">k8sOperator = pkgs.buildGoModule {
  pname = &#34;app-operator&#34;;
  version = &#34;1.0.0&#34;;
  src = ./operator;
  # Required by buildGoModule; null assumes no external Go module deps
  vendorHash = null;
  
  # Include WebAssembly modules as embedded resources
  preBuild = &#39;&#39;
    mkdir -p assets
    cp ${wasmPackage}/pkg/*.wasm assets/
  &#39;&#39;;
  
  ldflags = [
    &#34;-X main.version=1.0.0&#34;
    &#34;-X main.wasmAssets=assets&#34;
  ];
};
</code></pre>

<h4 id="serverless-functions">Serverless Functions</h4>

<p>Serverless platforms are increasingly supporting WebAssembly as a more efficient alternative to traditional JavaScript functions. WebAssembly functions start faster, use less memory, and can provide better performance for compute-intensive tasks. This pattern shows how to package your Rust-based WebAssembly modules for deployment to serverless platforms like AWS Lambda, Vercel Functions, or Netlify Functions.</p>

<pre><code class="language-nix">serverlessFunction = pkgs.runCommand &#34;serverless-function&#34; {} &#39;&#39;
  mkdir -p $out
  
  # Package WebAssembly for serverless runtime
  cat &gt; $out/function.js &lt;&lt; EOF
  const wasm = require(&#39;./module.wasm&#39;);
  
  exports.handler = async (event) =&gt; {
    const result = await wasm.process(event.data);
    return { statusCode: 200, body: result };
  };
  EOF
  
  cp ${wasmPackage}/pkg/module.wasm $out/
&#39;&#39;;
</code></pre>

<h2 id="analysis-and-conclusions">Analysis and Conclusions</h2>

<p>The implementation patterns presented demonstrate how Nix flakes address fundamental challenges in multi-language development environments. This analysis is based on observations from production systems where these approaches have been successfully deployed.</p>

<h3 id="technical-benefits">Technical Benefits</h3>

<p>The Nix flakes approach provides several measurable improvements over traditional multi-language development setups:</p>

<p><strong>Reproducible Builds</strong>: Declarative dependency specification ensures identical build environments across development, testing, and production systems, eliminating environment-specific failures.</p>

<p><strong>Automated Coordination</strong>: Programmatic integration between Rust compilation and TypeScript build processes reduces manual intervention and associated error rates.</p>

<p><strong>Version Consistency</strong>: Centralized toolchain management prevents version conflicts and ensures all team members use identical development environments.</p>

<p><strong>Branch Synchronization</strong>: Git-based dependency resolution automatically coordinates library versions with consuming applications across different development branches.</p>
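
<p>In practice this coordination is expressed through flake inputs. A sketch of a consuming application pinning the library to a branch (the repository URL is hypothetical):</p>

<pre><code class="language-nix">{
  inputs = {
    nixpkgs.url = &#34;github:NixOS/nixpkgs/nixos-unstable&#34;;
    # Follows the library&#39;s branch; flake.lock pins the exact commit
    json-parser.url = &#34;github:example-org/json-parser/feature-branch&#34;;
  };
}
</code></pre>

<p>Running <code>nix flake lock --update-input json-parser</code> re-pins that single input to the branch&#39;s current head without touching anything else.</p>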

<h3 id="implementation-considerations">Implementation Considerations</h3>

<p><strong>Learning Curve</strong>: Nix flakes require initial investment in understanding declarative configuration concepts and Nix expression language syntax.</p>

<p><strong>Incremental Adoption</strong>: Organizations can implement Nix flakes gradually, beginning with critical compilation pipelines and expanding to encompass broader development workflows.</p>

<p><strong>Toolchain Integration</strong>: The approach integrates effectively with existing development tools while providing superior coordination capabilities.</p>

<p><strong>Scalability</strong>: The pattern scales effectively from single-developer projects to large multi-repository systems with complex dependency relationships.</p>

<h3 id="operational-impact">Operational Impact</h3>

<p>Teams implementing these patterns typically observe:</p>
<ul><li>Reduced onboarding time for new developers</li>
<li>Decreased environment-related build failures</li>
<li>Improved consistency across development and deployment environments</li>
<li>Enhanced collaboration efficiency through standardized toolchains</li></ul>

<h3 id="applicability-assessment">Applicability Assessment</h3>

<p>Nix flakes provide particular value for projects that:</p>
<ul><li>Combine multiple programming languages with complex build requirements</li>
<li>Require WebAssembly compilation and integration workflows</li>
<li>Involve multiple repositories with interdependent components</li>
<li>Need consistent environments across diverse development and deployment contexts</li></ul>

<p>The approach represents a significant improvement over traditional multi-language development coordination methods, particularly for teams working with Rust-to-TypeScript integration patterns. While the initial learning investment is substantial, the long-term benefits in terms of reliability, reproducibility, and development efficiency justify adoption for suitable project contexts.</p>

<h2 id="building-your-first-multi-language-project-a-practical-guide">Building Your First Multi-Language Project: A Practical Guide</h2>

<p>Let&#39;s walk through creating a real working example that demonstrates these concepts in action. We&#39;ll build a JSON processing application where the heavy lifting is done by a Rust library compiled to WebAssembly, consumed by a TypeScript web application.</p>

<h3 id="what-you-ll-need">What You&#39;ll Need</h3>

<p>Before we start, make sure you have:</p>
<ul><li>Nix package manager installed with flakes enabled (<a href="https://nixos.org/download.html" rel="nofollow">installation guide</a>)</li>
<li>Basic familiarity with Rust and TypeScript (don&#39;t worry, we&#39;ll explain the tricky parts)</li>
<li>About 30 minutes to work through this example</li></ul>

<h3 id="step-1-building-the-rust-core">Step 1: Building the Rust Core</h3>

<p>Let&#39;s start by creating the performance-critical part of our application – a Rust library that can parse and manipulate JSON data.</p>

<pre><code class="language-bash"># Create a workspace for our project
mkdir json-tools-project
cd json-tools-project

# Create the Rust library
mkdir json-parser
cd json-parser
mkdir src
</code></pre>

<p>Now we&#39;ll create the core files. The beauty of this approach is that once you set up the Nix flake, anyone on your team can build this project identically, regardless of what Rust version they have installed (or even if they have Rust installed at all).</p>

<p>Create the files as detailed in the Rust WebAssembly section above. The key insight here is that our <code>flake.nix</code> file is doing something powerful – it&#39;s not just managing dependencies, it&#39;s creating a completely reproducible build environment.</p>
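
<p>If you don&#39;t have that section at hand, a minimal <code>flake.nix</code> for the library could look roughly like this (a sketch: the package name, wasm-pack flags, and single-system assumption are simplifications of the full version shown earlier):</p>

<pre><code class="language-nix">{
  inputs.nixpkgs.url = &#34;github:NixOS/nixpkgs/nixos-unstable&#34;;

  outputs = { self, nixpkgs }:
    let
      system = &#34;x86_64-linux&#34;;  # adjust for your platform
      pkgs = nixpkgs.legacyPackages.${system};
    in {
      packages.${system}.default = pkgs.rustPlatform.buildRustPackage {
        pname = &#34;json-parser&#34;;
        version = &#34;0.1.0&#34;;
        src = ./.;
        cargoLock.lockFile = ./Cargo.lock;
        nativeBuildInputs = [ pkgs.wasm-pack pkgs.wasm-bindgen-cli ];
        # Produce the wasm package instead of a native binary
        buildPhase = &#34;wasm-pack build --target web --out-dir pkg --mode no-install&#34;;
        installPhase = &#34;mkdir -p $out &amp;&amp; cp -r pkg $out/&#34;;
      };

      devShells.${system}.default = pkgs.mkShell {
        buildInputs = with pkgs; [ cargo rustc wasm-pack ];
      };
    };
}
</code></pre>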

<p>Let&#39;s test our Rust library:</p>

<pre><code class="language-bash"># This command creates a development shell with all the right tools
nix develop

# Generate the lock file (this pins our Rust dependencies)
cargo generate-lockfile

# Exit the development shell and build the WebAssembly module
exit
nix build

# Check that we got our WebAssembly files
ls result/pkg/  # You should see .wasm, .js, and .d.ts files
</code></pre>

<p>What just happened? Nix downloaded the exact version of Rust we specified, compiled our library to WebAssembly, generated TypeScript bindings, and packaged everything up. If you share this code with a teammate, they&#39;ll get exactly the same result.</p>

<h3 id="step-2-creating-the-web-application">Step 2: Creating the Web Application</h3>

<p>Now let&#39;s build a web application that uses our Rust library. The magic here is that our web app will automatically get the latest version of our WebAssembly module whenever we rebuild it.</p>

<pre><code class="language-bash"># Go back to our project root
cd ..
mkdir web-app
cd web-app

# Create the directory structure
mkdir -p src/{components,hooks,utils/json_parser}
</code></pre>

<p>Create all the frontend files as shown in the Frontend Integration section. The key file here is our <code>flake.nix</code> – it contains a script that automatically builds our Rust library and copies the WebAssembly files to the right place in our web app.</p>
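
<p>At its core, that script can be a thin <code>writeShellScriptBin</code> wrapper in the web app&#39;s flake; a sketch assuming the directory layout from this guide:</p>

<pre><code class="language-nix"># Sketch: build the Rust flake and copy the wasm-pack output into the app
parser-build = pkgs.writeShellScriptBin &#34;parser-build&#34; &#39;&#39;
  nix build ./json-parser-flake
  rm -rf src/utils/json_parser/pkg
  cp -r result/pkg src/utils/json_parser/pkg
  chmod -R +w src/utils/json_parser/pkg
&#39;&#39;;
</code></pre>

<p>A <code>parser:build</code> entry in <code>package.json</code> can then simply invoke <code>parser-build</code>.</p>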

<h3 id="step-3-connecting-everything-together">Step 3: Connecting Everything Together</h3>

<p>This is where the magic happens. Instead of manually copying files around, we&#39;ll create a symbolic link that tells our web app where to find our Rust library:</p>

<pre><code class="language-bash"># Create a link to our Rust project
ln -s ../json-parser json-parser-flake
</code></pre>

<p>This might seem simple, but it&#39;s solving a major problem in multi-language development: keeping different parts of your project in sync.</p>

<h3 id="step-4-running-your-application">Step 4: Running Your Application</h3>

<p>Now for the moment of truth – let&#39;s see our multi-language application in action:</p>

<pre><code class="language-bash"># Enter the development environment (this gives us Node.js, Yarn, etc.)
nix develop

# Install our JavaScript dependencies
yarn install

# Build the WebAssembly module and copy it to our web app
yarn parser:build

# Start the development server
yarn dev
</code></pre>

<p>Open your browser to <code>http://localhost:8080</code>. You should see a JSON editor that&#39;s powered by Rust running in WebAssembly!</p>

<h3 id="step-5-testing-the-integration">Step 5: Testing the Integration</h3>

<p>Let&#39;s verify that everything is working correctly:</p>
<ol><li><p><strong>Try valid JSON</strong>: Paste <code>{&#34;name&#34;: &#34;test&#34;, &#34;value&#34;: 123}</code> and click “Validate JSON” – you should see a green success message</p></li>

<li><p><strong>Try invalid JSON</strong>: Enter <code>{name: &#34;test&#34;}</code> (missing quotes) and click “Validate JSON” – you should see a red error message</p></li>

<li><p><strong>Test formatting</strong>: Enter some messy JSON and click “Format JSON” – it should be nicely formatted</p></li>

<li><p><strong>Check the console</strong>: Open your browser&#39;s developer tools – you shouldn&#39;t see any errors</p></li></ol>

<h3 id="what-you-ve-accomplished">What You&#39;ve Accomplished</h3>

<p>Congratulations! You&#39;ve just built a multi-language application with some pretty sophisticated features:</p>
<ul><li><strong>Reproducible builds</strong>: Anyone can clone your project and get exactly the same development environment</li>
<li><strong>Automatic integration</strong>: Changes to your Rust code automatically flow through to your web application</li>
<li><strong>Type safety</strong>: Your TypeScript code has full type information about your Rust functions</li>
<li><strong>Performance</strong>: Critical JSON processing happens at near-native speed thanks to WebAssembly</li></ul>

<h3 id="next-steps">Next Steps</h3>

<p>This foundation can be extended in many directions:</p>
<ul><li>Add more complex Rust functionality (maybe a SQL parser or image processing)</li>
<li>Create multiple web applications that share the same Rust core</li>
<li>Set up automatic testing that verifies the Rust-TypeScript integration</li>
<li>Deploy to production with the same reproducible build process</li></ul>

<p>The patterns you&#39;ve learned here scale from simple projects like this one all the way up to complex multi-repository systems with dozens of interconnected components.</p>

<h2 id="beyond-webassembly-rust-ffi-and-native-libraries">Beyond WebAssembly: Rust FFI and Native Libraries</h2>

<p>While this article focuses on WebAssembly, it&#39;s worth noting that Rust&#39;s capabilities extend far beyond the browser. Rust can compile to shared libraries (<code>.so</code> files on Linux, <code>.dylib</code> on macOS, <code>.dll</code> on Windows) that can be used from virtually any programming language through Foreign Function Interface (FFI).</p>

<p>This means you can write performance-critical code in Rust and use it from Python, Node.js, Ruby, Go, or any other language that supports FFI. Nix makes this even more powerful by ensuring that your shared libraries are built consistently and can be easily distributed to different target platforms.</p>

<p>For example, you might have a Rust library that handles cryptographic operations, compiled as both a WebAssembly module for browser use and as native shared libraries for server-side applications. Nix flakes can manage both build targets simultaneously, ensuring consistency across your entire technology stack.</p>
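
<p>A sketch of what that dual-target setup can look like in a single flake (crate and attribute names are illustrative; the wasm build assumes a toolchain with the <code>wasm32-unknown-unknown</code> target installed):</p>

<pre><code class="language-nix">packages.${system} = {
  # Native shared library (.so/.dylib/.dll) for FFI consumers;
  # Cargo.toml declares crate-type = [&#34;cdylib&#34;]
  native-lib = pkgs.rustPlatform.buildRustPackage {
    pname = &#34;crypto-core&#34;;
    version = &#34;0.1.0&#34;;
    src = ./.;
    cargoLock.lockFile = ./Cargo.lock;
  };

  # WebAssembly build of the same crate for the browser
  wasm-lib = pkgs.rustPlatform.buildRustPackage {
    pname = &#34;crypto-core-wasm&#34;;
    version = &#34;0.1.0&#34;;
    src = ./.;
    cargoLock.lockFile = ./Cargo.lock;
    buildPhase = &#34;cargo build --release --target wasm32-unknown-unknown&#34;;
    installPhase = &#34;mkdir -p $out/lib &amp;&amp; cp target/wasm32-unknown-unknown/release/*.wasm $out/lib/&#34;;
  };
};
</code></pre>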

<h2 id="useful-resources-and-further-reading">Useful Resources and Further Reading</h2>

<h3 id="getting-started-with-nix">Getting Started with Nix</h3>
<ul><li><strong><a href="https://zero-to-nix.com/" rel="nofollow">Zero to Nix</a></strong> – An excellent beginner-friendly guide to learning Nix from scratch. This is the best place to start if you&#39;re new to Nix and want to understand the fundamentals before diving into flakes.</li>
<li><strong><a href="https://lix.systems/" rel="nofollow">Lix</a></strong> – A community-driven alternative implementation of Nix that aims to improve upon the original with better performance, clearer error messages, and enhanced developer experience.</li>
<li><strong><a href="https://nixos.org/manual/nix/stable/command-ref/new-cli/nix3-flake.html" rel="nofollow">Nix Flakes Documentation</a></strong> – Official documentation for Nix flakes, covering all the technical details.</li></ul>

<h3 id="rust-and-webassembly">Rust and WebAssembly</h3>
<ul><li><strong><a href="https://rustwasm.github.io/docs/book/" rel="nofollow">The Rust and WebAssembly Book</a></strong> – Comprehensive guide to using Rust with WebAssembly, including best practices and optimization techniques.</li>
<li><strong><a href="https://doc.rust-lang.org/nomicon/ffi.html" rel="nofollow">Rust FFI Guide</a></strong> – Learn how to create shared libraries from Rust code that can be used by other programming languages.</li></ul>

<h3 id="multi-language-development">Multi-Language Development</h3>
<ul><li><strong><a href="https://nixos.org/guides/nix-pills/" rel="nofollow">Nix Pills</a></strong> – Deep dive into Nix concepts and philosophy, helpful for understanding the underlying principles.</li>
<li><strong><a href="https://devenv.sh/" rel="nofollow">devenv</a></strong> – A higher-level tool built on Nix that simplifies creating development environments for multi-language projects.</li></ul>

<h3 id="advanced-topics">Advanced Topics</h3>
<ul><li><strong><a href="https://nixos.org/manual/nixpkgs/stable/#chap-cross" rel="nofollow">Nix Cross Compilation</a></strong> – Learn how to use Nix to build software for different target platforms.</li>
<li><strong><a href="https://nixos.org/manual/nixos/stable/index.html#sec-writing-modules" rel="nofollow">NixOS Modules</a></strong> – For when you want to extend beyond development environments into full system configuration.</li>
<li><strong><a href="https://flake.parts/" rel="nofollow">Flake Parts</a></strong> – A framework for organizing complex Nix flakes in a modular way.</li></ul>

<p>These resources will help you deepen your understanding of the concepts presented in this article and explore more advanced use cases for multi-language development with Nix.</p>
]]></content:encoded>
      <guid>https://sgolovin.live/managing-multi-language-development-environments-with-nix-flakes</guid>
      <pubDate>Tue, 29 Jul 2025 11:49:53 +0000</pubDate>
    </item>
    <item>
      <title>Rust inside Common Lisp</title>
      <link>https://sgolovin.live/rust-inside-common-lisp?pk_campaign=rss-feed</link>
      <description>&lt;![CDATA[I started a pet project that combines Rust (for performance and its robust library ecosystem) with Common Lisp (for its interactivity and Condition System), aiming to get the best of both worlds. This setup allows me to build core functionality in Rust and still benefit from the interactive development features and flexibility of Lisp.&#xA;&#xA;Below, you’ll find a brief explanation of how to expose Rust functions and pointers to Common Lisp, and how to set up callbacks from Lisp back to Rust.&#xA;&#xA;Rust Side&#xA;&#xA;To share a Rust function with Common Lisp, we mark it with #[no_mangle] and specify the C ABI by using extern &#34;C&#34;. That way, we can safely call it from Lisp. Here’s an example of exposing a Rust function that moves an editor’s cursor up:&#xA;&#xA;#[no_mangle]&#xA;pub extern &#34;C&#34; fn move_cursor_up(editor_ptr: *mut Editor) {&#xA;    let editor: &amp;mut Editor = unsafe { &amp;mut *editor_ptr };&#xA;    editor.cursor.move_up(&amp;editor.text);&#xA;}&#xA;&#xA;In this function, the pointer editor_ptr is passed in from Lisp. We convert it into a mutable reference (using unsafe { &amp;mut *editor_ptr }) and then invoke move_up on the editor’s cursor. The unsafe block is required because we’re manually dereferencing a raw pointer.&#xA;&#xA;Below is another example that shows how we can run the editor and call back into Lisp when needed. Notice that we have optional callback parameters in the function signature:&#xA;&#xA;fn run_editor(&#xA;    path: PathBuf,&#xA;    on_initiate: Option&lt;extern &#34;C&#34; fn(*mut Editor)&gt;,&#xA;    on_key_pressed: Option&lt;extern &#34;C&#34; fn(*const c_char, *const c_char)&gt;,&#xA;) -&gt; i32 {&#xA;    // ... some code&#xA;&#xA;    let mut editor = Editor::new(text, Some(path), on_key_pressed);&#xA;&#xA;    if let Some(callback) = on_initiate {&#xA;        callback(&amp;mut editor);&#xA;    }&#xA;&#xA;    let result = editor.run();&#xA;&#xA;    // ... some code&#xA;&#xA;    match result {&#xA;        Ok(_) =&gt; 0,&#xA;        Err(err) =&gt; {&#xA;            eprintln!(&#34;Editor error: {err}&#34;);&#xA;            1&#xA;        }&#xA;    }&#xA;}&#xA;&#xA;Here, we receive an optional callback function pointer (on_initiate) which, if present, will be called with a pointer to the Editor. This mechanism can be useful for setting up or initializing data from Common Lisp before the editor runs. We can also call other callbacks (like on_key_pressed) from within Rust whenever a key is pressed, passing any necessary data back to Lisp.&#xA;&#xA;Common Lisp Side&#xA;&#xA;For the Lisp side, we can use CFFI to interact with Rust. CFFI allows us to define foreign functions, callbacks, and work directly with pointers. Here’s a minimal example:&#xA;&#xA;;; some code&#xA;&#xA;(defcfun &#34;c_run_editor&#34; :int &#xA;         (path :string)&#xA;         (on-initiate :pointer)&#xA;         (on-key-pressed :pointer))&#xA;         &#xA;(defcfun &#34;move_cursor_up&#34; :void&#xA;         (editor :pointer))&#xA;         &#xA;(defcallback on-initiate :void ((editor :pointer))&#xA;  (setq *editor* editor))&#xA;&#xA;defcfun &#34;c_run_editor&#34; defines a foreign function named c_run_editor that we can call from Lisp. It takes a string (the path) and two pointers (the callbacks).&#xA;&#xA;defcfun &#34;move_cursor_up&#34; defines another foreign function that corresponds to the Rust function we exposed earlier.&#xA;&#xA;defcallback on-initiate creates a callback function in Lisp that matches the signature expected by Rust. Here, we simply capture the editor pointer in a global Lisp variable called *editor*. 
This pointer can then be used elsewhere in the Lisp code to interact with the same editor object that Rust is operating on.&#xA;&#xA;By combining these pieces, we can call into Rust for performance-critical tasks or to leverage Rust libraries, while still using Lisp’s REPL-driven workflow and powerful Condition System for interactive development and rapid iteration. This hybrid approach gives us the best of both ecosystems.]]&gt;</description>
      <content:encoded><![CDATA[<p>I started a pet project that combines Rust (for performance and its robust library ecosystem) with Common Lisp (for its interactivity and Condition System), aiming to get the best of both worlds. This setup allows me to build core functionality in Rust and still benefit from the interactive development features and flexibility of Lisp.</p>

<p>Below, you’ll find a brief explanation of how to expose Rust functions and pointers to Common Lisp, and how to set up callbacks from Lisp back to Rust.</p>



<h2 id="rust-side">Rust Side</h2>

<p>To share a Rust function with Common Lisp, we mark it with <code>#[no_mangle]</code> and specify the C ABI by using <code>extern &#34;C&#34;</code>. That way, we can safely call it from Lisp. Here’s an example of exposing a Rust function that moves an editor’s cursor up:</p>

<pre><code class="language-rust">#[no_mangle]
pub extern &#34;C&#34; fn move_cursor_up(editor_ptr: *mut Editor) {
    let editor: &amp;mut Editor = unsafe { &amp;mut *editor_ptr };
    editor.cursor.move_up(&amp;editor.text);
}
</code></pre>

<p>In this function, the pointer <code>editor_ptr</code> is passed in from Lisp. We convert it into a mutable reference (using <code>unsafe { &amp;mut *editor_ptr }</code>) and then invoke <code>move_up</code> on the editor’s cursor. The <code>unsafe</code> block is required because we’re manually dereferencing a raw pointer.</p>
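
<p>One build-setup note: for the Lisp side to find symbols like <code>move_cursor_up</code>, the crate has to be compiled as a C-compatible shared library. A minimal <code>Cargo.toml</code> sketch (the crate name here is hypothetical, not the project’s real one):</p>

<pre><code class="language-toml">[package]
name = &#34;editor-core&#34;  # hypothetical crate name
version = &#34;0.1.0&#34;
edition = &#34;2021&#34;

[lib]
# Produce a shared library (.so/.dylib/.dll) that Common Lisp can load.
crate-type = [&#34;cdylib&#34;]
</code></pre>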

<p>Below is another example that shows how we can run the editor and call back into Lisp when needed. Notice that we have optional callback parameters in the function signature:</p>

<pre><code class="language-rust">fn run_editor(
    path: PathBuf,
    on_initiate: Option&lt;extern &#34;C&#34; fn(*mut Editor)&gt;,
    on_key_pressed: Option&lt;extern &#34;C&#34; fn(*const c_char, *const c_char)&gt;,
) -&gt; i32 {
    // ... some code

    let mut editor = Editor::new(text, Some(path), on_key_pressed);

    if let Some(callback) = on_initiate {
        callback(&amp;mut editor);
    }

    let result = editor.run();
    
    // ... some code

    match result {
        Ok(_) =&gt; 0,
        Err(err) =&gt; {
            eprintln!(&#34;Editor error: {err}&#34;);
            1
        }
    }
}
</code></pre>

<p>Here, we receive an optional callback function pointer (<code>on_initiate</code>) which, if present, will be called with a pointer to the <code>Editor</code>. This mechanism can be useful for setting up or initializing data from Common Lisp before the editor runs. We can also call other callbacks (like <code>on_key_pressed</code>) from within Rust whenever a key is pressed, passing any necessary data back to Lisp.</p>
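
The optional-callback pattern above can be exercised in a self-contained sketch (the `Editor` here is a simplified stand-in, not the project's real type):

```rust
// Simplified stand-in for the post's Editor, just to exercise the
// optional extern "C" callback pattern.
struct Editor {
    line: usize,
}

// The C ABI shape the Lisp side would target: void (*)(Editor*).
type OnInitiate = extern "C" fn(*mut Editor);

extern "C" fn jump_to_first_line(editor_ptr: *mut Editor) {
    // In the real project this callback lives in Lisp; here we just
    // mutate the editor through the raw pointer it receives.
    let editor = unsafe { &mut *editor_ptr };
    editor.line = 1;
}

fn run_editor(on_initiate: Option<OnInitiate>) -> usize {
    let mut editor = Editor { line: 0 };
    if let Some(callback) = on_initiate {
        // `&mut editor` coerces to `*mut Editor` at the call site.
        callback(&mut editor);
    }
    editor.line
}

fn main() {
    assert_eq!(run_editor(None), 0);
    assert_eq!(run_editor(Some(jump_to_first_line)), 1);
}
```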

<h2 id="common-lisp-side">Common Lisp Side</h2>

<p>For the Lisp side, we can use <a href="https://common-lisp.net/project/cffi/" rel="nofollow">CFFI</a> to interact with Rust. CFFI allows us to define foreign functions, callbacks, and work directly with pointers. Here’s a minimal example:</p>

<pre><code class="language-lisp">;; some code

(defcfun &#34;c_run_editor&#34; :int 
         (path :string)
         (on-initiate :pointer)
         (on-key-pressed :pointer))
         
(defcfun &#34;move_cursor_up&#34; :void
         (editor :pointer))
         
(defcallback on-initiate :void ((editor :pointer))
  (setq *editor* editor))
</code></pre>
<ol><li><p><strong><code>defcfun &#34;c_run_editor&#34;</code></strong> defines a foreign function named <code>c_run_editor</code> that we can call from Lisp. It takes a string (the path) and two pointers (the callbacks).</p></li>

<li><p><strong><code>defcfun &#34;move_cursor_up&#34;</code></strong> defines another foreign function that corresponds to the Rust function we exposed earlier.</p></li>

<li><p><strong><code>defcallback on-initiate</code></strong> creates a callback function in Lisp that matches the signature expected by Rust. Here, we simply capture the editor pointer in a global Lisp variable called <code>*editor*</code>. This pointer can then be used elsewhere in the Lisp code to interact with the same editor object that Rust is operating on.</p></li></ol>
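
<p>Putting it together, the compiled library can be loaded and driven from the REPL. A hedged sketch (the library path and file path are assumptions, not from the project; note that CFFI’s default name translation turns <code>c_run_editor</code> into the Lisp function <code>c-run-editor</code>):</p>

<pre><code class="language-lisp">;; Load the Rust cdylib (path is illustrative).
(cffi:load-foreign-library &#34;target/release/libeditor_core.so&#34;)

(defvar *editor* (cffi:null-pointer))

;; `callback` yields a foreign pointer to the Lisp-side callback
;; defined with `defcallback`, suitable for handing to Rust.
(c-run-editor &#34;/tmp/scratch.txt&#34;
              (cffi:callback on-initiate)
              (cffi:null-pointer))
</code></pre>

<p>Once <code>on-initiate</code> has stored the pointer, calls like <code>(move-cursor-up *editor*)</code> can be issued interactively, assuming the editor loop runs on a thread that leaves the REPL responsive.</p>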

<p>By combining these pieces, we can call into Rust for performance-critical tasks or to leverage Rust libraries, while still using Lisp’s REPL-driven workflow and powerful Condition System for interactive development and rapid iteration. This hybrid approach gives us the best of both ecosystems.</p>
]]></content:encoded>
      <guid>https://sgolovin.live/rust-inside-common-lisp</guid>
      <pubDate>Fri, 21 Mar 2025 16:27:30 +0000</pubDate>
    </item>
    <item>
      <title>Exploring Jujutsu: A Modern Spin on Git</title>
      <link>https://sgolovin.live/exploring-jujutsu-a-modern-spin-on-git?pk_campaign=rss-feed</link>
      <description>&lt;![CDATA[I still remember the day I first switched from SVN (Subversion) to Git. It felt like the future had suddenly arrived on my doorstep, offering the power to commit, branch, and merge in ways that SVN couldn’t easily match. Over time, Git became my bread and butter. Now, in 2025 and beyond, I once again find myself on the verge of a version-control transformation—this time by experimenting with Jujutsu.&#xA;&#xA;Why try Jujutsu?&#xA;&#xA;There’s a certain comfort in sticking with Git. After all, it’s ubiquitous, battle-tested, and has an ocean of resources supporting it. But sometimes, small frustrations or inefficiencies start piling up, making you wonder if there&#39;s a better way. Jujutsu (abbreviated as jj) stands out because:&#xA;&#xA;It’s backward compatible with Git. You can continue working in the same repositories as colleagues who haven’t adopted jj. When you need to create a pull request, you can simply push a Git branch without impacting your teammates’ workflows.&#xA;&#xA;Stacked branches/PRs are simpler. If you have multiple features building on top of one another, jj presents them in a neatly stacked layout, making it clearer how changes progress.&#xA;&#xA;You can combine commits or branches freely. With Jujutsu, you don’t have to wait for a branch to get merged into main before building on it. You can reference multiple changesets in a new feature branch, which is incredibly liberating for rapid development.&#xA;&#xA;It has better conflict markers. Merge conflicts become easier to parse, thanks to a cleaner layout.&#xA;&#xA;Of course, nothing’s perfect. Jujutsu has a few downsides:&#xA;&#xA;It doesn’t have Git hooks. If you rely heavily on hooks to enforce policies (e.g., commit-msg or pre-receive hooks), you’ll have to find alternatives or revert to Git for those tasks.&#xA;&#xA;It has its own philosophy. 
Even though it’s compatible with Git, jj still approaches version control differently, which can be an uphill battle if your entire team isn’t on board.&#xA;&#xA;What’s the Philosophy of Jujutsu?&#xA;&#xA;Jujutsu aims to simplify Git workflows. There are no traditional “branches” here, and there’s no separate “commit” step after making changes. Instead, you work with a sequence of changesets. When you need a new set of changes, you create it with jj new and begin editing files. Once you’re ready, you can add a description (using jj desc) and finalize the changeset. For convenience, jj commit combines these actions—though it’s not the same as git commit because it just appends a description and seals your changes into a new changeset.&#xA;&#xA;For example, here’s a sample working history:&#xA;&#xA;❯ jj&#xA;@  klqwozss sergey.golovin@protonmail.com 2025-02-02 09:53:05 bc1c8fa9&#xA;│  fix: clear terminal before going back to the editor&#xA;◆  zurmvopq sergey.golovin@protonmail.com 2025-02-01 20:46:22 main git_head() e2286213&#xA;│  chore: update colors of status bar&#xA;~&#xA;&#xA;To create a new changeset, just type:&#xA;&#xA;❯ jj new&#xA;Working copy now at: nmkklpov 77f465f2 (empty) (no description set)&#xA;Parent commit      : klqwozss bc1c8fa9 fix: clear terminal before going back to the editor&#xA;&#xA;You can now add or modify files, and those changes are saved automatically. When you’re ready, you can describe the changeset with jj desc or do it all at once with jj commit.&#xA;&#xA;Do not wait! Just add this to your working branch&#xA;&#xA;One of the coolest things about Jujutsu is that you don&#39;t need to wait for a feature to be merged into a main branch to build on top of it. 
Consider this history:&#xA;&#xA;❯ jj&#xA;@  ltyrpuno sergey.golovin@protonmail.com 2025-02-02 10:07:35 1be73fee&#xA;│  (no description set)&#xA;│ ○  lkqryomk sergey.golovin@protonmail.com 2025-02-02 10:06:54 3e036390&#xA;│ │  (no description set)&#xA;│ ○  pzvmvtpw sergey.golovin@protonmail.com 2025-02-02 10:04:46 ffe78f2a&#xA;├─╯  (no description set)&#xA;○  nmkklpov sergey.golovin@protonmail.com 2025-02-02 10:04:31 git_head() 2a5dfd41&#xA;│  (no description set)&#xA;│ ○  uyyszzsz sergey.golovin@protonmail.com 2025-02-02 10:06:16 d9ba2396&#xA;│ │  (no description set)&#xA;│ ○  knmruutr sergey.golovin@protonmail.com 2025-02-02 10:05:12 b7498368&#xA;├─╯  (no description set)&#xA;○  klqwozss sergey.golovin@protonmail.com 2025-02-02 09:53:05 bc1c8fa9&#xA;│  fix: clear terminal before going back to the editor&#xA;&#xA;If you want to create a new changeset that includes ltyrpuno, lkqryomk, and uyyszzsz, you could do:&#xA;&#xA;❯ jj new u lk lt&#xA;Working copy now at: ptsmnxmz 362396b6 (conflict) (empty) (no description set)&#xA;Parent commit      : uyyszzsz d9ba2396 (no description set)&#xA;Parent commit      : lkqryomk 3e036390 (no description set)&#xA;Parent commit      : ltyrpuno 1be73fee (no description set)&#xA;Added 2 files, modified 1 files, removed 0 files&#xA;There are unresolved conflicts at these paths:&#xA;c.txt    2-sided conflict&#xA;d.txt    2-sided conflict&#xA;&#xA;Conflicts can arise, but once you resolve them in Jujutsu, it remembers those resolutions. 
Future merges with the same changes won’t pester you again with the same conflicts.&#xA;&#xA;After resolving conflicts, you might see something like this:&#xA;&#xA;@  ptlrysws sergey.golovin@protonmail.com 2025-02-02 10:15:14 1e9b3710&#xA;│  (no description set)&#xA;○      ptsmnxmz sergey.golovin@protonmail.com 2025-02-02 10:14:51 git_head() 0078f3ff&#xA;├─┬─╮  (no description set)&#xA;│ │ ○  ltyrpuno sergey.golovin@protonmail.com 2025-02-02 10:07:35 1be73fee&#xA;│ │ │  (no description set)&#xA;│ ○ │  lkqryomk sergey.golovin@protonmail.com 2025-02-02 10:06:54 3e036390&#xA;│ │ │  (no description set)&#xA;│ ○ │  pzvmvtpw sergey.golovin@protonmail.com 2025-02-02 10:04:46 ffe78f2a&#xA;│ ├─╯  (no description set)&#xA;│ ○  nmkklpov sergey.golovin@protonmail.com 2025-02-02 10:04:31 2a5dfd41&#xA;│ │  (no description set)&#xA;○ │  uyyszzsz sergey.golovin@protonmail.com 2025-02-02 10:06:16 d9ba2396&#xA;│ │  (no description set)&#xA;○ │  knmruutr sergey.golovin@protonmail.com 2025-02-02 10:05:12 b7498368&#xA;├─╯  (no description set)&#xA;○  klqwozss sergey.golovin@protonmail.com 2025-02-02 09:53:05 bc1c8fa9&#xA;│  fix: clear terminal before going back to the editor&#xA;◆  zurmvopq sergey.golovin@protonmail.com 2025-02-01 20:46:22 main e2286213&#xA;│  chore: update colors of status bar&#xA;~&#xA;&#xA;You can also edit previous changesets. For instance, if you need to modify nmkklpov, type jj edit n, make changes, and then return to your last commit. Any updates in the earlier changeset flow naturally into subsequent changes. This approach works just as well for long chains of stacked changesets. By running jj git fetch, then jj git push, you can synchronize all updated branches with the remote.&#xA;&#xA;Though Jujutsu technically doesn’t use “branches,” it does support bookmarks, which are analogous to Git branches. 
You can manage these bookmarks via commands like jj branch create new-branch or jj b c new-branch, and then push them to your remote as you would with any regular Git branch.&#xA;&#xA;Conclusion&#xA;&#xA;Jujutsu offers a refreshing take on version control, blending Git compatibility with a simpler, more flexible workflow. Its stacked changesets make it easier to experiment, resolve conflicts once, and move forward without getting bogged down. While there are trade-offs—especially if your team depends on certain Git-exclusive features—Jujutsu’s approach to branching, committing, and merging can drastically streamline everyday tasks. If you’re feeling the limitations of Git, exploring Jujutsu might just open new doors in your development workflow.]]&gt;</description>
      <content:encoded><![CDATA[<p>I still remember the day I first switched from SVN (Subversion) to Git. It felt like the future had suddenly arrived on my doorstep, offering the power to commit, branch, and merge in ways that SVN couldn’t easily match. Over time, Git became my bread and butter. Now, in 2025 and beyond, I once again find myself on the verge of a version-control transformation—this time by experimenting with <a href="https://github.com/martinvonz/jj" rel="nofollow">Jujutsu</a>.</p>



<h2 id="why-try-jujutsu">Why try Jujutsu?</h2>

<p>There’s a certain comfort in sticking with Git. After all, it’s ubiquitous, battle-tested, and has an ocean of resources supporting it. But sometimes, small frustrations or inefficiencies start piling up, making you wonder if there&#39;s a better way. Jujutsu (abbreviated as <code>jj</code>) stands out because:</p>
<ol><li><p><strong>It’s backward compatible with Git.</strong> You can continue working in the same repositories as colleagues who haven’t adopted <code>jj</code>. When you need to create a pull request, you can simply push a Git branch without impacting your teammates’ workflows.</p></li>

<li><p><strong>Stacked branches/PRs are simpler.</strong> If you have multiple features building on top of one another, <code>jj</code> presents them in a neatly stacked layout, making it clearer how changes progress.</p></li>

<li><p><strong>You can combine commits or branches freely.</strong> With Jujutsu, you don’t have to wait for a branch to get merged into <code>main</code> before building on it. You can reference multiple changesets in a new feature branch, which is incredibly liberating for rapid development.</p></li>

<li><p><strong>It has better conflict markers.</strong> Merge conflicts become easier to parse, thanks to a cleaner layout.</p></li></ol>

<p>Of course, nothing’s perfect. Jujutsu has a few downsides:</p>
<ol><li><p><strong>It doesn’t have Git hooks.</strong> If you rely heavily on hooks to enforce policies (e.g., commit-msg or pre-receive hooks), you’ll have to find alternatives or revert to Git for those tasks.</p></li>

<li><p><strong>It has its own philosophy.</strong> Even though it’s compatible with Git, <code>jj</code> still approaches version control differently, which can be an uphill battle if your entire team isn’t on board.</p></li></ol>

<h2 id="what-s-the-philosophy-of-jujutsu">What’s the Philosophy of Jujutsu?</h2>

<p>Jujutsu aims to simplify Git workflows. There are no traditional “branches” here, and there’s no separate “commit” step after making changes. Instead, you work with a sequence of changesets. When you need a new set of changes, you create it with <code>jj new</code> and begin editing files. Once you’re ready, you can add a description (using <code>jj desc</code>) and finalize the changeset. For convenience, <code>jj commit</code> combines these actions—though it’s not the same as <code>git commit</code> because it just appends a description and seals your changes into a new changeset.</p>

<p>For example, here’s a sample working history:</p>

<pre><code class="language-bash">❯ jj
@  klqwozss sergey.golovin@protonmail.com 2025-02-02 09:53:05 bc1c8fa9
│  fix: clear terminal before going back to the editor
◆  zurmvopq sergey.golovin@protonmail.com 2025-02-01 20:46:22 main git_head() e2286213
│  chore: update colors of status bar
~
</code></pre>

<p>To create a new changeset, just type:</p>

<pre><code class="language-bash">❯ jj new
Working copy now at: nmkklpov 77f465f2 (empty) (no description set)
Parent commit      : klqwozss bc1c8fa9 fix: clear terminal before going back to the editor
</code></pre>

<p>You can now add or modify files, and those changes are saved automatically. When you’re ready, you can describe the changeset with <code>jj desc</code> or do it all at once with <code>jj commit</code>.</p>
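
<p>Concretely, wrapping up the working-copy changeset might look like this (the messages are illustrative):</p>

<pre><code class="language-bash"># Attach a description to the current changeset…
❯ jj desc -m &#34;feat: tweak status bar colors&#34;

# …or describe it and start a fresh empty changeset in one step:
❯ jj commit -m &#34;feat: tweak status bar colors&#34;
</code></pre>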

<h2 id="do-not-wait-just-add-this-to-your-working-branch">Do not wait! Just add this to your working branch</h2>

<p>One of the coolest things about Jujutsu is that you don&#39;t need to wait for a feature to be merged into a main branch to build on top of it. Consider this history:</p>

<pre><code class="language-bash">❯ jj
@  ltyrpuno sergey.golovin@protonmail.com 2025-02-02 10:07:35 1be73fee
│  (no description set)
│ ○  lkqryomk sergey.golovin@protonmail.com 2025-02-02 10:06:54 3e036390
│ │  (no description set)
│ ○  pzvmvtpw sergey.golovin@protonmail.com 2025-02-02 10:04:46 ffe78f2a
├─╯  (no description set)
○  nmkklpov sergey.golovin@protonmail.com 2025-02-02 10:04:31 git_head() 2a5dfd41
│  (no description set)
│ ○  uyyszzsz sergey.golovin@protonmail.com 2025-02-02 10:06:16 d9ba2396
│ │  (no description set)
│ ○  knmruutr sergey.golovin@protonmail.com 2025-02-02 10:05:12 b7498368
├─╯  (no description set)
○  klqwozss sergey.golovin@protonmail.com 2025-02-02 09:53:05 bc1c8fa9
│  fix: clear terminal before going back to the editor
</code></pre>

<p>If you want to create a new changeset that includes <code>ltyrpuno</code>, <code>lkqryomk</code>, and <code>uyyszzsz</code>, you could do:</p>

<pre><code class="language-bash">❯ jj new u lk lt
Working copy now at: ptsmnxmz 362396b6 (conflict) (empty) (no description set)
Parent commit      : uyyszzsz d9ba2396 (no description set)
Parent commit      : lkqryomk 3e036390 (no description set)
Parent commit      : ltyrpuno 1be73fee (no description set)
Added 2 files, modified 1 files, removed 0 files
There are unresolved conflicts at these paths:
c.txt    2-sided conflict
d.txt    2-sided conflict
</code></pre>

<p>Conflicts can arise, but once you resolve them in Jujutsu, it remembers those resolutions. Future merges with the same changes won’t pester you again with the same conflicts.</p>

<p>After resolving conflicts, you might see something like this:</p>

<pre><code class="language-bash">@  ptlrysws sergey.golovin@protonmail.com 2025-02-02 10:15:14 1e9b3710
│  (no description set)
○      ptsmnxmz sergey.golovin@protonmail.com 2025-02-02 10:14:51 git_head() 0078f3ff
├─┬─╮  (no description set)
│ │ ○  ltyrpuno sergey.golovin@protonmail.com 2025-02-02 10:07:35 1be73fee
│ │ │  (no description set)
│ ○ │  lkqryomk sergey.golovin@protonmail.com 2025-02-02 10:06:54 3e036390
│ │ │  (no description set)
│ ○ │  pzvmvtpw sergey.golovin@protonmail.com 2025-02-02 10:04:46 ffe78f2a
│ ├─╯  (no description set)
│ ○  nmkklpov sergey.golovin@protonmail.com 2025-02-02 10:04:31 2a5dfd41
│ │  (no description set)
○ │  uyyszzsz sergey.golovin@protonmail.com 2025-02-02 10:06:16 d9ba2396
│ │  (no description set)
○ │  knmruutr sergey.golovin@protonmail.com 2025-02-02 10:05:12 b7498368
├─╯  (no description set)
○  klqwozss sergey.golovin@protonmail.com 2025-02-02 09:53:05 bc1c8fa9
│  fix: clear terminal before going back to the editor
◆  zurmvopq sergey.golovin@protonmail.com 2025-02-01 20:46:22 main e2286213
│  chore: update colors of status bar
~
</code></pre>

<p>You can also edit previous changesets. For instance, if you need to modify <code>nmkklpov</code>, type <code>jj edit n</code>, make changes, and then return to your last commit. Any updates in the earlier changeset flow naturally into subsequent changes. This approach works just as well for long chains of stacked changesets. By running <code>jj git fetch</code>, then <code>jj git push</code>, you can synchronize all updated branches with the remote.</p>

<p>Though Jujutsu technically doesn’t use “branches,” it does support <strong>bookmarks</strong>, which are analogous to Git branches. You can manage these bookmarks via commands like <code>jj branch create new-branch</code> or <code>jj b c new-branch</code>, and then push them to your remote as you would with any regular Git branch.</p>
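
<p>As a sketch, creating and publishing a bookmark might look like this (the bookmark name is illustrative; recent <code>jj</code> releases have renamed the subcommand from <code>branch</code> to <code>bookmark</code>):</p>

<pre><code class="language-bash"># Point a bookmark at the current changeset and publish it as a Git branch.
❯ jj bookmark create my-feature -r @
❯ jj git push --bookmark my-feature
</code></pre>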

<h2 id="conclusion">Conclusion</h2>

<p>Jujutsu offers a refreshing take on version control, blending Git compatibility with a simpler, more flexible workflow. Its stacked changesets make it easier to experiment, resolve conflicts once, and move forward without getting bogged down. While there are trade-offs—especially if your team depends on certain Git-exclusive features—Jujutsu’s approach to branching, committing, and merging can drastically streamline everyday tasks. If you’re feeling the limitations of Git, exploring Jujutsu might just open new doors in your development workflow.</p>
]]></content:encoded>
      <guid>https://sgolovin.live/exploring-jujutsu-a-modern-spin-on-git</guid>
      <pubDate>Tue, 25 Feb 2025 13:39:23 +0000</pubDate>
    </item>
  </channel>
</rss>