
maple-diffusion's Introduction

🍁 Maple Diffusion

Maple Diffusion runs Stable Diffusion models locally on macOS / iOS devices, in Swift, using the MPSGraph framework (not Python).

Maple Diffusion should be capable of generating a reasonable image in a minute or two on a recent iPhone (I get ~2.3s / step on an iPhone 13 Pro).

To attain usable performance without tripping over iOS's 4GB memory limit, Maple Diffusion relies internally on FP16 (NHWC) tensors, operator fusion from MPSGraph, and a truly pitiable degree of swapping models to device storage.
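
For a rough sense of why memory is the binding constraint, here's my own back-of-envelope arithmetic using approximate public parameter counts for Stable Diffusion v1 (not measurements from this repo):

    # Back-of-envelope fp16 weight footprint for Stable Diffusion v1.
    # Parameter counts are approximate public figures, not from this repo.
    params = {"unet": 860e6, "text_encoder": 123e6, "vae": 84e6}
    weight_bytes = sum(params.values()) * 2    # 2 bytes per fp16 weight
    print(f"~{weight_bytes / 2**30:.1f} GiB")  # ~2.0 GiB, before any activations

Weights alone take roughly half of a 4GB budget, which is presumably why models get swapped out to device storage between stages.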

On macOS, Maple Diffusion uses slightly more memory (~6GB) to reach <1s / step.

Related Projects

  • Core ML Stable Diffusion (repo) is Apple's recommended way of running Stable Diffusion in Swift, using CoreML instead of MPSGraph. CoreML was originally much slower than MPSGraph (I tried it back in August), but Apple has improved CoreML performance a lot on recent macOS / iOS versions.
  • Native Diffusion (repo) is a Swift Package-ified version of this codebase with several improvements (including image-to-image)
  • Waifu Art AI (announcement, App Store link) is an iOS / macOS app for (anime-style) Stable Diffusion based on this codebase
  • Draw Things (announcement, App Store link) is an iOS app for Stable Diffusion (using an independent codebase with similar MPSGraph-based approach)

Device Requirements

Maple Diffusion should run on any Apple Silicon Mac (M1, M2, etc.). Intel Macs should also work now thanks to this PR.

Maple Diffusion should run on any iOS device with sufficient RAM (≥6144MB RAM definitely works; 4096MB doesn't). That means recent iPads should work out of the box, and recent iPhones should work if you can get the Increased Memory Limit capability working (to unlock 4GB of app-usable RAM). iPhone 14 variants reportedly didn't work until iOS 16.1 stable.

Maple Diffusion currently expects Xcode 14 and iOS 16; other versions may require changing build settings or may just not work. iOS 16.1 (beta) reportedly always generated a gray image, but I think that's fixed now.

Usage

To build and run Maple Diffusion:

  1. Download a Stable Diffusion PyTorch model checkpoint (sd-v1-4.ckpt, or some derivation thereof)

  2. Download this repo

    git clone https://github.com/madebyollin/maple-diffusion.git && cd maple-diffusion
  3. Set up & install Python with PyTorch, if you haven't already.

    # may need to install conda first https://github.com/conda-forge/miniforge#homebrew
    conda deactivate
    conda remove -n maple-diffusion --all
    conda create -n maple-diffusion python=3.10
    conda activate maple-diffusion
    pip install torch typing_extensions numpy Pillow requests pytorch_lightning
  4. Convert the PyTorch model checkpoint into a bunch of fp16 binary blobs (see the sketch after these steps).

    ./maple-convert.py ~/Downloads/sd-v1-4.ckpt
  5. Open the maple-diffusion Xcode project. Select the device you want to run on from the Product > Destination menu.

  6. Manually add the Increased Memory Limit capability to the maple-diffusion target (this step might not be needed on iPads, but it's definitely needed on iPhones - the default limit is 3GB).

  7. Build & run the project on your device with the Product > Run menu.
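
As referenced in step 4, here's a rough sketch of what the conversion step does. This is pieced together from behavior discussed in the issues below (notably the non-tensor state_dict fix), not a copy of the actual maple-convert.py:

    # Hedged sketch of the .ckpt -> fp16 blob conversion; not the actual
    # maple-convert.py source. "bins" matches the folder name the app
    # reportedly loads from.
    import pathlib
    import torch

    ckpt = torch.load("sd-v1-4.ckpt", map_location="cpu")
    outpath = pathlib.Path("bins")
    outpath.mkdir(exist_ok=True)
    for k, v in ckpt["state_dict"].items():
        if "first_stage_model.encoder" in k:
            continue  # encoder weights are unused for text-to-image
        if not hasattr(v, "numpy"):
            continue  # skip non-tensor entries in the state_dict
        v.numpy().astype("float16").tofile(outpath / (k + ".bin"))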

maple-diffusion's People

Contributors

elrid, lukas1h, madebyollin


maple-diffusion's Issues

Xcode compatibility

The project does not open with Xcode 13.4.1; it requires Xcode 14. This should be specified in the README.

Crash on real device

maple-diffusion is a terrific repo, thanks to the author 👍. I can run it in the simulator, but on a real device I've encountered an issue.

Version & Device number

  • Xcode Version: 14.1 (14B47b)
  • iPhone 13: iOS 16.1.1

Signing & Capabilities

  • Increased Memory Limit enabled

Console output
2022-11-16 22:59:47.788875+0800 maple-diffusion[2120:173361] Metal GPU Frame Capture Enabled
2022-11-16 22:59:47.790188+0800 maple-diffusion[2120:173361] Metal API Validation Enabled
2022-11-16 22:59:52.181475+0800 maple-diffusion[2120:173550] [ServicesDaemonManager] interruptionHandler is called. -[FontServicesDaemonManager connection]_block_invoke
libc++abi: terminating with uncaught exception of type std::bad_alloc: std::bad_alloc
terminating with uncaught exception of type std::bad_alloc: std::bad_alloc
(lldb)


Terminated due to memory issue on iPhone 14 Pro Max

I am running Xcode on an Intel Mac running macOS 12.6 and trying to install the app on my iPhone 14 Pro Max. After downloading a Stable Diffusion model checkpoint, cloning maple-diffusion, and running the conversion to fp16 binary blobs, I'm getting this memory-terminated error on my iPhone 14 Pro Max running iOS 16.0.3. Any ideas?


Fatal error: Unexpectedly found nil while unwrapping an Optional value

Thanks for sharing this, absolutely amazing!

Tried running on:
MacBook Air M1 16GB
macOS Ventura 13.0 (22A380)
Xcode 14.0.1 (14A400)

And got this error:
maple_diffusion/MapleDiffusion.swift:18: Fatal error: Unexpectedly found nil while unwrapping an Optional value
2022-10-31 16:33:10.431161+0100 maple-diffusion[91931:1445876] maple_diffusion/MapleDiffusion.swift:18: Fatal error: Unexpectedly found nil while unwrapping an Optional value
(lldb)


Seems like the files were generated in the bins folder.

Any idea what could be wrong?

Cheers from Stockholm!

Variable Image size

Right now the trivial way to generate images of different sizes is to recreate the MapleDiffusion object with the new size params. Is there a more efficient way?

Please document the Python requirements for running maple-convert.py

Assuming you have conda installed, the environment setup instructions for the maple-convert.py script seem to be:

conda deactivate
conda remove -n maple-diffusion --all
conda create -n maple-diffusion python=3.10
conda activate maple-diffusion
pip install torch typing_extensions numpy Pillow requests pytorch_lightning

Diffusing results in gray image

No matter what I input to the model and what noise I start with, I get the same result: a gray image.

Environment:

Xcode 14.1 beta 2
iOS 16.1 beta 3
iPhone 13 Pro

python-3.9 numpy-1.23.4 torch-1.12.1 pytorch-lightning-1.7.7

./maple-convert.py should ignore non-tensor checkpoint state

Some weight checkpoints have extra non-tensor data in their state_dict. maple-convert.py should ignore this extra data rather than crashing.

 # model weights
-for k in ckpt["state_dict"]:
-    if "first_stage_model.encoder" in k: continue
-    ckpt["state_dict"][k].numpy().astype('float16').tofile(outpath / (k + ".bin"))
+for k, v in ckpt["state_dict"].items():
+    if "first_stage_model.encoder" in k:
+        continue
+    if not hasattr(v, "numpy"):
+        continue
+    v.numpy().astype('float16').tofile(outpath / (k + ".bin"))

Signing requires a development team

/Users/19wolf/Downloads/maple-diffusion/maple-diffusion.xcodeproj error project: Signing for "maple-diffusion" requires a development team. Select a development team in the Signing & Capabilities editor.

Recent Float16 removal commit causes problems on both macOS and iPadOS

Commit 8679427 causes crashes:

2022-10-17 21:00:17.526608-0700 maple-diffusion[89510:3383655] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** +[NSString stringWithUTF8String:]: NULL cString'
*** First throw call stack:
(
	0   CoreFoundation                      0x00000001aaff0418 __exceptionPreprocess + 176
	1   libobjc.A.dylib                     0x00000001aab3aea8 objc_exception_throw + 60
	2   Foundation                          0x00000001abe425b8 __destroy_helper_block_e8_32o40r + 0
	3   MetalPerformanceShadersGraph        0x000000020fafdfa8 MetalPerformanceShadersGraph + 155560
	4   MetalPerformanceShadersGraph        0x000000020fafda60 MetalPerformanceShadersGraph + 154208
	5   MetalPerformanceShadersGraph        0x000000020fca8578 MetalPerformanceShadersGraph + 1901944
	6   MetalPerformanceShadersGraph        0x000000020fcade0c MetalPerformanceShadersGraph + 1924620
	7   maple-diffusion                     0x0000000104dbc818 $s15maple_diffusion14MapleDiffusionC21saveMemoryButBeSlowerACSb_tcfc + 2724
	8   maple-diffusion                     0x0000000104dbbd64 $s15maple_diffusion14MapleDiffusionC21saveMemoryButBeSlowerACSb_tcfC + 60
	9   maple-diffusion                     0x0000000104d9f178 $s15maple_diffusion11ContentViewVACycfC + 360
	10  maple-diffusion                     0x0000000104dc6e1c $s15maple_diffusion0a1_B3AppV4bodyQrvg7SwiftUI4ViewPAEE15navigationTitleyQrAE18LocalizedStringKeyVFQOyAgEE5frame8minWidth05idealO003maxO00N6Height0pR00qR09alignmentQr12CoreGraphics7CGFloatVSg_A5vE9AlignmentVtFQOyAA07ContentG0V_Qo__Qo_yXEfU_ + 288
	11  SwiftUI                             0x00000001d2b2ffb8 OUTLINED_FUNCTION_119 + 92
	12  maple-diffusion                     0x0000000104dc6c58 $s15maple_diffusion0a1_B3AppV4bodyQrvg + 356
	13  maple-diffusion                     0x0000000104dc7354 $s15maple_diffusion0a1_B3AppV7SwiftUI0C0AadEP4body4BodyQzvgTW + 12
	14  SwiftUI                             0x00000001d206ee24 OUTLINED_FUNCTION_3 + 13612
	15  SwiftUI                             0x00000001d24165ac OUTLINED_FUNCTION_266 + 9284
	16  SwiftUI                             0x00000001d206e19c OUTLINED_FUNCTION_3 + 10404
	17  SwiftUI                             0x00000001d2416904 OUTLINED_FUNCTION_266 + 10140
	18  SwiftUI                             0x00000001d2277f50 OUTLINED_FUNCTION_52 + 9520
	19  AttributeGraph                      0x00000001d30cd4b8 _ZN2AG5Graph11UpdateStack6updateEv + 520
	20  AttributeGraph                      0x00000001d30cdc38 _ZN2AG5Graph16update_attributeENS_4data3ptrINS_4NodeEEEj + 424
	21  AttributeGraph                      0x00000001d30d6498 _ZN2AG5Graph20input_value_ref_slowENS_4data3ptrINS_4NodeEEENS_11AttributeIDEjPK15AGSwiftMetadataRhl + 420
	22  AttributeGraph                      0x00000001d30ed71c AGGraphGetValue + 212
	23  SwiftUI                             0x00000001d2416708 OUTLINED_FUNCTION_266 + 9632
	24  SwiftUI                             0x00000001d24168dc OUTLINED_FUNCTION_266 + 10100
	25  SwiftUI                             0x00000001d2277f50 OUTLINED_FUNCTION_52 + 9520
	26  AttributeGraph                      0x00000001d30cd4b8 _ZN2AG5Graph11UpdateStack6updateEv + 520
	27  AttributeGraph                      0x00000001d30cdc38 _ZN2AG5Graph16update_attributeENS_4data3ptrINS_4NodeEEEj + 424
	28  AttributeGraph                      0x00000001d30d6498 _ZN2AG5Graph20input_value_ref_slowENS_4data3ptrINS_4NodeEEENS_11AttributeIDEjPK15AGSwiftMetadataRhl + 420
	29  AttributeGraph                      0x00000001d30ed71c AGGraphGetValue + 212
	30  SwiftUI                             0x00000001d2b30d78 OUTLINED_FUNCTION_119 + 3612
	31  SwiftUI                             0x00000001d2b30e58 OUTLINED_FUNCTION_119 + 3836
	32  SwiftUI                             0x00000001d1dc2258 dynamic_cast_existential_1_unconditional + 19732
	33  AttributeGraph                      0x00000001d30cd4b8 _ZN2AG5Graph11UpdateStack6updateEv + 520
	34  AttributeGraph                      0x00000001d30cdc38 _ZN2AG5Graph16update_attributeENS_4data3ptrINS_4NodeEEEj + 424
	35  AttributeGraph                      0x00000001d30d5b9c _ZN2AG5Graph9value_refENS_11AttributeIDEPK15AGSwiftMetadataRh + 192
	36  AttributeGraph                      0x00000001d30ed764 AGGraphGetValue + 284
	37  SwiftUI                             0x00000001d206bf54 OUTLINED_FUNCTION_3 + 1628
	38  SwiftUI                             0x00000001d2ac0ed8 OUTLINED_FUNCTION_1 + 22084
	39  SwiftUI                             0x00000001d206d3c4 OUTLINED_FUNCTION_3 + 6860
	40  SwiftUI                             0x00000001d2cd5bc8 OUTLINED_FUNCTION_11 + 10944
	41  SwiftUI                             0x00000001d2aed868 OUTLINED_FUNCTION_11 + 416
	42  SwiftUI                             0x00000001d2aed758 OUTLINED_FUNCTION_11 + 144
	43  SwiftUI                             0x00000001d232c9c8 OUTLINED_FUNCTION_1 + 136
	44  maple-diffusion                     0x0000000104dc72d4 $s15maple_diffusion0a1_B3AppV5$mainyyFZ + 40
	45  maple-diffusion                     0x0000000104dc737c main + 12
	46  dyld                                0x00000001aab6be50 start + 2544
)
libc++abi: terminating with uncaught exception of type NSException
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** +[NSString stringWithUTF8String:]: NULL cString'
terminating with uncaught exception of type NSException

macOS 13.0 beta, iPadOS 16.1 beta (b57)

Create Swift Package

Thanks for this amazing work! This truly feels next level.

I saw your vision about MD being a backend to other apps, so I thought a good first step would be to turn it into a Swift Package. Here's the developer experience I envision

Easy to integrate. The simplest implementation could be:

  1. Developer adds the dependency to their project
  2. Add MapleDiffusion as an observed object on a SwiftUI view, giving it two optional arguments: localModelURL, remoteModelURL
  3. If it doesn't find the bins folder in the bundle or in the localModelURL, it starts downloading the files from the remoteModelURL and unzips to the localModelURL destination

That means a few substantial changes to this codebase, so you may not be interested in merging into this repo, but I am of course open to it. Here's what I did so far

  • Restructured the entire repo into a Swift Package
  • Put the app itself in an Example folder

To keep the app from freezing for the first ~10 seconds, it now loads the model asynchronously, showing the UI immediately:

  • Load model in a background task
  • Added a publisher for model loading status

I also created a Combine Publisher wrapper for the callback-based generate function in the main class. Calling it, a developer gets a publisher they can pass around between pieces of their app and listen to changes.

I made a few small adjustments to the example app

  • Show progress (and stages) of the model loading while disabling the Generate button, but let users start typing a prompt
  • Added a seed input field and a randomizer button

My WIP fork is here: https://github.com/mortenjust/maple-diffusion/

Improvement suggestion

Hi @madebyollin! In the article, you asked about the level-1 optimization flag - please take a look at this code. I recommend compiling your MPSGraph instances using the suggested approach; the compiled version is usually faster :)

Fatal error: Unexpectedly found nil while unwrapping an Optional value

I ran all the steps but I get this error whenever I run the project.

2022-10-20 09:15:18.949607-0400 maple-diffusion[47219:9588173] Metal GPU Frame Capture Enabled
2022-10-20 09:15:18.949813-0400 maple-diffusion[47219:9588173] Metal API Validation Enabled
maple_diffusion/MapleDiffusion.swift:18: Fatal error: Unexpectedly found nil while unwrapping an Optional value
2022-10-20 09:15:20.564156-0400 maple-diffusion[47219:9588746] maple_diffusion/MapleDiffusion.swift:18: Fatal error: Unexpectedly found nil while unwrapping an Optional value

I am using the NovelAI model, not the original Stable Diffusion.

Sideloadable IPA?

This is super impressive! Would it be possible to compile this into an IPA so it can be sideloaded onto a phone with something like Signulous or AltStore?

Memory Requirements?

What are the minimum memory requirements for iOS? For macOS?
Would this run on an iPhone SE (2nd gen) with only 3GB of memory?

Off-topic, but love the Hobbit references in the source.

failed assertion

2022-10-20 16:29:53.297676-0400 maple-diffusion[4294:89875] Metal GPU Frame Capture Enabled
2022-10-20 16:29:53.297919-0400 maple-diffusion[4294:89875] Metal API Validation Enabled
-[MTLDebugComputeCommandEncoder setThreadgroupMemoryLength:atIndex:]:543: failed assertion `length(16640) must be <= 16384.'
CoreSimulator 857.7 - Device: iPhone 13 Pro (9DA5A0FF-129A-426D-922B-5AFFEB611685) - Runtime: iOS 16.0 (20A360) - DeviceType: iPhone 13 Pro
(lldb)

Running in the simulator on Intel.

I Made an App for Stable Diffusion Anime Art Generation based on Maple Diffusion

Hi everyone!
This is not technically an issue, though.

I just made a Stable Diffusion for Anime app based on Maple Diffusion! It's able to run 100% offline on your Apple devices (iPhone, iPad, Mac).

The app is called “WAIFU ART AI”, and it's free with no ads (an experimental app for my personal study).

It supports fun styles like watercolor, sketch, anime figure design, BJD dolls, etc.

It uniquely supports both English and Chinese as input text to generate amazing results.

App Store page: https://apps.apple.com/us/app/waifu-art-ai-local-generator/id6444585505

This app requires a 6GB RAM device, such as an iPad M1/M2, iPhone 13 Pro, iPhone 14, or iPhone 14 Pro.

Developed based on Maple Diffusion, using AniPlus v1 as the AI model for this app.

Hope you like it!


Mismatch between byte count of data

Followed all the instructions, but I get this error when I build for my Mac in Xcode:

assert(data.count == expectedCount, "Mismatch between byte count of data \(data.count) and expected size \(expectedCount) for \(numels) els in \(fileUrl)")

If I expand the error, I see this:

Thread 2: Assertion failed: Mismatch between byte count of data 51840 and expected size 23040 for 11520 els in file:///Users/ryancossette/Library/Developer/Xcode/DerivedData/maple-diffusion-ezlrowjenuiuddbvzedttsyyclbg/Build/Products/Debug/maple-diffusion.app/Contents/Resources/bins/model.diffusion_model.input_blocks.0.0.weight.bin
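
One way to narrow this down (a hedged diagnostic sketch; the checkpoint and bins paths are placeholders) is to check that every converted blob on disk is exactly numel × 2 bytes, per the assertion above:

    # Hedged diagnostic: each fp16 blob should be exactly numel * 2 bytes.
    import pathlib
    import torch

    ckpt = torch.load("sd-v1-4.ckpt", map_location="cpu")  # placeholder path
    bins = pathlib.Path("bins")                            # placeholder path
    for k, v in ckpt["state_dict"].items():
        if not hasattr(v, "numpy"):
            continue
        p = bins / (k + ".bin")
        if p.exists() and p.stat().st_size != v.numel() * 2:
            print(f"{k}: {p.stat().st_size} bytes on disk, expected {v.numel() * 2}")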

If there are too many tokens in the prompt, the app crashes

Repro steps:

  1. Paste a long prompt into the Prompt string.
  2. Tap Generate Image.

Expected:

Image gets generated, possibly with a warning about ignoring the trailing tokens of the prompt.

Actual:

The app crashes.

The following prompt causes the crash, but I don't think there's anything special about it; it just has a lot of tokens and happened to be the prompt for the first image on lexica.art when I visited it to grab an example prompt:

a photorealistic dramatic fantasy render of a beautiful woman wearing a beautiful intricately detailed japanese crow kitsune mask and clasical japanese kimono by wlop, artgerm, greg rutkowski, alphonse mucha, beautiful dynamic dramatic dark moody lighting, shadows, cinematic atmosphere, artstation, concept design art, octane render, 8 k the seeds for each individual image are : [ 7 7 9 9 7 6 1 8 1, 3 2 5 6 4 1 8 5 7 5, 4 0 0 8 6 1 3 6 7 8, 3 1 5 5 8 2 9 4 2 4, 1 7 0 9 5 4 0 5 8 2, 9 3 3 7 4 3 2 0 7, 3 3 3 0 9 2 2 3 1 3, 4 7 7 1 8 2 1 7, 4 1 7 2 0 7 6 9 5 ] 

The crash is:

Swift/Array.swift:915: Fatal error: Can't construct Array with count < 0

#0	0x00000001bf2380f8 in _swift_runtime_on_report ()
#1	0x00000001bf2d5b50 in _swift_stdlib_reportFatalErrorInFile ()
#2	0x00000001bee99ac4 in closure #1 in closure #1 in closure #1 in _assertionFailure(_:_:file:line:flags:) ()
#3	0x00000001bee99828 in closure #1 in closure #1 in _assertionFailure(_:_:file:line:flags:) ()
#4	0x00000001bee99630 in closure #1 in _assertionFailure(_:_:file:line:flags:) ()
#5	0x00000001bee99188 in _assertionFailure(_:_:file:line:flags:) ()
#6	0x00000001bee7c944 in static Array._allocateUninitialized(_:) ()
#7	0x00000001bf1d17ac in specialized Array.init(repeating:count:) ()
#8	0x00000001beef0da8 in Array.init(repeating:count:) ()
#9	0x00000001048408b0 in BPETokenizer.encode(s:) at /Users/jackpal/Developer/maple-diffusion/maple-diffusion/MapleDiffusion.swift:545
#10	0x000000010484996c in MapleDiffusion.generateLatent(prompt:negativePrompt:seed:steps:guidanceScale:completion:) at /Users/jackpal/Developer/maple-diffusion/maple-diffusion/MapleDiffusion.swift:828
#11	0x000000010484bdf0 in MapleDiffusion.generate(prompt:negativePrompt:seed:steps:guidanceScale:completion:) at /Users/jackpal/Developer/maple-diffusion/maple-diffusion/MapleDiffusion.swift:879
#12	0x000000010481f0e4 in closure #1 in ContentView.generate() at /Users/jackpal/Developer/maple-diffusion/maple-diffusion/ContentView.swift:35

How to obtain the `.ckpt` file from the original Stable Diffusion model?

Hi, thank you for your amazing work. I am trying to convert the Stable Diffusion model, but I ran into some problems.
I have found that the original Stable Diffusion release consists of many files (configs and model bins), but the convert script seems to accept only a single file as input.
So, how do I obtain a single .ckpt file from the original Stable Diffusion model? It's straightforward to save one module's state_dict (such as the UNet's), but what about the other modules?
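
For what it's worth, here is a hedged sketch of one way to assemble a single CompVis-style checkpoint from per-module state_dicts. The key prefixes follow the layout maple-convert.py appears to expect; the input paths are hypothetical, and the exact prefixes should be verified against an original sd-v1-4.ckpt:

    # Hedged sketch: merge per-module weights into one CompVis-style .ckpt.
    import torch

    unet = torch.load("unet.pt", map_location="cpu")  # hypothetical paths
    vae = torch.load("vae.pt", map_location="cpu")
    text_encoder = torch.load("text_encoder.pt", map_location="cpu")

    state_dict = {}
    for prefix, sd in [("model.diffusion_model.", unet),
                       ("first_stage_model.", vae),
                       ("cond_stage_model.transformer.", text_encoder)]:
        for k, v in sd.items():
            state_dict[prefix + k] = v

    torch.save({"state_dict": state_dict}, "combined.ckpt")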
