import { Sandbox } from "@opencomputer/sdk";

const sandbox = await Sandbox.create();

await sandbox.files.write("/app/hello.txt", "Hello, World!");
const content = await sandbox.files.read("/app/hello.txt");
console.log(content); // "Hello, World!"

await sandbox.kill();

Reading Files

// Read as string
const text = await sandbox.files.read("/app/config.json");

// Read as bytes
const bytes = await sandbox.files.readBytes("/app/image.png");
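Since read() returns a string, a JSON config file can be parsed directly with JSON.parse(). A small wrapper makes the pattern reusable — note that readJson is our own helper, not an SDK method; it takes the read function as a parameter so it can wrap any compatible reader:

```typescript
// Read a file and parse its contents as JSON.
// `read` stands in for sandbox.files.read (or any string-returning reader).
async function readJson<T>(
  read: (path: string) => Promise<string>,
  path: string
): Promise<T> {
  return JSON.parse(await read(path)) as T;
}
```

Usage: `const config = await readJson<{ port: number }>((p) => sandbox.files.read(p), "/app/config.json");`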

Writing Files

Content can be a string or binary data:
// Write text
await sandbox.files.write("/app/config.json", JSON.stringify({ port: 3000 }));

// Write binary
const imageData = new Uint8Array([...]);
await sandbox.files.write("/app/logo.png", imageData);

Listing Directories

const entries = await sandbox.files.list("/app");
for (const entry of entries) {
  console.log(entry.isDir ? "📁" : "📄", entry.name, entry.size);
}
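Because each entry carries its full path and an isDir flag, a recursive walk falls out naturally. The walk helper below is a sketch, not an SDK method; it takes the list function as a parameter so it works with any listing API of this shape:

```typescript
// Minimal entry shape, matching the EntryInfo fields documented below.
interface EntryInfo {
  name: string;
  isDir: boolean;
  path: string;
  size: number;
}

// Recursively collect every file path under a root directory.
// `list` stands in for sandbox.files.list.
async function walk(
  list: (dir: string) => Promise<EntryInfo[]>,
  dir: string
): Promise<string[]> {
  const files: string[] = [];
  for (const entry of await list(dir)) {
    if (entry.isDir) {
      // Descend into subdirectories and collect their files too.
      files.push(...(await walk(list, entry.path)));
    } else {
      files.push(entry.path);
    }
  }
  return files;
}
```

Usage: `const allFiles = await walk((d) => sandbox.files.list(d), "/app");`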

EntryInfo

Field            TypeScript   Python    Type
---------------  -----------  --------  -------
Name             name         name      string
Is directory     isDir        is_dir    boolean
Full path        path         path      string
Size in bytes    size         size      number
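The size field makes simple accounting straightforward. For example, a hypothetical helper (not part of the SDK) that sums the bytes of regular files in a listing:

```typescript
interface EntryInfo {
  name: string;
  isDir: boolean;
  path: string;
  size: number;
}

// Sum the sizes of regular files in a directory listing,
// skipping directory entries.
function totalFileSize(entries: EntryInfo[]): number {
  return entries
    .filter((e) => !e.isDir)
    .reduce((sum, e) => sum + e.size, 0);
}
```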

Managing Files

Create Directory

await sandbox.files.makeDir("/app/data");

Remove

Deletes a file or directory:
await sandbox.files.remove("/app/temp");

Check Existence

if (await sandbox.files.exists("/app/config.json")) {
  const config = await sandbox.files.read("/app/config.json");
}
exists() is a client-side convenience — it attempts a file read and returns false on error. There is no dedicated HTTP endpoint for existence checks.
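The same treat-any-error-as-missing semantics can be packaged into a read-with-fallback helper. readOrDefault is our own name, not an SDK method; it takes the read function as a parameter:

```typescript
// Read a file, falling back to a default when the read fails.
// Mirrors the client-side semantics of exists(): any error is
// treated as "file not found".
async function readOrDefault(
  read: (path: string) => Promise<string>,
  path: string,
  fallback: string
): Promise<string> {
  try {
    return await read(path);
  } catch {
    return fallback;
  }
}
```

Usage: `const config = await readOrDefault((p) => sandbox.files.read(p), "/app/config.json", "{}");`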

Large Files

All file operations are streamed end-to-end in 256KB chunks, so there is no file size limit beyond available disk space; files of 200MB+ transfer reliably at ~20 MB/s. The standard read(), readBytes(), and write() methods work for large files, but they buffer the full content in memory on the client side. For very large files, use the streaming methods instead:
// Stream a large file download — returns a ReadableStream, not a buffer
const stream = await sandbox.files.readStream("/data/model.bin");

// Pipe to a file, or process chunks incrementally
const reader = stream.getReader();
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  // process each Uint8Array chunk...
}

// Stream a large file upload — accepts ReadableStream or Uint8Array
const fileStream = someReadableStream; // e.g. from fs.createReadStream
await sandbox.files.writeStream("/data/model.bin", fileStream);

// Uint8Array also works (sent without buffering a copy)
await sandbox.files.writeStream("/data/output.bin", largeUint8Array);
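The reader loop above can be wrapped into a small helper whenever you need to process a download without buffering it. As one example, byteLength below measures the total size of a stream; it is a sketch using only the standard Web Streams API (Node 18+ or browsers), not an SDK method:

```typescript
// Consume a ReadableStream<Uint8Array> chunk by chunk, tracking
// the total byte count — the same reader loop shown above,
// packaged as a reusable helper.
async function byteLength(stream: ReadableStream<Uint8Array>): Promise<number> {
  const reader = stream.getReader();
  let total = 0;
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    total += value.length;
  }
  return total;
}
```

Usage: `const size = await byteLength(await sandbox.files.readStream("/data/model.bin"));`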

Examples

Upload and Run a Script

await sandbox.files.write("/tmp/setup.sh", `#!/bin/bash
apt-get update && apt-get install -y redis-server
redis-server --daemonize yes
`);
await sandbox.exec.run("chmod +x /tmp/setup.sh && /tmp/setup.sh");
Full reference: TypeScript SDK · Python SDK · HTTP API.