Yara: Welcome to The Grid, Samir. You're standing in ten thousand square feet of raw possibility. The Global Biennale opens in six weeks, and the Nova Collective is going to fill every inch of it with art that has never existed before.
Erdem: In the browser, on hardware we do not control. The venue has 40 kiosk machines — mid-range laptops with integrated GPUs. Whatever we build must run there.
MUSE: Every masterpiece begins with a single gradient update.
You have spent your frontend career building interfaces that respond to data — rendering API responses, sorting lists, filtering search results. All of that is about working with data that already exists. Generative AI flips the script entirely. It creates data that has never existed.
Every ML model you have heard about falls into one of two camps. A discriminative model takes input and produces a label: "Is this image a cat or a dog?" A generative model takes noise and produces new data: "Here is an image that looks like a cat."
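The contrast fits in a few lines of plain JavaScript. This is a toy sketch with made-up numbers (the height threshold and the "learned" range are hypothetical, not from any dataset): the discriminative function maps an input to a label, while the generative function maps randomness to a new, plausible input.

```javascript
// DISCRIMINATIVE (toy): input → label
function classify(heightCm) {
  // hypothetical decision boundary
  return heightCm < 30 ? 'cat' : 'dog';
}

// GENERATIVE (toy): randomness → new plausible input
function generateCatHeight() {
  // pretend we "learned" that cat heights cluster around 25 ± 3 (assumed numbers)
  return 25 + (Math.random() - 0.5) * 6;
}

console.log(classify(24));          // 'cat'
console.log(generateCatHeight());   // a new value between 22 and 28
```

The first function can only judge data you hand it; the second one produces data that did not exist before the call. Everything that follows is about replacing that hard-coded "25 ± 3" with a distribution learned from real data.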
You already use controlled randomness in frontend code. Every time you call Math.random() to generate a procedural color, a particle effect, or a placeholder avatar, you are doing the conceptual equivalent of what a generative model does — producing something new from a source of randomness. The difference is that a generative model has learned what "good" output looks like.

const color = `hsl(${Math.random() * 360}, 70%, 50%)`;
const sample = model.predict(tf.randomNormal([1, 100]));
import * as tf from '@tensorflow/tfjs';
// DISCRIMINATIVE: input data → label
// "This image is 94% likely to be a landscape"
// const label = classifier.predict(imageData);
// GENERATIVE: random noise → new data
// "Here is a brand new landscape image"
const noise = tf.randomNormal([1, 100]);
// const newImage = generator.predict(noise);
// Frontend parallel: you already generate new things from randomness
function generateParticle() {
return {
x: Math.random() * canvas.width,
y: Math.random() * canvas.height,
hue: Math.random() * 360,
velocity: Math.random() * 2 + 0.5,
};
}
// A generative model is like generateParticle() — but it has
// LEARNED the distribution of "good particles" from data

A generative model learns a probability distribution from its training data. Once trained, you sample from that distribution to create new instances. The model does not memorize and replay — it captures the structure and statistics of the data, then produces novel outputs that share those properties.
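The idea is small enough to sketch without any framework. Here is a plain-JavaScript version of "fit a distribution, then sample from it", using the same toy data as the TensorFlow.js snippet below. The Box-Muller transform is a standard trick for turning uniform Math.random() output into Gaussian samples; the function names are my own, not from any library.

```javascript
// "Training": estimate mean and standard deviation from data
function fitNormal(data) {
  const mean = data.reduce((a, b) => a + b, 0) / data.length;
  const variance = data.reduce((a, b) => a + (b - mean) ** 2, 0) / data.length;
  return { mean, std: Math.sqrt(variance) };
}

// "Generation": Box-Muller turns two uniform randoms into one Gaussian sample
function sampleNormal(mean, std) {
  const u1 = 1 - Math.random(); // shift away from 0 to avoid log(0)
  const u2 = Math.random();
  const z = Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2);
  return mean + std * z;
}

const params = fitNormal([2.1, 1.9, 2.3, 2.0, 1.8, 2.2]);
const samples = Array.from({ length: 10 }, () =>
  sampleNormal(params.mean, params.std)
);
console.log(samples); // new values clustered around 2.0
```

Two functions, same split as every generative model: one pass over the data to capture its statistics, then cheap sampling forever after. TensorFlow.js does exactly this, just with tensors.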
import * as tf from '@tensorflow/tfjs';
// The simplest possible "generative model":
// Learn the mean and variance of data, then sample new points
const trainingData = tf.tensor1d([2.1, 1.9, 2.3, 2.0, 1.8, 2.2]);
const mean = trainingData.mean();
const variance = trainingData.sub(mean).square().mean();
const std = variance.sqrt();
// Generate new samples that look like the training data
const newSamples = tf.randomNormal([10]).mul(std).add(mean);
console.log(await newSamples.array());
// Values clustered around 2.0 — new data, same distribution

Create a function that demonstrates the generative approach: given a mean and a standard deviation, generate an array of 10 new samples from a normal distribution using TensorFlow.js. Return the samples as a JavaScript array.
import * as tf from '@tensorflow/tfjs';

async function generateSamples(mean: number, std: number): Promise<number[]> {
  // Generate 10 samples from a normal distribution
  // with the given mean and standard deviation
  const samples = null; // your code here
  return samples;
}
Samir sees the difference now — not sorting art into categories, but conjuring entirely new pieces from learned patterns.
Next: the mathematical foundations that make generation possible