Yara: Before you can create, you need to understand the shape of what already exists. I have catalogued every color palette we have used in the last five years. Look at the pattern — warm tones cluster here, cool tones drift there. That clustering is not random. It is a distribution.
MUSE: Data has a shape. Learn the shape, and you can dream new points within it.
When you write Math.random(), you get a uniform distribution — every value between 0 and 1 is equally likely. But real-world data almost never looks like that. Heights cluster around an average. Colors in a sunset concentrate in warm hues. Brushstroke widths follow patterns. Understanding these distributions is the foundation of generation.
A probability distribution is a function that tells you how likely each value is. You use them in frontend code constantly — you just may not call them that.
const hue = Math.floor(Math.random() * 360); // uniform: any hue, equally likely
const sample = tf.randomNormal([1], mean, std); // normal: clusters around the mean

import * as tf from '@tensorflow/tfjs';
// UNIFORM: every value equally likely (Math.random)
const uniform = tf.randomUniform([1000], 0, 360);
// Histogram: flat — every hue equally represented
// NORMAL (Gaussian): values cluster around a mean
const normal = tf.randomNormal([1000], 180, 30);
// Histogram: bell curve — most values near 180, fewer at extremes
// Frontend parallel: weighted random for "natural" color palettes
function warmHue(): number {
// Not uniform! Cluster around warm tones (0-60)
// This is a crude normal distribution
const samples = Array.from({ length: 6 }, () => Math.random());
const avg = samples.reduce((a, b) => a + b, 0) / 6;
return avg * 60; // Central limit theorem → approximate normal, mean ≈ 30
}
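As a quick sanity check on that averaging trick, here is a self-contained sketch (plain TypeScript, no tf.js; the name `warmHueApprox` and the ×60 scaling are this sketch's own choices) that measures the empirical mean and spread:

```typescript
// Crude normal via averaging: mean of 6 uniforms, scaled into the warm range.
function warmHueApprox(): number {
  const samples = Array.from({ length: 6 }, () => Math.random());
  const avg = samples.reduce((a, b) => a + b, 0) / 6; // centers on 0.5
  return avg * 60; // centers on 30, spread ≈ 7
}

const hues = Array.from({ length: 10000 }, warmHueApprox);
const mean = hues.reduce((a, b) => a + b, 0) / hues.length;
const std = Math.sqrt(
  hues.reduce((a, b) => a + (b - mean) ** 2, 0) / hues.length
);
console.log(mean.toFixed(1), std.toFixed(1)); // near 30 and 7 — a rough bell curve
```

Averaging more uniforms tightens the bell; a proper normal sampler skips the trick entirely.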
// TensorFlow.js makes this precise
const warmHues = tf.randomNormal([100], 30, 15);
// Mean=30 (orange), StdDev=15 → most values between 0-60

Three distributions appear everywhere in generative modeling:
import * as tf from '@tensorflow/tfjs';
// 1. UNIFORM — equal probability across a range
// Use: initializing weights, random augmentation
const uniformSamples = tf.randomUniform([500], -1, 1);
// 2. NORMAL (Gaussian) — bell curve around a mean
// Use: latent space inputs, noise injection, weight init
const normalSamples = tf.randomNormal([500], 0, 1);
// 3. BERNOULLI — binary coin flips (0 or 1)
// Use: dropout, binary masks, stochastic decisions
const bernoulliSamples = tf.randomUniform([500]).greater(0.5).toFloat();
// Visualize: compute simple histogram counts
async function histogram(tensor: tf.Tensor, bins: number = 10) {
const data = await tensor.array() as number[];
const min = Math.min(...data);
const max = Math.max(...data);
const step = (max - min) / bins || 1; // guard against a zero-width range
const counts = new Array(bins).fill(0);
data.forEach(v => {
const bin = Math.min(Math.floor((v - min) / step), bins - 1);
counts[bin]++;
});
return counts;
}

A generative model's entire job is to learn the probability distribution of its training data. Once it has captured that distribution, generating new data means sampling from it. The better the model understands the distribution, the more realistic its output.
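The same comparison also works without tf.js: below is a plain-array twin of the histogram helper, with a Box-Muller transform standing in for tf.randomNormal (`histogramOf` and `normalSample` are names invented for this sketch):

```typescript
// Plain-array histogram: same binning logic as the tensor version above.
function histogramOf(data: number[], bins = 10): number[] {
  const min = Math.min(...data);
  const max = Math.max(...data);
  const step = (max - min) / bins || 1; // guard against a zero-width range
  const counts = new Array(bins).fill(0);
  for (const v of data) {
    counts[Math.min(Math.floor((v - min) / step), bins - 1)]++;
  }
  return counts;
}

// Box-Muller transform: two uniform samples → one standard-normal sample.
function normalSample(): number {
  const u1 = Math.random() || Number.EPSILON; // avoid log(0)
  const u2 = Math.random();
  return Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2);
}

const uniformData = Array.from({ length: 10000 }, () => Math.random());
const normalData = Array.from({ length: 10000 }, normalSample);

console.log(histogramOf(uniformData)); // counts roughly equal — flat
console.log(histogramOf(normalData)); // counts peak in the middle — bell
```

With tf.js installed, the equivalent check is to run the async `histogram` helper above on `tf.randomUniform` and `tf.randomNormal` tensors.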
Create a function that generates samples from both a uniform and normal distribution, returning an object with both arrays. Generate 100 uniform samples between 0 and 1, and 100 normal samples with mean 0 and standard deviation 1.
import * as tf from '@tensorflow/tfjs';

async function sampleDistributions(): Promise<{ uniform: number[], normal: number[] }> {
  // Generate 100 uniform samples between 0 and 1
  const uniformSamples = null; // your code here

  // Generate 100 normal samples with mean=0, std=1
  const normalSamples = null; // your code here

  return {
    uniform: uniformSamples,
    normal: normalSamples,
  };
}
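One possible solution, sketched without the tf.js dependency so it runs anywhere — a Box-Muller helper stands in for tf.randomNormal here; with tf.js installed you would instead `await tf.randomUniform([100], 0, 1).array()` and `await tf.randomNormal([100], 0, 1).array()`:

```typescript
// Pure-TypeScript solution sketch: same shapes as the exercise, no tf.js required.
function standardNormal(): number {
  // Box-Muller transform: two uniforms in, one N(0, 1) sample out.
  const u1 = Math.random() || Number.EPSILON; // avoid log(0)
  const u2 = Math.random();
  return Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2);
}

async function sampleDistributions(): Promise<{ uniform: number[]; normal: number[] }> {
  // 100 uniform samples on [0, 1)
  const uniform = Array.from({ length: 100 }, () => Math.random());
  // 100 normal samples with mean 0, std 1
  const normal = Array.from({ length: 100 }, standardNormal);
  return { uniform, normal };
}

sampleDistributions().then(({ uniform, normal }) => {
  console.log(uniform.length, normal.length); // 100 100
});
```

Either way, the returned arrays should show the two shapes from earlier: the uniform samples spread evenly across [0, 1), the normal samples clustered around 0.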
Samir realizes that every dataset has a hidden shape — a distribution — and learning that shape is the first step to generating new data.
Next: how to actually draw samples from these distributions