Top 100 Unsolved Mysteries of the Universe, Episode 85: The Reionization History Problem.

Picture the universe as a cosmic city whose lights have just gone out. After recombination, electrons and nuclei settled into neutral hydrogen, and the universe changed from a glowing plasma soup into an enormous dark fog. Light was no longer constantly scattered by free electrons, so the cosmic microwave background could finally escape as the ancient photograph we still read today. But later, the first stars, galaxies, quasars, and growing black holes began to ignite. Their ultraviolet and X-ray light punched holes through that hydrogen fog and stripped atoms back into charged form. That is reionization: a sleeping city lighting up block by block. A few lamps turn on first, clear bubbles appear around them, those bubbles join into corridors, and eventually most of the city is lit.

The hard question is not only who switched on the lamps. It is when they started, how long the process lasted, which neighborhoods cleared first, which stayed foggy, and whether the fog wall had the same thickness everywhere.

Our clues are scattered. The CMB optical depth is like a global watermark, telling us how much free-electron column density accumulated across cosmic history. Lyman-alpha absorption toward high-redshift quasars is like shining a flashlight through the fog, showing how much neutral hydrogen remains along particular lines of sight. Counts of early galaxies are the lamp inventory. The 21-centimeter signal is the dream tool: a three-dimensional scan of the fog itself, showing which regions are cold, heated, or already carved open.

Mainstream cosmology usually writes the story as a photon budget. First stars, galaxies, and black holes produce ionizing photons; we then fold in the escape fraction, gas clumping, the recombination rate, and the X-ray heating history, and ask whether neutral hydrogen can be cleared on time. That accounting is useful, but the difficulty sits inside the accounting.
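To make the photon-budget bookkeeping above concrete, here is a minimal toy check in the spirit of that accounting: does the rate at which sources pump ionizing photons into the gas keep up with the clumping-enhanced recombination rate? Every number here (the critical-rate prefactor and scaling, the clumping factor, xi_ion, f_esc, and rho_uv) is an illustrative assumption for the sketch, not a measured value.

```python
# Toy reionization photon-budget check (all numbers illustrative).

def critical_ionizing_rate(z, clumping=3.0):
    """Rough critical photon rate density [photons / s / Mpc^3]
    needed to keep pace with recombinations in clumpy gas.
    Prefactor and (1+z)^3 scaling are illustrative assumptions."""
    return 1e50 * clumping * ((1 + z) / 7.0) ** 3

def produced_ionizing_rate(rho_uv, xi_ion=10**25.3, f_esc=0.1):
    """Photons / s / Mpc^3 actually escaping into the fog.
    rho_uv : assumed UV luminosity density [erg / s / Hz / Mpc^3]
    xi_ion : assumed ionizing-photon production efficiency [photons / erg / Hz]
    f_esc  : assumed escape fraction (the leakiest number of all)."""
    return rho_uv * xi_ion * f_esc

z = 7.0
supply = produced_ionizing_rate(rho_uv=10**26.0)
demand = critical_ionizing_rate(z)
verdict = "budget balances" if supply >= demand else "budget falls short"
print(f"z={z}: supply {supply:.2e}, demand {demand:.2e} -> {verdict}")
```

With these particular toy numbers the supply falls short of the demand, which is exactly the kind of tension the accounting is meant to expose: small changes in the escape fraction or clumping factor flip the verdict, and that sensitivity is where the difficulty lives.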
The faint-source counts, escape fractions, gas clumpiness, recombination speed, black-hole heating, and X-rays are all tangled together. Tiny galaxies may produce many ionizing photons, but feedback can blow out their gas. Black holes can heat gas far away, but they can also shift the timeline earlier or stretch it broader. Clumpier gas recombines more easily, like glass that fogs up again just after you wipe it clean. And 21-centimeter observations must fight Galactic foregrounds, instrumental beams, calibration residuals, and Earth's ionosphere; it is like trying to spot a match flame behind neon signs. So mainstream work often wants a neat global curve, as if the universe used one master switch, while also admitting that the real process was patchy and environment-dependent.

EFT's rewrite is not to invent a new mysterious light source. It first rewrites the map. In EFT, the early universe is not a blank geometric sheet on which lamps are evenly sprinkled. It is a high-tension, strongly mixed energy sea that later relaxes and grows a structural road network. Matter does not collect at random, and galaxies are not sprinkled uniformly through darkness. The easiest collection routes, the filaments and nodes, form first, and those first winning regions host the earliest sources. They are like transport hubs in a city: they receive fuel first, build lamps first, and burn ionized bubbles into the surrounding fog first. Voids and edge regions are like remote suburbs: less fuel, fewer lamps, slower clearing.

Reionization then stops looking like the entire universe switching from black to white at once. It becomes an environment-stratified phase transition moving along the cosmic web: nodes bubble first, filaments connect the light corridors, voids catch up late, and some local gas can recombine and partly refill the fog.

EFT also separates the observational ledger into layers. The source side records where lamps formed first. The medium side records whether the fog was heated or ionized.
The environmental side records how bubbles merged along the road network. The instrument side records which observing window translated the process for us. Flattening these layers into one average ionized fraction is like compressing a city night photo into one brightness number: useful, but it erases roads, districts, and lighting order.

EFT therefore reads the CMB optical depth as the compressed total shadow, Lyman-alpha absorption as line-of-sight fog tests, high-redshift galaxies as the source inventory, and 21-centimeter mapping as the closest thing to four-dimensional tomography. Together they should ask whether, in the same redshift slice, pixels, environments, temperatures, ionization states, and positions in the cosmic web can be reconciled; whether signals show a sequence from voids to filaments to nodes; whether bright regions heated and cleared earlier; and whether remote regions preserve longer neutral tails.

The evidence does not have to say the same sentence at the same time. The CMB can carry the total amount, Lyman-alpha can carry the residual fog along sightlines, galaxies can carry the source inventory, and 21-centimeter data can carry the spatial texture. If those accounts connect in environmental order, reionization does not need to be forced into an unrealistically smooth one-switch curve.

The guardrail is important: EFT is not saying reionization observations are wrong, and it is not saying photon budgets, escape fractions, or recombination rates are useless. It rejects the habit of compressing a roaded, nodal, patchy, partly refilling cosmic construction process too early into one global switch. Reionization history records not only how bright the first sources were, but how cosmic structure wrote light, gas, heat, and observational readout into the same early map.

Tap the playlist for more. Next episode: The Cosmic Dawn 21-Centimeter Signal Problem. Follow and share: our new-physics explainer series will help you see the whole universe more clearly.