How Cutting-Edge Technologies Like IMAX and Flying Cars Are Transforming Entertainment and Transportation

This video explores several advanced technologies including IMAX-style sports viewing domes, flying car development, pyro drone shows, and lab-grown diamond production. It examines how these innovations are changing entertainment experiences, transportation possibilities, and material science through detailed explanations of their engineering and practical applications.

Full English Transcript of: How IMAX, Flying Cars, Pyro Drone Shows and More Work | WSJ Tech Behind

That was a big one. On to lab-grown diamonds. We're effectively taking a Coke can and putting the Eiffel Tower on top of it. Tech Behind explores some of today's most cutting-edge technology. Yeah, I didn't really want to go to Philly for this game. Wasn't in the budget either. And this is the next best thing. We start with a planetarium-style venue where fans pay $200 to watch sports. We're coming into the dome, really the featured area here at Cosm. That is a 27-meter LED dome. This is like walking into the stadium.

It's sharper than what your eyes would see from pretty much all the seating points here. And I think what we've seen is fans are willing to pay for quality. We paid, I think, about $140 each. Uh, $180. Couple hundred bucks. Our team's on the ground shooting with five 8K cameras that are capturing those live streams. And that means for fans sitting here, it's just like being at the game. It's nothing like being at the Linc, but it's close. Hey, go. We're here for Sunday Night Football, setting up for the Philadelphia game.

This is kind of a little tedious part of the operation for us, where we're just kind of in here, you know, making sure everything's ready to go out on the field. In the US, people are spending more money on live sports than ever before. Cosm, which is run by a former NFL player, is looking to cash in. It acquired the world's largest planetarium company in 2020 and uses that dome technology as the blueprint to go from space to the sidelines, creating a sort of in-between virtual and physical space for fans to feel like they're actually at an event. So I am in charge of putting all the cameras into position, rigging them around the venue. It should take me a couple of hours to get these up, just

getting things into place. How much of that upright do you see? You tell me when you can see the whole end zone. Cosm's camera kit is key to creating the sense that you're actually at the game. For the NFL, they place four to five cameras around the stadium, hiding them in places like the field goal, foam pylons, and even on the rolling Chapman cart. Without the tech, what are we? It is core to our DNA. Your typical broadcast camera is really massive. We couldn't have placed those cameras and given the fans that kind of access if we worked with, you know, large-format cameras.

Using a typical fisheye lens for the massive screen they're showing the game on would come out sort of blurry and create a type of color distortion called chromatic aberration. That's because all of the colors pass through the lens at slightly different angles, which causes them to focus at different points on the sensor. So the Cosm team engineered its own glass with a custom curvature to better align those wavelengths and create a sharper image in the dome. So it took about two years, really, from start to finish to create this incredible, you know, piece of glass. That's what allows us to capture a 180° field of view. Fun fact, just for you: I actually designed all the markings on the lens myself.
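The chromatic aberration described here can be illustrated with the lensmaker's equation: a lens's focal length depends on the glass's refractive index, and that index varies with wavelength, so each color focuses at a slightly different distance. A minimal sketch, using approximate BK7 crown-glass index values and an invented 50 mm radius (not Cosm's actual optics):

```python
# Illustrate chromatic aberration: focal length shifts with wavelength
# because the refractive index of glass is wavelength-dependent.
# Index values approximate BK7 crown glass; the 50 mm radius is invented.

def focal_length_mm(n, radius_mm=50.0):
    """Plano-convex lens: 1/f = (n - 1) * (1/R1), with R2 = infinity."""
    return radius_mm / (n - 1.0)

# Approximate refractive indices of BK7 glass at standard spectral lines.
indices = {
    "blue (486 nm)": 1.5224,
    "green (588 nm)": 1.5168,
    "red (656 nm)": 1.5143,
}

focals = {color: focal_length_mm(n) for color, n in indices.items()}
for color, f in focals.items():
    print(f"{color}: focus at {f:.2f} mm")

# The spread between blue and red focus is the longitudinal chromatic
# aberration that a corrected lens curvature tries to minimize.
spread = focals["red (656 nm)"] - focals["blue (486 nm)"]
print(f"focal spread: {spread:.2f} mm")
```

Shorter wavelengths see a higher index and focus closer to the lens, so blue and red never share a focal plane unless the optic is corrected; here the uncorrected spread comes out to roughly 1.5 mm.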

So our 50-yard-line camera is up there. So it gives you that, you know, suite-level VIP view of the game. Beautiful read. You also have the slash in the corner that gives you the energy of the crowd and an amazing half-field experience from that side. All of that work behind the scenes comes down to how it translates in the dome. We can't have a five-minute delay that takes you out of it. You can't see the LED modules. If you see the LED modules in the display, you're toast. Here we go. We're in the high wide spot. You can see here, right at the 50-yard line, looking into the stadium. You can see it looks like we're just sitting in the stands with everybody else.

These are actually magnetized panels. And that's really a nod to us being a planetarium company. We've been building dome displays for decades. That laid a foundation that we're then able to build upon, and, hey, let's figure out how we don't just fly around the stars in the universe together, but how do we go baseline at a basketball game or the sideline of a football game? That is crazy. Now, I've got to say, we can never actually compete with that fan experience when you're in the stadium. We don't want to do that. That is so special. But we want to be the next best option for the fans that might not be able to get to that venue.

Yeah, I didn't really want to go to Philly for this game. Wasn't in the budget either, and this is the next best thing. So far, Cosm has one venue in LA and one in Dallas. And they were designed just like new stadiums are, with a range of ticket prices from around $11 for general admission up to hundreds of dollars for nicer seats. We certainly started with this idea that we wanted the seating to be more like you're in a suite, right? You can see very comfortable seating here. You're not sitting in bleacher seats. It's more private, though. You have somebody that brings you your drinks, brings you your food. The seats are way more comfortable. Just like a stadium business model, Cosm stacks its calendar with as many events as possible, from

documentary screenings to films and, of course, sports. Ticket prices change with demand as well. We do want pricing to be approachable. That is the core. But we leverage dynamic ticket pricing, much like others in the market do. And it's a delicate balance we have to strike. But from a capital and funding perspective, these things don't build themselves. Look, we're trying to take this global. That's the ambition. We're opening Atlanta next year. We're opening Detroit next year. We're opening in Cleveland in 2027. We don't just have 40 events a year, 80 events a year. We have 1,200-plus events a year. We started with sports. However, we're expanding into other live elements as well. You can

look at performance and music and all these other things that we can layer on. What do fans want? What was it going to take to get them off the couch to go out and interact and experience things? Pivotal's electric vertical takeoff and landing aircraft, or eVTOL, looks a little different from the rest of the market. It's tiny, with just one seat, and weighs less than some motorcycles. You don't even need a pilot's license to fly it. It was designed that way on purpose. We went down the ultralight path because, frankly, it was not as regulated. We could iterate quickly, go through design iterations, test, build, test, build, test, build. But that advantage comes

with trade-offs, limiting power, range, and commercial uses. This is the tech behind eVTOLs. An eVTOL has two main differences from a traditional plane, and it's all in the name. The first is in how it takes off and lands. You don't need a lot of room to take off or land. Traditional aircraft need a long runway to take off and generate lift on their wings, on their airfoils, from gaining speed this way. A VTOL aircraft can actually use its propellers to generate lift vertically. That's just like a helicopter, but that's where the similarities with a helicopter end. So, if you lost your main propeller, you basically wouldn't have any power to generate lift on

the aircraft. For us, we have eight propellers on the aircraft, and you could lose any one and you would still be able to fly the aircraft. These aircraft are also electric. Pivotal's is powered by eight huge batteries. You don't have to worry about all of the maintenance that comes with a traditional gasoline engine or a diesel engine or even a jet engine. There are also far fewer moving parts than something like a jet engine. The company also designed it with fewer parts in mind.
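The single-motor-failure margin described above can be sketched as a thrust-budget check: with eight lift propellers, hover must still be possible after losing any one of them. All numbers below (per-motor thrust, gross weight, safety margin) are invented for illustration and are not Pivotal's specifications:

```python
# Sketch of a motor-out thrust budget for a multirotor with N lift propellers.
# Hover requires total available thrust to exceed aircraft weight by a margin.

def can_hover_after_failures(n_motors, thrust_per_motor_lb, weight_lb,
                             failures=1, margin=1.2):
    """True if the remaining motors can still lift the aircraft with a
    safety margin (1.2 = 20% excess thrust kept for control authority)."""
    available = (n_motors - failures) * thrust_per_motor_lb
    return available >= weight_lb * margin

# Hypothetical numbers: 8 motors, 65 lb max thrust each, 348 lb gross weight.
print(can_hover_after_failures(8, 65, 348, failures=0))  # all motors: 520 lb available
print(can_hover_after_failures(8, 65, 348, failures=1))  # one out: 455 lb available
print(can_hover_after_failures(8, 65, 348, failures=2))  # two out: 390 lb available
```

With these illustrative numbers the design tolerates any single motor failure but not two, which is exactly the kind of margin an octocopter layout buys over a single-rotor helicopter.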

Most eVTOL constructions that I've reviewed in detail are far more complex, and that creates additional cost. It creates additional weight. It creates a challenge for reliability, because each of those is a moving part that can fail. Pivotal says its eVTOL has 12 control surfaces, or the parts that allow a pilot to control how an aircraft flies. Eight of them are the propellers, and the other four are these elevons, which provide pitch, roll, and altitude control in the air. Another design choice gets rid of even more moving parts. Once airborne, the entire aircraft tilts to fly forward. Traditionally, a lot of electric vertical takeoff and landing aircraft will tilt either the wing set or

some of the propellers, or they will have propellers that are designed primarily to help with cruise flight or propellers designed for vertical lift. The tilt aircraft allows us to use the same propellers to provide vertical takeoff and cruise blown lift. As it takes off, the entire aircraft tilts back, shifting the pilot from sitting upright to lying flat. Then the pilot can initiate cruise mode, which tilts the aircraft forward so it can move horizontally at up to 63 mph. Pivotal's aircraft don't require a pilot's license to fly because of their FAA ultralight aircraft classification. Prospective pilots take a two-week training from the company itself, which includes time in a

simulator. The Wall Street Journal's auto columnist Dan Neil underwent pilot training in September. It was an investment of actually a couple of weeks of training there at Palo Alto in their offices, much of which I spent being really motion sick. Still, after the training and with no prior flying experience, he was in the air. That was a big one. My confidence level when I got into the machine was such that I could actually have a good time. It feels that easy because the pilot doesn't have to do most of the flying. The aircraft is largely controlled by the company's software. Takeoff and landing are largely autonomous. Once in the air, the pilot controls speed, direction, and altitude using one of these joysticks.

The thumb stick here lets you go up and down in terms of elevation off the ground. The joystick going up and down allows you to tilt the aircraft to control the aircraft's stability. Pivotal's software relies on a variety of sensors. So we have the pitot tubes that connect to the air sensors in the flight electronics. And those are letting us know what the wind speed and the air pressure are. Inside, behind the pilot, there are actually three flight controllers. We have GPS sensors and a variety of other sensors that allow the aircraft to understand where it is in space and what's going on with the environment around it. It's using

that information to understand how to change the speed of the propellers or to move the elevons based on what's happening around it. So, for example, the aircraft will keep itself in the same position even when there are high winds or no winds. And so it's using the wind measurements to ensure that it's still holding its location. Pivotal's eVTOL starts at $190,000. The company has already sold and delivered five aircraft in early access. That's ahead of a lot of the industry in the US, thanks to the lack of regulation the company faces with an ultralight aircraft. But to qualify for that

category, the aircraft, excluding certain safety devices, needs to stay under 254 lb. With a parachute and flotation devices included, Pivotal's comes to 348 lb. We make trade-offs all the time. We have to leave out things we would love to put in the aircraft. We have to be very choosy about, you know, radios, about beacon lights, navigational lights, or anti-collision lights. We don't have extra airbags or, you know, any number of other things that you could enhance the aircraft with to make it even more survivable in the event of a crash landing. The other big sacrifice is range and flight time. The batteries on these eVTOLs can only support around a 20-mile or 20-minute flight.
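The station-keeping behavior described earlier, where the flight computer uses wind and position sensing to hold the aircraft in place, comes down to closed-loop feedback: measure the position error, command thrust to cancel it. A toy one-dimensional PID sketch, where the mass, gains, and constant wind force are all invented and are not Pivotal's control law:

```python
# Toy 1-D position-hold loop: a PID controller rejects a steady wind force.
# The integral term is what drives the steady-state error to zero.

def simulate_hold(wind_force=3.0, mass=50.0, dt=0.02, duration=20.0,
                  kp=200.0, ki=40.0, kd=150.0):
    """Simulate holding position 0 under a constant wind force (newtons).
    Returns the final position in metres after `duration` seconds."""
    pos, vel, integral = 0.0, 0.0, 0.0
    for _ in range(int(duration / dt)):
        error = 0.0 - pos                         # target position is 0
        integral += error * dt
        thrust = kp * error + ki * integral - kd * vel
        accel = (thrust + wind_force) / mass      # wind pushes, thrust corrects
        vel += accel * dt
        pos += vel * dt
    return pos

final_pos = simulate_hold()
print(f"position after 20 s in steady wind: {final_pos:.4f} m")
```

Without the `ki` term the aircraft would settle a fixed distance downwind (wind_force / kp metres); the integral term winds up until thrust exactly cancels the wind, which is the behavior of "holding its location" in gusty air.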

It's very mission-limited because we're trying to squeeze into a 300-some-odd-pound weight envelope. While a 20-minute flight might make sense for a joyride, that poses a significant hurdle for the company's commercial goals, like supporting emergency services. And while the FAA allows ultralight aircraft to fly in uncontrolled airspace, it also bans flying over any congested area. In other words, almost anywhere people live and gather. The company says it expects to have a path to FAA certification for a larger eVTOL in the coming year or two. Pivotal is also evaluating how the aircraft can be used in the military, but it acknowledges significant tweaks would be needed, like making it hybrid instead of just electric, which would

increase the aircraft's range. The number one mission in our mind is contested logistics. And that is having a fleet of aircraft that are each capable of carrying 2,000 lbs or more, 300 miles or more, and can do it with the cargo on board or with the cargo slung below the aircraft. But that's a long way away. For now, even recreational pilots, a big part of Pivotal's market, are skeptical. One thing aviators, flyers, have said to me by way of pushback is: Yeah, okay, you don't need an airport, but it can't go far enough to take me anywhere I want to go. But if the company expands beyond the ultralight category, that could change.

What we're doing is we're perfecting a platform that's got some room to scale, that will actually become, in a different implementation, likely using the same flight computers, many of the same sensors, the same architecture of motors, the same overall architecture of the aircraft, a larger type-certified aircraft. In a matter of minutes, this English Premier League field, or pitch, will completely disappear under the stands. The best way to describe it is to talk about the Rubik's Cube. The engineering behind this stadium unlocks new revenue streams by transforming this English football pitch into a stage for world tours, a boxing ring, and even an NFL

field. We explored the inner workings of this UK stadium to see how underground mechanics hide a 9,000-ton grass field to reveal an entirely new turf underneath. This is the tech behind a stadium transformation. Tottenham Hotspur Stadium has become a little bit of the blueprint for many, right? Long gone are the days of just having one use case. How do you keep a building open 365 days a year? Rolled up in a parking garage beneath this stadium is all 120 yards of turf that make up an official NFL field. As part of its global expansion, the NFL plays games in international stadiums each year. And this is the only one outside of the US that's been

specifically designed for the NFL. That's because the field of play for English football, or soccer, is a different size than American football, and switching between the two needs to happen within a matter of hours. So, if we weren't able to transition in 36 to 48 hours from football into the NFL, it would be a complete disaster for us to either postpone, reschedule, or cancel an event. Technically, the NFL could play on the natural grass, but it would cost the stadium time and money to convert back to English football. So the NFL uses artificial turf when it plays here.

With the sizes of our athletes and the intensity of our game, you can obviously get some level of damage. Additionally, we come with more lines, more logos, more colors than most sports are used to. So the ability for them to sort of protect their playing surface and allow us to play on another playing surface allows for that kind of shared business model that works so well here at Tottenham. That business case gave rise to this engineering breakthrough: the world's first retractable field that splits into three pieces. It allows Tottenham to keep its grass in pristine condition while also hosting, and taking a cut of the revenue from, other events like NFL games.

The more events we stage, the more money we can put into football. Here's how it works. We've got a number of turns that we need to make to get all the faces facing in the right direction in order to solve the puzzle. To make way for the NFL field, the grass is split up into three massive sections weighing 3,000 tons each. Motorized platforms roll each of the trays into a parking garage under the stands. Most retractable fields roll all the way outside of the stadium. But Tottenham's stadium is in a dense urban area, so there's not much space for a spare field to sit outside. So the team clears this section of the stands to make way for the grass to be stored underneath. It has to be split into

three sections in order to fit around the structural columns that are supporting the stands above. But first, the outside perimeter of the field is lowered 1.8 m, or around 5 ft, below the first row of seats. A very American design is to have the field 1.8 m below. That's because, compared to English football, American football teams have way more people standing on the sidelines. If you get 150 team members on either side of the field all stood up, then rows one through five can't see anything. You can't see past the players that are stood up. Each section is slowly lowered using a carefully monitored hydraulic system. When these are coming down, it's really important that we keep them level. If

they were to get stuck at an angle, it would become really hard to rectify. We'd have to bring cranes in and things like that. If one side is just 25 mm lower than the other, an alarm will go off and the process will pause until they're aligned again. The next step is to split the pitch. If you look down at the floor now, we're currently stood on our center tray. You might be able to just very, very faintly see the join there. And what you'll see is this east section just move away from the center. Before that can happen, George has to unchain the pitch trays to allow them to separate.
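The 25 mm leveling alarm described above can be sketched as a simple tolerance check that pauses the lowering sequence when one side of a tray lags the other. This is an illustrative monitor with invented function names, not the stadium's actual control system:

```python
# Sketch of the leveling interlock: pause lowering when one side of a tray
# gets more than 25 mm out of level with the other. Heights in millimetres.

TOLERANCE_MM = 25

def lowering_action(height_west_mm, height_east_mm):
    """Return 'continue' while the tray is level enough, else 'pause'."""
    tilt = abs(height_west_mm - height_east_mm)
    return "pause" if tilt > TOLERANCE_MM else "continue"

# A tray partway through its 1.8 m (1,800 mm) descent; east side lags.
print(lowering_action(1200, 1190))  # 10 mm tilt -> continue
print(lowering_action(1200, 1170))  # 30 mm tilt -> pause until realigned
```

The check is symmetric, so it trips no matter which side runs low, which is the point of pausing early: a small tilt is easy to correct in place, while a stuck, angled 3,000-ton tray would need cranes.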

Part of the original design was that it was going to be held together with hydraulics. So you would have two hydraulic accumulators, and they would keep a constant squeeze on it. Now, the issue with that is, if you had a failure, the risk is that it would just start pushing itself apart. So, to make sure that can't happen, we chain the trays together with lever hoists. And each section sits on, a steel train is the best way to describe it, which is about 110 m long by 33 m wide. And then, section by section, we slide them out underneath the south stand. Next, they cover the giant drains, which help suck out any rainwater that accumulates in the grass pitch. Then the crew spends the next four to five hours clicking in the final puzzle

pieces of a PVC graded floor, which acts as a sort of shock pad, evening out the surface above the rails and creating an air gap between the asphalt and NFL turf. Everything is measured down to the millimeter. We want the athlete to be able to walk the field, understand what they're on, and then go out and play, focus on football, and not feel any aspect of change under their feet when they're out there performing. When the stadium was first built, the NFL field was stored underneath the grass pitch all year round, but it created long seams the NFL wasn't used to. So now they use a new field that's rolled out in shorter sections across the width of the field.

We have 37 rolls of NFL field, all of which weigh somewhere between 3 and 4 tons, and we bring those into the stadium bowl and lay those out over the shock pad. Each section of the NFL field is labeled so that they go out in the right order. When we lay the NFL turf, the forklifts are really tall, so we raise that just to give them a bit of extra room to come through. They're then rolled out and joined together with a Velcro-like seam. During our inspection, checking each of those seams, we walk every single one of them by foot, ensuring that, one, they're tight and put together to the

point where there's not going to be any separation, and, two, ensuring that the infill and all the product meets the expectations as it was designed. In around 36 to 48 hours, the stadium is transformed for an entirely different sport. Meanwhile, the area under the stands is being transformed into a sort of grow lab for the natural grass. What we're trying to do, really, is just put it to bed, put it to sleep for as long as we need to put it under the stand. We put a growth inhibitor on a few days before we bring it out. We really start to flood it with our artificial grow lights, and then we try to get the new growth just before we bring it out. That growth is managed by controlling

the type of wavelengths that hit the grass. Red light maximizes photosynthesis and causes the grass to grow quickly, while blue light promotes strong roots that anchor the grass. That combination is key for a football field that needs to repair itself after the wear and tear of a match. So, the lights down here are made with a mix of the two colors. This light is perfect for growing grass from seed. If you put a load of seed down before putting a pitch in storage, it shoots straight up. The lights that go above the pitch when it's not in storage project a fuller spectrum of color. I'll be honest, if you were to put one of these pitches in your backyard, you would have a nightmare maintaining it, because they're very needy. There's a lot of

work, more work than anyone can ever appreciate, that goes into keeping that grass at that level. I mean, every day or every couple of days, we will move the pitch around, about a meter, a couple of meters back or forwards, and that just spreads that light across and reduces the risk of seeing different shaded areas when the pitch comes back out. That's because if the surface isn't even, it could affect how the players are able to move around the pitch. But tending to the grass just inches below the ceiling isn't easy. The guys are doing backbreaking work. They are bent down. They are sometimes lying down, pushing these mowers, trying to get it cut. It takes a lot of work to maintain this.

All of this effort behind the scenes helps fuel the business model that keeps the lights on even when the home team isn't playing. It's a multi-purpose stadium, right? So, while it's very common in the US with new builds, you know, Hard Rock Stadium, MetLife, SoFi Stadium, Allegiant, even Levi's, it's very rare in the rest of the world, and specifically in the UK, because a lot of times stadiums are built specifically for football. But that's changing, as new stadiums are being built to host a wider range of events with new crowds and revenue streams.

We sell about 60,000 pints of beer for a football game. We sell about 120,000 pints of beer for an NFL game. Since Tottenham's stadium opened in 2019, newer stadiums like the Santiago Bernabéu in Madrid have pushed the retractable field tech even further. Here, the grass descends into a four-story underground greenhouse, freeing up even more real estate in the stadium to host other events. The engineering that goes into developing a system where a natural grass tray could live in one location, split apart, move into another location, continue to grow, and be managed to the highest level of expectations, and then

installing an artificial surface that would meet the, you know, professional-level expectations that we have at the NFL, is not exactly easy. Although maybe simple, definitely not easy. As well as having a stadium that can be truly multifunctional, host all of the events that we want it to hold, generate the income that we need to be able to put back into our football, it's the people that really make it happen. This is a synthetic diamond. It doesn't look anything like jewelry because that's not what it's for. We went inside one of the world's most innovative and closed-off labs to see how it makes synthetic diamonds that are used to

build everything from drills to speakers to lasers. It requires extreme conditions, like plasma heated to more than 3,600° or pressures of 800,000 lb per square inch. We're effectively taking a Coke can and putting the Eiffel Tower on top of it. But many scientists think they've only scratched the surface of what's possible, and companies are racing to unlock diamond's full potential. This is the tech behind synthetic diamonds. Just south of Oxford, England, Element Six is one of the world's leading synthetic diamond labs.

We are the largest Western supplier of advanced materials, and we are one of the largest labs in the world dedicated to that. It makes synthetic diamonds in two different ways. The first is by using high-pressure, high-temperature reactors, or HPHT. In HPHT, manufacturers start with a capsule that contains a carbon source, some metals, and a tiny diamond particle, or seed. They put it into the reactor, heat it to more than 2,500°F, and pressurize it above 800,000 lb per square inch. For comparison, the pressure used to extract oil from rock formations in fracking is about 2,000 to 15,000 pounds per square inch. Under these conditions, the carbon and metals dissolve into a flux, and the carbon atoms crystallize on the diamond seed.
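The pressure figures quoted here can be put on a common scale with a short conversion: 800,000 psi works out to roughly 5.5 gigapascals, tens of times the upper end of the fracking range. A quick arithmetic check:

```python
# Convert the HPHT pressure quoted in the piece to SI units and compare it
# with the upper end of the fracking pressure range (15,000 psi).

PSI_TO_PASCAL = 6894.757  # pascals per pound-force per square inch

hpht_psi = 800_000
fracking_max_psi = 15_000

hpht_gpa = hpht_psi * PSI_TO_PASCAL / 1e9   # pressure in gigapascals
ratio = hpht_psi / fracking_max_psi         # HPHT vs. fracking upper end

print(f"HPHT pressure: {hpht_gpa:.1f} GPa")
print(f"vs. fracking upper end: {ratio:.0f}x")
```

That 5.5 GPa figure is what the Coke-can-under-the-Eiffel-Tower image is conveying: a pressure far beyond anything in ordinary industrial processes.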

These machines are mimicking what Mother Nature does to grow her diamonds, but on a very short time scale. So it takes just minutes or weeks, rather than billions of years, to create a composite that will be processed into a finished synthetic diamond. These synthetic diamonds are mainly used to cut other materials, like in drills or mining equipment. They also shape things like the glass on smartphones. The other key technology used to make synthetic diamonds is chemical vapor deposition, or CVD. In CVD, diamond seeds are arranged on a surface that's placed into a chamber, which is filled with gases.

We're using very high-purity gases with our carbon source, energizing those with microwaves to create a plasma, an ionized gas, and then controlling the temperature and pressure of that environment to effectively get carbon atoms to rain down onto our growth surface and start building our diamond, atomic layer by atomic layer. CVD gives scientists more specific control than high-pressure, high-temperature reactors, but the growth rates are much slower. CVD is typically used to make synthetic diamonds for high-tech applications like optics, thermal management for high-power devices, or quantum technology. After they're grown, the synthetic diamonds have to be processed, machined to size, and tested. And then, finally, they're ready to be

used. The atomic structure of diamonds creates their clarity and sparkle, but it also makes them extremely strong, which is very useful for things like manufacturing. What is intriguing is that it's so simple. It's the ultimate covalently bonded structure, just a couple of carbon atoms. But Mother Nature, in her genius, imparts such superlative properties to that simple structure that it keeps us going for decades. Diamonds are transparent to visible, ultraviolet, and infrared light, which makes them useful for optical windows in lasers or welding equipment. They excel at dissipating heat, which can help keep electronics from overheating. And they can handle

extremely high voltages before breaking down, which is why some scientists are studying them as a material for semiconductors. Synthetic diamond is the supreme material on most properties. That makes it the ideal candidate for many, many applications, most of which we know, but some still undiscovered. But while synthetic diamonds have advanced other technologies, companies like Element Six are still trying to make synthetic diamonds a multi-billion-dollar industry on their own. The key is to get better at making lots of synthetic diamonds with very specific defects. Defects can dramatically change the appearance and properties of

diamonds. For instance, diamonds don't typically conduct electricity, because all of their carbon atoms are so tightly bound. But that changes when scientists introduce defects through a process called doping. So for me, the next 10 to 15 years are sort of about, okay, we've got perfect diamond. Now, how do we unlock the potential by deliberately engineering changes in it to create new potential? There are lots of changes scientists can make to synthetic diamonds to alter their properties. For example, this synthetic diamond is pink because of a specific kind of defect where a carbon atom is replaced by nitrogen and the

spot next to it in the molecule's lattice structure is left empty. These pink synthetic diamonds with a nitrogen-vacancy defect are ideally suited to work as sensors. That's because the defect has quantum properties associated with it, which are highly sensitive to small changes in the diamond's environment. It can detect very small variations in magnetic fields, in electric fields, in temperature. And it's that property of the nitrogen-vacancy defect in synthetic diamond that makes it so useful for quantum sensing, quantum computing, and other types of quantum applications. Element Six is also working with DARPA, part of the US Department of Defense, to explore the use of synthetic diamonds for systems

that could run in extreme environments. We're waiting for markets to emerge so we can scale. There are still very much some hard engineering and science questions to be answered. Another factor is cost. Some scientists think synthetic diamonds could one day be used in technology as ubiquitous as semiconductors. But whereas silicon is cheap and widely available, high-grade synthetic diamonds are not. I think, in many of the emerging applications, it is going to take, you know, 5, 10, 15 years for it to really come alive. It's as much about how it can be integrated, choosing the right partners, as it is about the material itself. But for now, there's still a lot of

research that needs to happen to make more of these applications possible. Today, diamond has still been in market niches, high-value, hard-won niches where diamond unlocks huge potential, but the market scaling hasn't been on the diamond. It's been more on what that potential enables. I think some of the things we've talked about today, semiconductors will be one, quantum devices, if they can really unlock some of the ability to measure things you couldn't measure before. These are scale markets. So, I think we're hopeful, and that's partly why we're here. This is something that can't be achieved with fireworks alone. It's made using pyro drones. Hundreds of tiny explosive-laden quadcopters piloted into an

exact formation in space, firing pyrotechnics in perfect choreography. But while these shows may look simple, getting to this point has taken extensive engineering, complex flight plans, and a fair amount of trial and error. Anytime you attach any sort of explosive device to anything, it creates an additional level of risk. What are the things that are going to cause a show to go wobbly? And the list is endless, you know. And running a drone show isn't without its challenges. Watch it. This is the tech behind pyro drones. To see exactly how a pyro drone show comes together, aerial display company Sky Elements invited The Wall Street Journal to a test of a new show where each drone can carry up to six

fireworks. The drones it's using are these $1,500 customized quadcopters that are made in South Korea. In addition to their usual RGB light for illuminations, each drone is kitted out with a pyrotechnic, a mount for it, and a firing module. We'll use what they call close-proximity fireworks. So, it's a very specific type of firework that's usually designed to be used indoors. The pyrotechnics are attached to each drone using these 3D-printed mounts. Each pyro then has an electronic match attached to it that's ignited from this firing module, launching the firework the moment it receives the signal to do

so. You'll load the script onto the remote. It'll say, you know, at 1 minute 10 seconds, module number one, cue number one, you're going to fire. Each of those firing cues is relayed to the firing module with millisecond accuracy, meaning designers can create a chasing cascade of launches like this. But for that timing to work, each drone needs to be in a precise location. So to map out the display, drone show providers choreograph their flights in 3D animation software. For safety, each drone needs to be kept at least 6 1/2 ft apart. But maintaining that distance as the show moves requires thinking in

three dimensions. You could build a 3D model and then attach drones to the surface of that model. If I have a horse, I'm going to have the horse's legs running. His tail is going to be moving. His head's going to be bobbing back and forth. We have to take all of that into consideration. And so if a horse's head moves, that drone is going to, you know, potentially collide with something. So we're going to offset it into 3D space where the audience is never going to see. Once the show is mapped out, certain drones are selected to be firework carriers. Where within the design do we want the pyro drones to be? If it's a giant 10 in the sky, do we want the 10 to be pyro drones and make a 10 out of the pyro

drones? The idea is to use the pyro to give the drone show moments of impact. The biggest weakness of a drone show is it doesn't have a finale. When you're doing a firework display, usually you save the last 10 to 15% of the product to be shot in the last 20 seconds. Well, with a drone show, I can't just send up more drones. So, having the ability to have pyrotechnics really takes it to the next level. But adding explosives affects the design of the show, as weight, thrust, and firing direction all have to be considered. We don't want to be dropping sparks on top of another drone. So, we have to really pay attention to what's directly below each of the drones as they fire their pyro.
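The cue script described here can be pictured as a timestamped list of module/cue firings relayed to each drone. A minimal sketch in Python, with invented field names rather than Sky Elements' actual script format:

```python
# Sketch of a pyro firing script: each cue names a drone's firing
# module and pyro slot, plus a show-clock timestamp in milliseconds.
from dataclasses import dataclass

@dataclass(frozen=True)
class FiringCue:
    time_ms: int  # show-clock time when the pyro ignites
    module: int   # which firing module receives the signal
    slot: int     # which of the (up to six) pyro slots to fire

def chasing_cascade(start_ms: int, step_ms: int, modules: range) -> list[FiringCue]:
    """Build a cascade where consecutive modules fire step_ms apart."""
    return [FiringCue(start_ms + i * step_ms, m, slot=1)
            for i, m in enumerate(modules)]

# "At 1 minute 10 seconds, module number one, cue number one, fire,"
# then nine more modules chasing 100 ms apart.
cues = chasing_cascade(start_ms=70_000, step_ms=100, modules=range(1, 11))
print(cues[0])
```

Spacing the timestamps by a fixed step is what produces the chasing-cascade effect mentioned above.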

To solve this, drones with pyro are often separated out in a 3D plane from other drones in the show. Once the animation is locked in, Brian's team then has to use an algorithm to figure out the safest and most efficient path for the drones to fly between their formations, ensuring they avoid crossing with each other or hitting potential obstacles. It's why you'll often see this twinkling cluster between formations. That's the algorithm at work. Finally, the show itself is all contained within this bounding box, a geofence that's designed to instantly shut down any drones that malfunction and attempt to leave the area. And then I can know if I'm lifting drones, I'm going to avoid this light pole here. You know, if I'm taking off in a football

field, there's a light pole right next to where I'm taking off. And that's important because crashing drones, especially ones that could soon be carrying up to six fireworks each, is a real concern. Three minutes in, spectators started shouting that drones were falling from the sky. See, unlike consumer drones, light show drones don't have any sensors or cameras. That means that this drone only knows its own flight path and has no information telling it that it's part of a show or surrounded by hundreds of other drones or near a crowd. What they do have is a very sophisticated GPS.
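The 6 1/2 ft spacing rule described earlier is, at its core, a pairwise distance check run over every frame of the choreography. A toy sketch (drone names and positions are invented for illustration):

```python
# Sketch of the minimum-separation rule: every pair of drones in a
# choreography frame must stay at least 6.5 ft apart.
from itertools import combinations
from math import dist

MIN_SEPARATION_FT = 6.5

def separation_violations(positions: dict[str, tuple[float, float, float]]):
    """Return the drone pairs closer than the minimum separation."""
    return [(a, b) for a, b in combinations(positions, 2)
            if dist(positions[a], positions[b]) < MIN_SEPARATION_FT]

frame = {"d1": (0.0, 0.0, 100.0),
         "d2": (7.0, 0.0, 100.0),   # 7 ft from d1: fine
         "d3": (7.0, 3.0, 100.0)}   # 3 ft from d2: too close
print(separation_violations(frame))  # [('d2', 'd3')]
```

A real planner would run this check (and obstacle checks) across every interpolated frame of the transition paths, not just the held formations.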

It's called an RTK GPS. That GPS system combines satellite data with that of a local base station to provide centimeter-accurate positioning, allowing the drones to fly in precise patterns. But even this can be fooled. If you jam the communications to the drones, they're not going to get that precise signal. They're going to be accurate to six feet. Well, when you have a bunch of drones that are only accurate to six feet, they're likely to hit each other or likely to have problems. And even with safety devices like geofences in place, mistakes can and do happen. In Orlando on December 21st, 2024, an incident occurred at a Sky Elements show where some of the drones they were using, not pyrotechnic ones, collided, fell from the sky, and raced towards the

audience, seriously injuring a minor, according to an NTSB report. The investigation said that some of the drones hadn't received their final flight paths and that the show's center point was not completely aligned. Sky Elements said that after some of the drones collided, their GPS systems malfunctioned, causing them to fly away. Since the incident, the company says it has improved its show operation software and its workflow protocols, including adding individual geofence bubbles to each drone. If that drone varies more than a couple of feet, we immediately turn the motors off. But the event has led to the industry as a whole, including Sky Elements, trying to develop unified safety standards. That was sort of the catalyzing event.
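A per-drone geofence "bubble" like the one described amounts to a simple deviation check against the scripted position. A minimal sketch; the 2 ft radius is an illustrative assumption, not Sky Elements' actual threshold:

```python
# Sketch of a per-drone geofence bubble: if a drone strays more than
# a couple of feet from its scripted position, its motors are cut.
from math import dist

BUBBLE_RADIUS_FT = 2.0  # illustrative threshold, not the real value

def motors_should_cut(actual: tuple[float, float, float],
                      scripted: tuple[float, float, float]) -> bool:
    """True if the drone has left its geofence bubble."""
    return dist(actual, scripted) > BUBBLE_RADIUS_FT

print(motors_should_cut((10.5, 10.0, 50.0), (10.0, 10.0, 50.0)))  # False
print(motors_should_cut((14.0, 10.0, 50.0), (10.0, 10.0, 50.0)))  # True
```

This sits alongside the show-wide bounding box: the outer geofence catches flyaways, while the per-drone bubble catches a drone drifting off-script inside the display.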

Niels Thorgesen is a drone show technology provider who has helped spearhead a group attempting to come up with safety standards for all drone shows. He says that despite the fact that there are established and well-run players like Sky Elements, there are also many younger companies. It's been a little bit wild west in the drone show industry. I mean, people buy 5,000 drones and get some software, and they can quickly get in over their heads. They have to learn by making all the same mistakes that everybody else has already made. Niels believes one standard should be that drones have multiple layers of redundancy for their navigation, GPS, and power systems. However, it's tricky. You can't have limitless

redundancy. Part of the challenge around creating universal standards is that pyro drone shows are very new. The FAA only gave clearance for them in 2024. And as such, many industry players have established different ways of doing things, like choosing whether to keep the pyro firing mechanism separate from its flight system, as Sky Elements does, or unifying it, as Verge Arrow does. I don't want to risk not being able to turn off the pyrotechnics, because if they're on the ground and they go off, that might cause a fire. Obviously, you know, we think the way we're doing things is the best, right? And

everybody's going to have their biases. The final step is to set up the physical show, a lengthy process given that hundreds of drones have to be charged, loaded with specific pyro, and laid out on a grid. Having been set back for several hours by heavy rain, the team began unpacking. We've got 600 pyro drones and a large team. I think we've got probably 15 people over there right now. We are attaching all the pyro to all the drones. We're making sure that the right device goes onto the right drone. We're probably maybe an hour or so out. Allocating over 1,000 individual pyro shots in a perfect sequence to 600 specific drones spread out at random across a grid isn't a quick process.

See, with RGB drones, it doesn't matter which drone goes where, as the lights can be assigned and switched. But with physical pyro, the show turns much more complicated, because I have to specify that a specific drone is in a specific spot at a specific time, because I have attached a specific piece of pyro to it. To figure out which drones to attach pyro to, Brian's team uses software to roll the show back to its starting grid. With up to six pyros on each drone, it's not easy. Okay, let me skip, fill this one. Fill, fill, skip, fill, skip, skip. Okay, next row. As night falls, the show launch window comes and goes.
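Rolling the show back to its starting grid effectively produces an assignment sheet: which pyro device must be mounted on the drone sitting in which launch slot. A toy sketch of that mapping, with invented identifiers:

```python
# Toy sketch of pyro allocation: given which drone plays which pyro
# role in the show, and where each drone sits on the launch grid,
# produce a per-slot loading sheet for the ground crew.
def allocation_sheet(pyro_roles: dict[str, str],
                     start_grid: dict[str, str]) -> dict[str, str]:
    """Map launch-grid slot -> pyro type to mount there."""
    return {start_grid[drone]: pyro for drone, pyro in pyro_roles.items()}

roles = {"drone_17": "gold_comet", "drone_42": "silver_strobe"}
grid = {"drone_17": "row3_col5", "drone_42": "row1_col2"}
print(allocation_sheet(roles, grid))
# {'row3_col5': 'gold_comet', 'row1_col2': 'silver_strobe'}
```

With RGB-only drones this step disappears entirely, since any drone can display any color; it's the physical pyro that pins each role to one airframe.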

We'd never tried to do anything this complex. When you do that, you break stuff. By midnight, the team is still hours away from being able to launch, and so decides to cancel. We missed our window to be able to do it. And so to be mindful of the community and everyone around us, um, we decided to pull the plug. The main issue: the sheer volume of pyro that needed to be sequentially loaded onto the drones. We learned what we had done was way too difficult to even figure out where to put things. We thought we had it in a nice orderly fashion, but it wasn't enough. Maybe we were a bit too ambitious, but that's what Sky Elements does. We're wanting to push the

envelope. So, while it may have taken around 2 years for Sky Elements to secure FAA approval for pyro drone shows, it could be a little while longer before the shows become more commonplace. It's not just you. CAPTCHA, the online test to tell whether you're a human or a robot, has been getting harder. This is one of the guys who invented it. Oh, yeah. I fail them all the time. I never know how much to say there's a traffic light if it's only like a tiny little corner of it. The problem is that CAPTCHA was designed to keep malicious bots out of certain websites. But every

time you've solved a test, you've actually made those bots smarter. As more and more data was fed into these perceptual systems, they simply got a lot better at solving the perceptual tasks. So what does it take to design a puzzle that can outsmart a bot but still be solved by any human? This is the tech behind CAPTCHA. So CAPTCHA was first used around the same time that Yahoo began giving out free email addresses. This was the year 2000. There were people who were writing programs to abuse different web services. And there was no easy way to stop them. And the idea was there should be a test that is really quick that humans can pass but computers could not.

Luis's test looked like this: a string of letters, slightly warped and distorted, that the end user had to input into a text field. He called it CAPTCHA, which stands for Completely Automated Public Turing test to tell Computers and Humans Apart. And he took advantage of the fact that computers at the time weren't very good at something called OCR, optical character recognition. The way it works is a simple game of pairs. The program scans the image and compares any shapes that emerge, like this letter P, against a database of different letters in different fonts. The second it finds a match, it has identified the letter. But if that P was warped,

overlapping another letter, or had marks through it, then the program will struggle to find anything in its database that matches and won't be able to identify the letter. Simple, but enough to keep out the malicious bots. The technology behind them was relatively primitive when it came to having access to what a letter looked like. But that advantage wouldn't last long. Here's Luis from 2010. Is there some way in which we can use those 10 seconds while you're typing a CAPTCHA for something that's useful? In 2007, reCAPTCHA was launched, an updated version that also helped to scan books. At the time, groups like the New York

Times and the Internet Archive were in the process of trying to digitize old literature. Since optical character recognition wasn't very accurate at the time, the digitization had errors. So, to try and help resolve those errors, reCAPTCHA began showing two words from a scanned document. One was a control word, a word the computer knew, but the other was a word it couldn't quite identify. In answering the CAPTCHA, half of your answer was being used to pass the test, but the other half was used to tell the computer what word it was looking at, helping to improve OCR. Not only are you authenticating yourself as a human, but in addition, you're helping to digitize books and newspapers.
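The two-word scheme can be sketched as follows: the known control word gates the test, while answers for the unknown word accumulate as label votes on the scanned text. The names and structure here are illustrative, not reCAPTCHA's actual implementation:

```python
# Sketch of the two-word reCAPTCHA idea: one word verifies the user,
# the other word's answer becomes a transcription vote.
unknown_word_votes: dict[str, dict[str, int]] = {}

def check_answer(control_word: str, user_control: str,
                 unknown_id: str, user_unknown: str) -> bool:
    """Pass/fail on the control word; record a vote for the other word."""
    if user_control.strip().lower() != control_word.lower():
        return False  # failed the human test; discard the vote
    word = user_unknown.strip().lower()
    votes = unknown_word_votes.setdefault(unknown_id, {})
    votes[word] = votes.get(word, 0) + 1
    return True

check_answer("upon", "upon", "scan_0042", "morning")
check_answer("upon", "upon", "scan_0042", "morning")
check_answer("upon", "uppn", "scan_0042", "mourning")  # fails, no vote
print(unknown_word_votes)  # {'scan_0042': {'morning': 2}}
```

Once enough independent solvers agree on the unknown word, the system can treat the consensus answer as the transcription.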

The technology was so efficient that Google acquired it in 2009. But a new problem emerged, because CAPTCHAs relied on machines being bad at reading, and we just taught them to be very good at it. Eventually, the bots were able to get CAPTCHAs right more frequently than humans. The test needed to evolve. So, in 2012, Google deployed this image-based CAPTCHA. There was a switch to go from the distorted characters to the harder problem of distinguishing certain things in images, where you have to pick all the ones that have a traffic light or a bicycle or whatever.

Interestingly, when it comes to boxes like this, where it's not exactly clear if this counts as a traffic light or not, there isn't actually a right or wrong answer. The way to get it right is just what the majority of the human population says. But for machines, this was a huge new challenge. After all, they'd only just learned how to read. See, with text-based CAPTCHAs, the machines only had to identify a limited range of variables: letters and numbers on a black and white background. But now with image CAPTCHAs, they had to be able to identify anything and spot an object in a very busy background. However, computer vision was just around the corner. Whereas you and I would process the stream of perceptual input,

the computer vision system is basically taking the pixels and processing strings of pixels, vectorized, we would say. And as it processes these images, it's picking up patterns in the images. It's picking up patterns in the pixels. Now for any computer vision system to perform well, it needs a lot of labeled data. It's being trained to identify cars by having many images of cars. The problem was that image-based CAPTCHA was essentially a data labeling task. By solving it, you were once again generating data that could help a bot defeat it.
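The "whatever the majority of humans says" rule for ambiguous tiles, mentioned earlier, amounts to simple vote counting over solvers' answers. A minimal sketch:

```python
# Sketch of majority labeling for an ambiguous CAPTCHA tile: the
# accepted answer is whatever most human solvers clicked.
from collections import Counter

def accepted_label(human_answers: list[bool]) -> bool:
    """Return the most common answer (did this tile contain the object?)."""
    return Counter(human_answers).most_common(1)[0][0]

# 7 of 10 people said this sliver of traffic light counts
print(accepted_label([True] * 7 + [False] * 3))  # True
```

This is also why solving image CAPTCHAs doubles as data labeling: each majority-agreed tile becomes one more labeled training example.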

The systems just kept getting better and better because they kept getting bigger and bigger. A new approach was needed. So in 2014, Google launched the No CAPTCHA reCAPTCHA: one simple box. But this version of CAPTCHA wasn't looking at whether you clicked the box. It was looking at how you clicked it and how you interacted with the rest of the internet. See, if you write code to make an object, like a cursor, move to a certain point, the simplest version will make it move in a straight line at a constant speed, like a robot. But humans naturally aren't that accurate. We overshoot. We don't move in a perfectly straight line. And that is what this version of CAPTCHA was looking for: human flaws. It was also

monitoring things like your internet history and your typing speed. If the history was just a string of repeated attacks on the same website, you're probably a bot. But if you stopped halfway through the day to browse for shoes or look at cat videos, you might just be human. By 2018, Google had done away with the tickbox entirely and launched reCAPTCHA v3, based solely on that hidden data. But even flawed human characteristics are something that a smart algorithm can eventually learn to mimic. So, are the bots always destined to win? Well, perhaps. See, despite other companies coming up with new and inventive tests, there's a quirk at the very heart of CAPTCHA that means it's likely the bots will always be able to

win eventually. And to understand why, you need to go back to before the internet was invented, to the man CAPTCHA is named after: Alan Turing. He's considered one of the founding fathers of AI after he penned this paper, Computing Machinery and Intelligence. In it, he describes a method to test whether you're talking to a human or a machine. You had a questioner behind the screen trying to differentiate between a human and a computing machine giving answers. This now has famously become known as the Turing test.
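The straight-line-versus-wandering cursor signal described above can be sketched as a path-straightness ratio: distance actually travelled divided by the straight-line distance. The threshold here is an illustrative assumption, not Google's actual heuristic:

```python
# Sketch of one behavioral signal: a cursor path whose travelled length
# nearly equals the straight-line distance looks robotic; human paths
# wander and overshoot. The 1.02 threshold is purely illustrative.
from math import dist

def looks_robotic(path: list[tuple[float, float]],
                  threshold: float = 1.02) -> bool:
    """Compare distance travelled along the path to the direct distance."""
    travelled = sum(dist(a, b) for a, b in zip(path, path[1:]))
    direct = dist(path[0], path[-1])
    return travelled / direct < threshold

bot_path = [(0, 0), (50, 25), (100, 50)]               # perfectly straight
human_path = [(0, 0), (60, 20), (105, 55), (100, 50)]  # wanders, overshoots
print(looks_robotic(bot_path))    # True
print(looks_robotic(human_path))  # False
```

A production system would combine many such signals (timing jitter, acceleration profiles, browsing history) rather than relying on any one measure, which is exactly why a sufficiently trained bot can eventually mimic them all.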

The problem with CAPTCHA is that the questioner isn't human. It's a computer, and therefore any information that's input into the computer has the potential to be used by AI models to train CAPTCHA-defeating bots. Even Luis's original paper on CAPTCHAs says that the technology will act as a security measure for websites and advance the field of AI. You have the research world almost continually catching up and surpassing the CAPTCHA world. But the solution may be to take the test out of the computer and into the real world. We all have mobile phones, and they have so many different sensors within them: being able to tilt the phone on instruction, or being able to take a few steps in one direction or another

using the sensors in the phone. But in order for us to truly prove that we're human online, we may have to find an entirely new approach. I think the question now is not are we doomed. It's how do we seize control? Again, I would not be designing a CAPTCHA today. I don't... I think that's a losing battle. This is too hard. Laser projectors, new large-format film cameras, custom processing tools. This is the side of IMAX you don't see in theaters. It involves a huge amount of technology. And the film can cost $2,000 per minute to produce.

All to take movies from looking like this to this. It's a very immersive uh presentation format. It really creates a suspension of disbelief. This is the tech behind IMAX. This is the place where we do all the assembly of our critical components in the IMAX laser projector. When IMAX went digital to accommodate Hollywood movies, it had to make a lot of changes. IMAX's original projectors used 15,000 W Xenon bulbs to illuminate one frame at a time. Since each frame was so large, the resulting image could be two. Digital projectors often use similar bulbs, but the resulting images aren't as large or as bright. That's

because standard digital projectors have to split the light using a prism. Because of this and some scattering that happens in the prisms, the final images have lower contrast. So, when IMAX wanted to adapt to digital, it had to design something totally new. Instead of using one bulb and splitting the light, it built a new projector with no prisms and three different color lasers. This is the alignment process for making sure that mirror is placed correctly so it could create that full color image in the projector to high accuracy. And this is down to microns of precision in terms of placement of these modulators.

This precision is important to meeting the same standard with digital video that IMAX established with film. Hundreds of miles above the Earth, IMAX film cameras have gone everywhere from the International Space Station, 250 miles above Earth, to the Titanic, 12,000 ft underwater. People are sort of awestruck by the scale of everything, the size of it. If you start with that big scale, that's the best way to end with that big scale. IMAX film cameras shoot on 65mm film with 15 perforations per frame, more than 10 times the area of standard 35mm film. And while many top-of-the-line digital cameras shoot in 4K, 65mm film has an approximate resolution of 18K with

richer texture. There's something about film emulsion, just making color from dyes and getting texture from film grain, that's very unique, and it just gives a different aesthetic, a different sense of color, a different sense of texture from digital media that at this scale, at the size of the IMAX frames, it just looks very unique and almost irreproducible by any other means. But while this film creates beautiful images, it was never well suited for Hollywood movies. First, there's the issue of motors like this, which is used to pull the giant film through the camera. It's so loud, it's hard to record clean dialogue, and it can be distracting to actors. The cameras are also heavy, around 55 lb when you add the lens, viewfinder, and

1,000 ft of film. And they require specific training to operate them, which is why most movies only use IMAX cameras to film certain dramatic sequences. It's not like a digital system where you can just turn the camera on, roll it, and then just run it all day. 1,000 ft of film in the IMAX camera only lasts for 3 minutes. IMAX only has nine film cameras, and they're all decades old. Our current analog fleet does go back to the late '90s. It's not as easy to find all the suppliers and all the specialists that maybe there were 30 years ago. But based on the success of movies shot

at least partially on film, like The Dark Knight and Oppenheimer, IMAX is in the process of making an entirely new film camera. It hopes to release it to some filmmakers within roughly a year. I know it does sound a little strange out there to modernize a film camera in this day and age, but we wanted to bring a large-format film camera to a position where we can actually compete with the new modern digital cameras. It's made mostly of composite materials, carbon fiber, honeycomb sandwich panels, the same stuff that your Formula 1 car is made out of, and titanium, the same stuff that jet fighters are actually made out of. And besides the body, it's getting new

optics, a 5-inch full-color display for user interaction, and wireless connectivity, all while the camera's actually taking 15-perf IMAX analog film in 18K. That's super exciting. In addition to custom cameras and projectors, IMAX works with filmmakers and studios to enhance the visuals and sound of the movies in its theaters. That process is called remastering, and it ensures films don't look too grainy on its larger screens. This is what IMAX calls one of its iconic theaters. Its stadium design is unusually steep to give every seat a clear view. Up to 12 speakers localize sound so everyone hears movies at the same volume. The

screen has an aspect ratio up to 2/3 taller than regular theaters. It's curved to look the same from every angle, and it's made with perforated vinyl and coated in a proprietary silver paint, so it reflects light with uniform brightness. Many IMAX theaters are smaller than this, so they can fit in standard movie complexes. But all of them, the IMAX theaters that play film and the smaller ones that play digital movies, are monitored and calibrated every day from operation centers like this. All of the theaters are connected through the internet through a live line, and that machine language is telling us things like: I'm too hot, I'm too cold, I'm off-screen, I'm on-screen, I have passed calibration, I haven't passed

calibration. So the technicians in the room here can log into the systems and take control of them. So 95% of the time, we fix it right here. The company relies on these operation centers to make the IMAX experience consistent across theaters. Why immersive is so important is it enables viewers to kind of feel like they're right there in the content. And it's what we often call in the business a suspension of disbelief, where you get caught up in the content that's being played in front of you and you feel like you're a part of it.
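The film-format claim earlier in this section, that a 15-perf 65mm frame has more than 10 times the area of standard 35mm film, checks out with quick arithmetic. The dimensions below are commonly cited approximations of the camera apertures, not official IMAX figures:

```python
# Quick arithmetic behind the "more than 10x the area" comparison,
# using approximate camera-aperture dimensions in millimeters.
imax_15perf_mm2 = 70.41 * 52.63  # 15-perf 65mm frame (approx.)
std_35mm_mm2 = 21.95 * 16.00     # standard 4-perf 35mm frame (approx.)

ratio = imax_15perf_mm2 / std_35mm_mm2
print(f"{ratio:.1f}x")  # roughly 10.6x
```

The larger negative area is also why the quoted resolution (around 18K) so far exceeds the 4K of typical digital cinema cameras.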
