Spectricity was founded in 2017 to create compact, low-cost sensors for smartphones. It’s been working quietly to develop that hardware ever since and revealed its first product, the S1 multispectral sensor, earlier this month. It may be a few years before the company’s new tech finds its way into phones, but when it does, you could see the benefit in more color-accurate photos.
The S1 multispectral sensor is technically a camera, but it’s not one that will shoot the photos you share on social media and send to friends. CEO Vincent Mouret describes it as akin to the depth and 3D sensors you find on some phones. It would have a place of honor on a smartphone’s camera bump, but you won’t find a “multispectral” button in your camera app. Instead, the S1 would capture more information about lighting, allowing your phone to take images with accurate colors and better white balance.
Spectricity has some examples of what’s possible with its technology (below). The S1 multispectral sensor captures 16 color channels, whereas the standard camera sensors in your phone today capture only three: red, green, and blue. The S1 covers that range but also reaches deeper into the infrared than typical cameras. While the resolution is just VGA (640 x 480), it doesn’t need a lot of pixels to do its job. It’s all about collecting more data about a scene based on the spectral signature of light. The company says this helps the “color-blind” main camera produce natural colors and unbiased skin tones.
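To see why those extra channels matter, here is a toy illustration in Python. This is not Spectricity’s actual algorithm, and every channel value and threshold below is hypothetical; the point is simply that an IR-heavy source like tungsten can look identical to daylight on an RGB sensor, while channels reaching into the infrared tell the two apart instantly.

```python
# A toy illustration, not Spectricity's actual algorithm: all channel values
# and the threshold below are hypothetical. An IR-heavy source like tungsten
# can match daylight on an RGB sensor, but IR-reaching channels separate them.

def estimate_illuminant(channels):
    """Classify a light source from a 16-channel spectral reading."""
    visible = sum(channels[:12])    # channels covering the visible band
    near_ir = sum(channels[12:])    # channels reaching into the infrared
    # Incandescent light carries far more near-IR energy than daylight.
    return "incandescent" if near_ir / visible > 0.5 else "daylight"

# Two hypothetical scenes with identical visible response but different IR tails.
daylight_scene = [1.0] * 12 + [0.20, 0.15, 0.10, 0.05]
tungsten_scene = [1.0] * 12 + [2.00, 2.20, 2.40, 2.60]

print(estimate_illuminant(daylight_scene))   # daylight
print(estimate_illuminant(tungsten_scene))   # incandescent
```

With the illuminant identified, the main camera’s white-balance gains can be set accordingly, which is the kind of correction an RGB-only sensor has to guess at.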
Computational mobile photography has changed how we take photos on the go — it’s no longer about having the most pixels possible, but that’s not stopping Samsung. The best smartphone cameras, like those from Google and Apple, lean heavily on AI algorithms to render accurate, consistent colors and sharpen details. Google even developed a whole new system to produce more equitable skin tones, but maybe the S1 will make that easier and more universally available. Generally, the more data your camera system can collect, the better those algorithms match what your eye sees. Spectricity is not changing the photography game tomorrow, though.
Spectricity says the S1 can be manufactured at scale, but it doesn’t expect to have the hardware available for smartphone OEMs until 2024, with higher volumes shipping in 2025. Then, we’ll have to see which companies embrace the technology. Plenty of clever smartphone technologies have failed to catch on, so there’s no guarantee you’ll ever use a phone with a Spectricity color sensor. If you do, though, it’ll probably be in 2026 or later, which is when Mouret believes the S1 could realistically appear in high-end smartphones.
Even though Cyberpunk 2077 launched more than two years ago, it’s still able to bring everything but the beefiest gaming rigs to their knees with ray tracing enabled. Now help is on the way for Nvidia 40-series GPU owners in the form of DLSS 3 and frame generation. Although Nvidia first touted this technology at the RTX 4090’s launch in September, it’s only now rolling out. Nvidia will soon add an even more punishing ray tracing mode to Cyberpunk called RT Overdrive.
CD Projekt RED announced the update via tweet; it’s available as a 6GB patch for everyone who owns the game, though DLSS 3 itself only works on an RTX 4090, 4080, or 4070 Ti. The patch adds DLSS 3 support to Cyberpunk 2077, including both Super Resolution (the default) and Frame Generation modes, plus Nvidia Reflex to combat the latency introduced by frame generation. Nvidia promises substantial performance gains with DLSS 3 in Cyberpunk if you have a 40-series GPU. At 4K with maxed-out settings, you should be able to hit 138fps with an RTX 4090, while those with an RTX 4080 will be sitting pretty at 102fps. If you have an RTX 4070 Ti, you’ll have to play at 1440p, but you should be able to achieve 136fps there, according to figures from Nvidia relayed by PCGamer.
RT Overdrive mode (above) is an enhanced version of ray tracing that will likely only be feasible with frame generation enabled. It includes many advancements over the current ray tracing implementation to make lighting more realistic. According to Nvidia, RT Overdrive supports RTX Direct Illumination (RTXDI), which adds millions of light sources to a scene. RT Overdrive will also allow light rays to bounce multiple times instead of just once, as they do in the current implementation. This applies to global illumination, reflections, and self-reflections. In addition, it will render ray-traced reflections at full resolution for even more clarity.
Other new features include Shader Execution Reordering, which more efficiently parallelizes the threads tracing rays without losing image quality. New opacity micromaps boost performance by encoding the opacity of objects directly onto geometry, freeing the GPU from evaluating opacity while calculating ray traversal. Finally, the update adds a real-time denoiser that reduces noise in the final image without impacting performance.
The arrival of DLSS 3 is good news for folks who dropped a chunk of change on a new GPU and have been waiting to see how it fares in Cyberpunk. The game is essentially the new Crysis: one of the few titles that can truly stress a modern GPU. It’s remarkable that it still holds that distinction more than two years after release, which shows just how punishing it was for older GPUs at launch. Though it certainly had a rocky start, to put it lightly, CD Projekt RED has been whittling away at the game’s problems ever since, even apologizing for the buggy launch and shipping a steady stream of huge updates.
For now, DLSS 3 is only available on RTX 40-series GPUs. Nvidia says it’s theoretically possible to enable it on older architectures such as Ampere or Turing, but it would not run well. That didn’t stop a Redditor from enabling it on their RTX 2070 recently, however. Unless you’re the type who likes to tweak config files, it will remain a 40-series exclusive, and even then, it’s probably not going to run very well. The good news for gamers is that RTX 4080 and 4070 Ti GPUs are readily available; RTX 4090s are out there too, but you’ll have to pay a premium for them.
The success of OnePlus may not be entirely thanks to Carl Pei, but the company did change dramatically (and not for the better) when he left. Pei went on to found Nothing Technology, but the firm’s first smartphone, the aptly named Nothing Phone 1, never fully launched in the US. Nothing’s next release, however, will come to the US later in 2023, and you’ll never guess what it’s called. Yes, the Nothing Phone 2.
In an interview with Inverse, Pei confirmed the Nothing Phone 2’s existence for the first time. That was something of a foregone conclusion, but now we know that it’s launching this year, and you’ll be able to buy it in the US. Technically, you could get the Nothing Phone 1 later in 2022 as part of the company’s beta program, but the device lacked many of the cellular bands used on US carriers.
The Nothing Phone 2 will be designed with the US market in mind. “We decided to make the US our No. 1 priority in terms of markets,” Pei said. Apparently, the decision to eschew the US last time around wasn’t personal. Pei says his “hands were tied,” but now the company has more resources and is able to do the legwork to make a phone for the US. Unlike most markets in the world, US carriers maintain tight control over their networks, requiring OEMs to undergo long certification processes to get access to the latest 5G connectivity.
Pei further confirmed that the company’s second Android phone will be more powerful than its first, which turned heads with its distinctive transparent design. However, it was a mid-range phone starting at just €469. The next phone will be more capable and probably more expensive, but Pei cautions it will not be a “flagship” phone, which usually suggests top-of-the-line specs and a price tag approaching (or exceeding) $1,000. The Nothing Phone 2 might be somewhere in the range currently occupied by the Google Pixel 7 and OnePlus’ latest devices.
We don’t know any specifics of the pricing or launch date — just that you’ll be able to purchase the phone in the US this year. Presumably, that means the phone will be available unlocked, but it’s not impossible that Nothing could partner with one or more carriers. OnePlus’ presence in the US exploded after it linked up with T-Mobile to release its first carrier-branded phone in 2018. Perhaps Pei will attempt to replicate that success with Nothing’s first US launch.
Physicists in China report that they have built a tractor beam capable of moving objects on the macroscale. It’s counter-intuitive; like laser cooling, the system does the opposite of what you might expect when you point a laser at it. Instead of pushing, the laser pulls.
In the latest issue of Optics Express, the group reports that when using a 90mW laser, their tractor beam in a box can produce about a micronewton of pulling force. The setup is deceptively simple. The scientists vapor-coated a sliver of glass with reflective gold, and then stuck a flake of cross-linked graphene to the other side. Then, they pointed blue, cyan, and green lasers at the flake of graphene. Lo and behold, it moved toward the laser emitter.
Graphene’s 3D molecular structure (Credit: iLexx/Getty Images)
The setup is built on established technology. Optical tweezers and solar sails also use light to move things around. However, optical tweezers usually confine themselves to objects the size of single molecules. Not so for this experiment, says the team.
“In previous studies, the light-pulling force was too small to pull a macroscopical object,” said research team member Lei Wang from China’s Qingdao University of Science and Technology. “With our new approach, the light pulling force has a much larger amplitude. In fact, it is more than three orders of magnitude larger than the light pressure used to drive a solar sail, which uses the momentum of photons to exert a small pushing force.”
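Wang’s comparison is easy to sanity-check. Radiation pressure from a 90mW beam on a perfect absorber is F = P / c, and the roughly one micronewton of pull the team measured dwarfs it, as this quick calculation shows:

```python
# Sanity check on the "three orders of magnitude" comparison: radiation
# pressure from a 90 mW beam on a perfect absorber is F = P / c, which we
# can stack against the roughly 1 micronewton pull the team measured.

C = 299_792_458              # speed of light, m/s

laser_power_w = 0.090        # 90 mW laser, from the paper
measured_pull_n = 1e-6       # ~1 micronewton of pulling force

photon_force_n = laser_power_w / C    # ~0.3 nN of plain photon pressure
ratio = measured_pull_n / photon_force_n

print(f"photon pressure: {photon_force_n:.1e} N")
print(f"measured pull is ~{ratio:,.0f} times larger")
```

The ratio comes out above 3,000, which squares with the claim of “more than three orders of magnitude” over solar-sail-style photon pressure.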
Spooky Action at a Distance
The device works partly by way of graphene’s unique properties. Graphene is optically absorptive, meaning it retains a portion of the energy of the photons that hit it. It’s also a semiconductor and a remarkably effective heat conductor. So effective, the paper concludes, that when the scientists pointed the laser at the graphene sandwich, the graphene carried that energy right through to the far side of the piece. Thermodynamics says that hot things emit more energy than cold things, all else being equal. In the lab environment, that differential heating was enough to make the object move.
(Video credit: Wang et al., 2023)
The researchers did their work in a tightly controlled low-pressure environment. This cut down on optical scattering that might have confounded the experiment. However, it also agrees with prior work suggesting that tractor beams might be most useful in space. When you’re moving things around with lasers, it turns out the laser is most effective when it isn’t also moving around random bits of stuff.
“Our technique provides a non-contact and long-distance pulling approach, which may be useful for various scientific experiments,” said Wang. “The rarefied gas environment we used to demonstrate the technique is similar to what is found on Mars. Therefore, it might have the potential for one day manipulating vehicles or aircraft on Mars.”
L. Wang, S. Wang, Q. Zhao, X. Wang, “Macroscopic laser pulling based on Knudsen force in rarefied gas,” Opt. Express, 31, 2, 2665-2674 (2023).
A few weeks ago at CES, we were surprised by the paucity of PCIe 5.0 drives at the show. The technology has been talked about for at least a year, it’s now supported by both AMD and Intel, and we, the speed-obsessed PC-building crowd, have our wallets out. Sadly, we were left to conclude that PCIe Gen 5 SSDs needed a few more months in the oven. That still might be true, but at least one drive is finally for sale to the gaming public. It’s in Japan, so not ideal, but it’s a start.
The drive is being sold by a company named CFD Gaming, which is a new one to us. It’s a 2TB model that offers up to 10GB/s sequential read speeds. For sequential writes, it can deliver up to 9.5GB/s. That’s in contrast to around 7GB/s offered by today’s PCIe 4.0 drives. That’s a decent amount of uplift from current drives, but on the low end for a Gen 5 SSD overall. Some of the SSDs at CES boasted up to 14GB/s in sequential reads.
The PCIe 5.0 x4 interface itself tops out at around 16GB/s, though no drive will ever hit that in the real world due to overhead. Random read and write speeds are listed as 1.5 million and 1.25 million IOPS, respectively. According to HotHardware, the drive will also be offered in 1TB and 4TB capacities. It was originally supposed to launch in November but was delayed for unknown reasons.
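That roughly 16GB/s ceiling falls straight out of the PCIe 5.0 link math: 32 GT/s per lane across four lanes, trimmed by 128b/130b line-code overhead. A quick sketch:

```python
# Where the roughly 16GB/s ceiling for a PCIe 5.0 x4 SSD comes from:
# 32 GT/s per lane, four lanes, minus 128b/130b line-code overhead.

RAW_GT_PER_LANE = 32      # PCIe 5.0 transfer rate per lane
LANES = 4                 # M.2 NVMe drives use an x4 link
ENCODING = 128 / 130      # 128b/130b encoding efficiency

usable_gbit = RAW_GT_PER_LANE * LANES * ENCODING   # gigabits per second
usable_gbyte = usable_gbit / 8                     # gigabytes per second

print(f"{usable_gbyte:.2f} GB/s")   # ~15.75 GB/s before protocol overhead
```

Protocol and controller overhead shave that down further, which is why even the fastest drives shown at CES quoted 14GB/s rather than the full line rate.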
We sure hope that tiny fan isn’t as loud as it looks like it would be. (Image: PC Watch)
The drive includes a notably huge active cooling solution. This was anticipated for these next-gen drives, as they were rumored to get toasty. The cooler is reportedly permanently attached too, and the vendor warns against installing it in cases with poor airflow. Sorry ITX builders, you might have to stick with PCIe 4.0 drives for a while.
The drive uses a Phison E26 controller and an x4 PCIe 5.0 connection. HotHardware previously tested the drive, and it showed mixed results. Though it was almost able to hit its listed 10GB/s speeds in sequential tests, it was mid-pack in random access benchmarks. That matters less for consumers, though, as most of us aren’t running random access tasks at high queue depths. Overall, the average user most likely won’t feel any difference between it and a PCIe 4.0 drive.
Pricing is also a concern. The drive is listed as costing 49,980 Japanese Yen. That works out to $383 or so, which currently is about twice the cost of a 2TB PCIe 4.0 drive. Given the steep increase in motherboard costs as well as DDR5 prices, we can see many users passing on these drives at first. Maybe in a year or two when they’re priced competitively with older drives, they’ll be a more compelling purchase.
NASA’s Mars Reconnaissance Orbiter (MRO) has been faithfully capturing the Red Planet’s surface and atmosphere for nearly 17 years. Normally the data MRO sends to scientists on Earth is relatively mundane, relating to things like variations in ozone or climate. But MRO recently captured something that’ll make just about anyone smile: a piece of Mars’ surface that looks like a teddy bear.
On Dec. 12, 2022, MRO snapped a photo of Mars’ surface from a height of about 156 miles (251 kilometers). The resulting image depicts what looks like a teddy bear’s face, complete with two eyes, a snout, a dark nose, and a smiling mouth; all that’s missing are two round ears. The University of Arizona, which helped build MRO’s camera, shared the photo Wednesday on its High Resolution Imaging Science Experiment (HiRISE) blog.
“There’s a hill with a V-shaped collapse structure (the nose), two craters (the eyes), and a circular fracture pattern (the head),” the post reads. “The circular fracture pattern might be due to the settling of a deposit over a buried impact crater. Maybe the nose is a volcanic or mud vent and the deposit could be lava or mud flows?”
This isn’t the first time humans have found humorous or downright uncanny imagery in space. Scientists and casual internet users have been seeing faces and objects on planetary and lunar surfaces for years (though some findings are admittedly more of a stretch than others). HiRISE found a Muppet on Mars back in 2018; two years later, the European Space Agency spotted an angel and a heart on the same planet. China’s lunar rover located what many thought to be a mysterious moon cube or hut in late 2021, but around the start of the new year, Chang’e 4 confirmed the object was just a rock.
Mars’ teddy bear may also just be mud and rock, but it symbolizes the charming human tendency to find familiar shapes in just about anything. This phenomenon is called pareidolia, and it’s what allows humans to see animals in the clouds, faces in stucco, or Harambe the gorilla (remember him?) in Cheetos. With MRO’s help, we’ll hopefully find a number of recognizable figures on Mars’ surface in the future.
The rumor mill is spinning up again with some truly behemoth GPUs on the horizon. Nvidia is currently hoisting the fps champion’s belt over its virtual head with the RTX 4090. Now the company may take things even further with an RTX 4090 Ti and even a Titan-class GPU. It was previously rumored Nvidia had ditched the Titan version due to it being just, well, too much. Now photos have appeared showing what it could look like, and Nvidia was right; it’s a bit ridiculous. The 4090 Ti looks somewhat sane in comparison. It will allegedly have Titan-class specs, but with only 24GB of fast G6X memory.
News of Nvidia’s possible plans for the RTX 4090 Ti comes from notorious leaker kopite7kimi, while the photos of the Titan card come from the appropriately named MEGAsizedGPU. Previous leaks indicated Nvidia had come quite far in developing this Ada Lovelace flagship prior to the RTX 4090’s launch; it seems the company wanted it in its back pocket in case AMD’s RDNA3 GPUs took it by surprise. Way back in May of last year, we saw part of the cooler, which bore an RTX 4090 Ti engraving.
(Image: Chiphell forums)
The new flagship Ada card will offer higher clocks, more CUDA cores and L2 cache, and faster memory than the current RTX 4090. The CUDA core count will go from 16,384 to 18,176, an 11% increase over the existing GPU. Memory capacity will remain at 24GB, but Nvidia will swap out the existing 21Gb/s chips in favor of faster 24Gb/s Micron dies. L2 cache will get a modest bump from 72MB to 90MB, and boost clocks will probably approach 3GHz. The leaker says these improvements nudge board power up to 600W, though it’s unclear whether that’s the absolute maximum when overclocked; it seems hard to believe these additions alone would add 150W to the card’s power consumption.
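The memory swap is easy to quantify. Assuming the RTX 4090’s known 384-bit bus carries over (the leak doesn’t state the Ti’s bus width), the per-pin rates translate to total bandwidth like so:

```python
# The bandwidth math behind the rumored memory swap. The 384-bit bus is the
# RTX 4090's known width; we assume the Ti keeps it (the leak doesn't say).

BUS_WIDTH_BITS = 384

def bandwidth_gbps(per_pin_gbit):
    """Total memory bandwidth in GB/s from the per-pin rate in Gb/s."""
    return per_pin_gbit * BUS_WIDTH_BITS / 8

current = bandwidth_gbps(21)   # today's 21Gb/s G6X
rumored = bandwidth_gbps(24)   # the rumored 24Gb/s Micron dies

print(current, rumored)   # 1008.0 1152.0 (GB/s)
```

That would take the card from 1,008GB/s to 1,152GB/s of memory bandwidth, a roughly 14% gain to feed the extra CUDA cores.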
As for the Titan card, this GPU has been discussed previously but was rumored to be cancelled, reportedly because it simply consumed too much power to be practical. However, new photos have emerged showing an unusual cooling design, which lends credence to the notion that Nvidia’s development is far along. Whether the company releases it is still an open question. The photos from MEGAsizedGPU show a bracket with vertically stacked output connectors, something we’ve never seen before, which indicates a novel vertical PCB.
The cooler for the Titan-class card is so big it has to go next to the PCB instead of under it. (Image: @MEGAsizedGPU)
As far as specs go, nothing has changed since the last leak. It’s still being reported as an outlandish quad-slot, 800W GPU, which is notable since the new 16-pin ATX 3.0 power cable can only provide 600W per cable. We suppose Nvidia could put two of them on the GPU, as shown in a previous leak. The card will sport 18,176 CUDA cores and 48GB of 24Gb/s G6X memory. It was previously referred to as “the beast,” which is a fitting title.
It’s unclear if Nvidia will release either of these GPUs as it really has no reason to given what AMD is offering. However, we could see Nvidia launching the Titan card if only to satisfy that audience. The previous RTX Titan card is from the Turing era, so it’s quite long in the tooth. Nvidia also might be saving these huge dies for its even-pricier data center cards.
The Internet Archive lives up to its name, creating a backup of information and content that would otherwise be lost to history as technology barrels forward. The archive hosts web page snapshots, Android APKs, and a new project from the MAME emulator team: The Calculator Drawer. This collection of calculator emulators runs the gamut from the kid-friendly Electronic Number Muncher to the venerable TI-83 Plus. And they all work just like the originals.
You probably know “MAME” mostly for its arcade machine emulation — it originally stood for “Multiple Arcade Machine Emulator” but was merged with a general emulation project to cover a wide array of vintage devices. Hence, the new MAME Calculator Drawer on the Internet Archive.
The drawer includes both graphing calculators, as well as simple calculators, all emulated in MAME. Most of them even have an additional layer on top to represent the original hardware. That means you can click on the keypads and controls as they originally existed, right down to the “on” button on many of the more advanced machines. Like the real deal, they won’t do anything until you turn on the power. Without the MAME Artwork system, the emulated part of many of these devices would simply be a string of LCD block numbers. MAME can create vector graphic representations of the hardware, but most of the calculators use real photos of the device for the artwork layer.
The TI-92 cost $180 when it came out in 1996. Now you can emulate it for free in your browser.
It’s a lot more fun to see a representation of the calculator itself rather than using your keyboard to input numbers. That said, you can use your keyboard for input if you prefer. To use any of the emulated calculators, just open the page and click the Start button. Each page includes some basic information about the devices, including the release date, price, and hardware specs. There’s also a full-screen option if you want the technology of yesteryear to fill your screen. That’s more useful on mobile — most of the calculators are roughly smartphone-shaped.
As you can imagine, there are a lot of Texas Instruments graphing calculators, including some of the more advanced versions you rarely, if ever, saw in the wild. The TI Voyage 200 and TI-92 (above) are more like tiny computers than calculators, and they’re much more complicated to operate as a result. If you need a little refresher on how to use these retro tabulators, the Internet Archive also has a collection of manuals linked on the Calculator Drawer page.
The ocean is teeming with plastic. As the world continues to rely on plastic for everything from single-use packaging to medical devices, an estimated 10 million tons of the stuff ends up in our oceans every year. Such severe pollution presents obvious risks for marine life and even life on land. Mother Nature herself appears to be working to mitigate these risks.
A study published this month by scientists in the Netherlands suggests that the sun might break down plastics floating on the ocean’s surface. A team of marine specialists from the Royal Netherlands Institute for Sea Research (NIOZ) has found that in simulated ocean settings, ultraviolet (UV) light—the kind emitted by the sun—gradually degrades plastics, helping to reduce seawater pollution and potentially resolving what scientists call the “Missing Plastic Paradox.”
Environmentalists, marine biologists, and other researchers have been studying ocean-polluting plastics for some time now, but the Missing Plastic Paradox has been a constant head-scratcher. While scientists know approximately how much plastic enters the ocean on a regular basis, they can’t actually locate some of it—it’s just gone. This naturally raises the question: Where does all that plastic end up?
In the NIOZ lab, researchers simulated ocean pollution and the sun’s UV rays by mixing up a “plastic soup” consisting of seawater and common plastics. These plastics included the most common polluters found on the ocean’s surface: polyethylene-terephthalate (PET), polystyrene (PS), polyethylene (PE), and polypropylene (PP). Each piece of plastic was barely larger than a microplastic, imitating the shape and size of pollutants capable of floating rather than sinking.
A 460-watt halogen lamp beamed UV rays similar to solar UV-A/B light at the plastic soup while a shaker table imitated the motion of ocean waves. Over the course of several days, the researchers monitored the plastic particles’ physical integrity. They found that UV rays broke each plastic particle into smaller pieces, eventually creating nanoplastics (plastics that are so small, they’re invisible to the naked eye) and molecules like the ones found in crude oil. These can chemically dissolve or be broken down further by bacteria.
Based on the rate of degradation measured in their experiment, the NIOZ researchers estimate that the sun breaks down common surface-level plastic pollutants by 1.7% to 2.3% per year. At this rate, anywhere from 7% to 22% of all the plastic ever released into the sea could have already been broken down by UV rays. While these statistics are encouraging, the scientists warn that this isn’t an umbrella solution to plastic pollution. Such a rate of degradation is too slow to fully rid the ocean of its plastics; additionally, the test plastics released organic carbon, carbon dioxide, carbon monoxide, methane, and other gases as they broke down. The solution, they write, is still to limit how much plastic is produced in the first place.
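For a sense of what those rates mean over time, here is a simple compounding sketch; the real degradation kinetics are more complicated than a fixed annual percentage, so treat this as illustrative only:

```python
# A simple compounding model of the NIOZ rate: how much of a floating
# plastic particle is gone after N years at 1.7% or 2.3% loss per year?
# (Illustrative only; real degradation kinetics are more complex.)

def fraction_remaining(rate_per_year, years):
    """Fraction of the original plastic left after compounding annual loss."""
    return (1 - rate_per_year) ** years

for rate in (0.017, 0.023):
    gone = 1 - fraction_remaining(rate, 50)
    print(f"{rate:.1%}/yr: {gone:.0%} degraded after 50 years")
```

Even after 50 years, roughly a third to nearly half of a particle would remain, which underlines the researchers’ point that sunlight alone is far too slow to clean up the oceans.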
The White House announced Friday that it had signed an administrative agreement with the European Union focused on “responsible advancements” in artificial intelligence. The agreement, which builds on existing tech pacts, will prioritize international collaboration while directing AI resources toward specific industries.
The new US-EU Artificial Intelligence Collaboration agreement will bring AI experts together in an effort to address “major global challenges” under a joint development model. Although National Security Advisor Jake Sullivan’s statement doesn’t name the specific challenges the agreement seeks to address, it does share that the US and EU will collaboratively direct research toward five areas of interest: extreme weather and climate forecasting, emergency response management, health and medicine, electric grid optimization, and agriculture optimization.
The AI Collaboration agreement carves a clear path forward for the US-EU Trade and Technology Council (TTC), which in December identified AI risk management as a new goal. It also builds on the Declaration for the Future of the Internet, a political commitment among the US and 60 other countries that seeks to advance “a positive vision for the Internet and digital technologies.” Among the Declaration’s priorities are protecting human rights and digital privacy, promoting the free flow of information, and encouraging affordable web connectivity.
Some expect the signing of the US-EU Artificial Intelligence Collaboration agreement to herald the country’s first AI laws. (As of now, the US lacks any formal rules explicitly prohibiting the use of invasive or discriminatory AI.) Others anticipate that the agreement will bring forth new product offerings for software companies and government contractors, who could make a pretty penny auditing AI systems and ensuring they meet future legal requirements.
This isn’t the first time the White House has demonstrated an interest in guiding the development of AI. Introduced in October, the Biden administration’s “AI Bill of Rights” reminds both the private and public sectors to design and maintain AI systems that mitigate human biases and allow for individual privacy. While the document serves only as a nonbinding suggestion, several federal agencies have already established offices to implement and enforce its guidelines.
The western end of Australia is dominated by a sweltering desert of ochre-colored soil and hardy shrubs, but there’s something new hiding in the outback: a radioactive capsule. Australian officials are frantically searching for the object, which went missing while being transported between two mines. They’re warning people in the region to steer clear of it if they see it, as even brief exposure can be dangerous.
The capsule is tiny, just 6 x 8 mm in size. Inside the ceramic enclosure is a sample of cesium-137, a highly radioactive isotope used in mining equipment. Australia’s Department of Fire and Emergency Services (DFES) says the capsule was being moved from a mine near the town of Newman to one near Perth earlier this month. However, it never arrived, suggesting it fell off the truck somewhere along the road.
Despite being so small, the capsule has a big radioactive footprint, according to DFES. The cesium-137 inside emits about 2 millisieverts per hour, roughly the dose of 10 medical X-rays, or a full year of normal sea-level background radiation, delivered every hour. Officials say that holding the container even for a short time could cause radiation burns and increase the risk of severe illness.
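To put those figures in perspective, here is a rough dose calculation. The per-X-ray dose is simply the value implied by DFES’s own “10 X-rays” comparison, and the 1,000 mSv threshold for acute radiation sickness is a commonly cited approximation, not a hard medical limit:

```python
# Rough context for DFES's numbers. The per-X-ray dose here is simply the
# value implied by the agency's "10 X-rays per hour" comparison, and the
# 1,000 mSv threshold for acute radiation sickness is an approximation.

DOSE_RATE_MSV_PER_HR = 2.0     # reported rate near the capsule
XRAY_MSV = 0.2                 # implied by 2 mSv/hr equaling 10 X-rays
ACUTE_SICKNESS_MSV = 1000.0    # ~1 sievert, approximate acute threshold

def dose_msv(hours):
    """Cumulative dose after the given hours of exposure at close range."""
    return DOSE_RATE_MSV_PER_HR * hours

print(dose_msv(1) / XRAY_MSV)                      # X-ray equivalents per hour
print(ACUTE_SICKNESS_MSV / DOSE_RATE_MSV_PER_HR)   # hours to an acute dose
```

Five hundred hours to an acute dose sounds like a lot, but keeping the capsule in a bedroom or car, as Dr. Robertson fears someone might, would get there in a matter of weeks.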
03:32 PM – RADIOACTIVE SUBSTANCE RISK in parts of the Pilbara, Midwest Gascoyne, Goldfields-Midlands and Perth Metropolitan regions: https://t.co/ZSEIQDbkiJ
“Our concern is that someone will pick it up, not knowing what it is,” says Dr. Andrew Robertson, the state’s chief health officer. “They may think it is something interesting and keep it, or keep it in their room, keep it in their car, or give it to someone.” Luckily, this is a sparsely populated region, so it’s unlikely anyone will happen upon the radioactive source. However, these things have a way of getting around.
A similar radioactive capsule was lost in a Ukrainian quarry in the late 1970s. Authorities there gave up after a week of searching and went back to business as usual. The capsule eventually ended up in concrete that was used to construct an apartment building in the eastern city of Kramatorsk. From 1980 to 1989, the cesium-137 poisoned the residents of apartment 85. In all, four people died of leukemia, and 17 more received heavy doses of radiation before the object was found.
The route of the truck is known, so Australian officials are in the process of narrowing down the capsule’s possible resting place, although we’re talking about almost 700 miles (1,100 kilometers) of winding outback roads. In the meantime, DFES has released an image of the capsule (above) and is asking anyone who sees it to contact authorities immediately.
Pulsars are some of the most extreme and fascinating objects in the universe, and NASA’s Fermi Space Telescope has just unlocked a new way to study them. Using the orbiting observatory, astronomers have identified the first gamma-ray eclipses in “spider systems,” which consist of a pulsar and a smaller main sequence star. They’re so named in reference to the arachnid tendency to consume one’s companion, which is what happens in these star systems, too.
Before Fermi came online in 2008, science knew of just a few pulsars that emitted gamma rays. Today, Fermi has identified more than 300 of them. An international team of experts combed through a decade of Fermi data in search of something specific: a gamma-ray eclipse. The end goal is to accurately calculate the mass and velocity of these extreme stellar remnants, and the eclipses help get us there.
In some star systems, two stars age at very different rates, and the larger of the pair may go supernova while the smaller one is still fusing hydrogen like the sun. This can lead to a spider system: the pulsar feeds off its smaller companion while superheating one side of it. According to NASA, scientists even have sub-categories based on the companion’s mass. A “black widow” system has a star with less than 5% of the sun’s mass, while a “redback” system has a stellar companion weighing between 10% and 50% of a solar mass.
We can characterize spider systems using visible light and radio frequency observations — most of the time. It gets tricky when the plane of the system is aligned with ours. That makes the subtle changes too difficult to detect, but a gamma-ray eclipse can reveal the truth. Using Fermi, researchers found seven spider systems exhibiting this phenomenon. Since gamma rays only come from the pulsar, their disappearance in the data indicates the smaller companion has eclipsed the pulsar. Pulsars emit radiation like clockwork, so just a few missing photons are enough to reveal an eclipse. With this data in hand, scientists can calculate the system’s tilt and, from that, its mass and velocity.
Take PSR B1957+20, for example: earlier estimates suggested this black widow system was tilted 65 degrees, with a pulsar 2.4 times as massive as the sun. That made it one of the most massive pulsars known. However, the new study shows the system has a tilt of 84 degrees, meaning the pulsar’s mass is just 1.8 times that of the sun.
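That revision follows from how inclination enters the binary mass function: the timing data constrain terms like (M sin i)³, so to first order the inferred pulsar mass scales as 1/sin³(i), and a more edge-on tilt implies a lighter pulsar. A rough sketch (treating the scaling as exact is a simplification):

```python
import math

def rescale_pulsar_mass(m_solar, i_old_deg, i_new_deg):
    """Rescale an inclination-dependent pulsar mass estimate using the
    approximate 1/sin^3(i) dependence from the binary mass function."""
    ratio = math.sin(math.radians(i_old_deg)) / math.sin(math.radians(i_new_deg))
    return m_solar * ratio ** 3

# 2.4 solar masses at a 65-degree tilt revises to about 1.8 at 84 degrees
revised = rescale_pulsar_mass(2.4, 65, 84)
```

Plugging in the B1957+20 numbers above reproduces the published revision almost exactly.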
The team believes that once the models are fine-tuned, Fermi will be able to answer some nagging questions about spider systems. For example, does the mass stolen from their companions make spider pulsars the most massive population of pulsars? B1957 ended up smaller than we thought, but it could just as easily go the other way.
Intel CEO Pat Gelsinger holds an 18A SRAM wafer. (Credit: Intel)
Intel has reported its earnings for Q4 and all of 2022, and the results are so bad that analysts are likely diving for their thesauruses to properly characterize them. “Historic collapse” is how one summarized the losses; another said there are simply “no words.” Intel reported its worst earnings in more than 20 years. Though the results were still within the company’s guidance, they came in at the very low end and mark a historic downturn. The news caused Intel’s stock to fall almost 10% in value. Its earnings reports are available in various forms on its investor relations website.
For 2022, Intel reported $63.1 billion in revenue, a 20% decline from 2021. Its Q4 revenue was $14 billion, a precipitous 32% drop from the same quarter a year earlier. One analyst notes this is the largest year-over-year decline in the company’s history. Intel posted a net loss of $664 million for the quarter, which almost matches its worst quarterly loss ever: a $687 million fourth-quarter loss in 2017.
Though Intel ended 2022 with $8 billion in profit, it made $19.1 billion in 2021. That’s a reduction of nearly 60%, which is why the word “collapse” is being thrown around. Its Q4 gross margin of 39.2% is the lowest in decades as well; not long ago, Intel routinely posted margins around 60%.
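The percentages here are straightforward year-over-year arithmetic. A quick sketch using the dollar figures quoted in this article (in billions; small offsets come from rounding in the reported numbers):

```python
def yoy_change_pct(current, prior):
    """Year-over-year change as a percentage of the prior-year figure."""
    return (current - prior) / prior * 100.0

def implied_prior(current, decline_pct):
    """Back out the prior-year figure implied by a stated YoY decline."""
    return current / (1.0 - decline_pct / 100.0)

# 2022 revenue of $63.1B at a 20% decline implies roughly $78.9B for 2021
prior_revenue = implied_prior(63.1, 20)
# $19.1B of 2021 profit falling to $8B is a drop of about 58%
profit_drop = yoy_change_pct(8.0, 19.1)
```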
As far as where the hits came from, it’s both data center and client computing. Client computing brought in $6.6 billion, down 36% from last year’s Q4; total client revenue for 2022 dropped 23% compared with 2021. Its Data Center and AI (DCAI) group’s revenue fell 33% YoY, and 15% for the year as a whole. The only bright spots were Mobileye, Intel Foundry Services, and its graphics division, all of which posted increases, with foundry services notching a surprising 30% improvement for the quarter.
Despite the grim report, Intel says it’s still on target to achieve its long-term goals. It notes it’s still pursuing the “five nodes in four years” strategy laid out by CEO Gelsinger upon his arrival in 2021, which will theoretically let it achieve industry leadership in both transistor performance and power efficiency by 2025. To that end, Gelsinger says Intel is looking to begin its ramp for Meteor Lake in the second half of 2023. If that occurs, we’ll be surprised, as the chip has been rumored to be delayed; instead, we may see a Raptor Lake refresh.
“We are at or ahead of our goal of five nodes in four years,” said Gelsinger in the earnings report. “Intel 7 is now in high-volume manufacturing for both client and server. On Intel 4, we are ready today for manufacturing and we look forward to the MTL (Meteor Lake) ramp in the second half of the year,” he said.
Unfortunately for Intel, it doesn’t anticipate a quick rebound from its financial nadir. Its CEO predicted continuing “macro weakness” through the first half of 2023, though he noted there’s a possibility of an uptick later in the year. Given the uncertain economic conditions, Intel is only providing guidance for Q1 2023 and nothing beyond. That guidance is even more brutal than this report: Intel predicts YoY revenue will be down 40%, with gross margin falling to 39%.
Intel’s earnings report follows news this week that it has canceled a planned $700 million R&D facility in Oregon, and that it is laying off 544 employees in California as it begins to tighten its belt. The company says it plans to reduce expenses by $3 billion in 2023, with that number increasing to $10 billion by 2025.
Hello, folks, and welcome back to your favorite Friday roundup of all the space news fit to print. This week we’ve got experimental rocket engines, a gigantic map, and galaxies galore. The James Webb Space Telescope spotted the coldest ice ever measured, astronomers detected atomic hydrogen in a galaxy more than eight billion light-years away, and Webb itself is currently down due to a software glitch.
Closer to home, Rocket Lab launched its Electron rocket from US soil for the first time, and NASA came together for a day of remembrance that somehow managed to be both somber and ineffably sweet.
JWST Spots the Coldest Chamaeleon
If you wish to make an apple pie from scratch, you must first invent the universe. And somewhere along the way, you’ll need one of the ancient molecular clouds of dust and ice from which stars and habitable planets like Earth are born. This week, Webb scientists announced that the telescope has spotted just such a place. It’s a stellar nursery called the Chamaeleon I cloud, loaded with these primordial crystals. That’s the tableau you’re seeing in the image above — you can tell it’s from Webb by those iconic six-pointed stars. The ice contains traces of sulfur and ammonia, along with simple organic molecules like methanol. And at just ten degrees above absolute zero, it’s the coldest ice ever found.
“We simply couldn’t have observed these ices without Webb,” said Klaus Pontoppidan, a Webb project scientist involved in the research. “The ices show up as dips against a continuum of background starlight. In regions that are this cold and dense, much of the light from the background star is blocked, and Webb’s exquisite sensitivity was necessary to detect the starlight and therefore identify the ices in the molecular cloud.”
‘Virginia Is for Launch Lovers’: Rocket Lab Launches Electron Rocket From US Soil
Late Wednesday evening, aerospace startup Rocket Lab successfully launched its Electron rocket from NASA’s Wallops Flight Facility in Virginia. This was the 33rd launch of the Electron, but its first launch from American soil.
The Electron, powered by 3D-printed Rutherford engines, isn’t reusable. But in 2021, Rocket Lab announced the Neutron: designed for reusability, it will have about a third of the lift capacity of a Falcon 9.
NASA ‘Rotating Detonation Engine’ Aces Hot Fire Tests
Speaking of 3D-printed rocket engines: NASA announced this week that it has successfully validated a next-gen rocket engine it hopes will revolutionize rocket design. The new engine generates thrust “using a supersonic combustion phenomenon known as a detonation.” And this is no paper exercise: the full-scale alpha build produced more than 4,000 pounds of thrust at full throttle.
These engines get their name (rotating detonation rocket engine, or RDRE) from the unique way they produce thrust. Detonation waves echo around a circular chamber, wringing out every bit of energy from the rocket fuel. It’s great for efficiency, but it puts the whole system under extreme pressure. Undaunted, NASA turned to an advanced additive manufacturing process, even developing its own bespoke metal alloy for the task.
According to the agency, the RDRE incorporates NASA’s GRCop-42 copper alloy into a powder bed fusion (PBF) additive manufacturing process, in which a laser or electron beam fuses ultra-fine metal powder layer by layer. The conditions involved are punishing; even the space shuttle’s main engines had to be actively cooled by their own cryogenic fuel to withstand the unearthly temperatures and pressures of takeoff. If the design holds up, NASA intends to use the RDRE in its efforts to establish a long-term presence off-planet.
Dark Energy Detector Plots Largest-Ever Map of Galaxy
Astronomers have created a gargantuan map of the Milky Way using a telescope built to detect dark energy. Featuring more than three billion stars, it focuses on the galactic plane, a region notoriously difficult to study.
Earth’s atmosphere scatters starlight, smearing pinpoint sources into blurry blobs that overlap in crowded fields. So, the astronomers just dove right in. To isolate individual stars and celestial objects, the group leaned on some extra-snazzy math to strip out the noise and “paint in” the proper background, letting them tell one star from another.
Astronomers have released a gargantuan survey of the galactic plane of the Milky Way. The new dataset contains a staggering 3.32 billion celestial objects — arguably the largest such catalog so far. The data for this unprecedented survey were taken with the US Department of Energy-fabricated Dark Energy Camera at the NSF’s Cerro Tololo Inter-American Observatory in Chile, a Program of NOIRLab. Credit: Saydjari et al., via NoirLab
“One of the main reasons for the success of DECaPS2 is that we simply pointed at a region with an extraordinarily high density of stars and were careful about identifying sources that appear nearly on top of each other,” said Andrew Saydjari, lead author on the (open-access!) paper accompanying the gigantic map. “Doing so allowed us to produce the largest such catalog ever from a single camera, in terms of the number of objects observed.”
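“Identifying sources that appear nearly on top of each other” is the crux of crowded-field photometry. This toy 1-D example is not the DECaPS2 pipeline, just the basic idea: model overlapping stars as a sum of known point-spread functions (PSFs) and solve a small least-squares problem for each star’s brightness.

```python
import math

def gaussian_psf(xs, center, sigma=1.0):
    """A simple Gaussian stand-in for a telescope's point-spread function."""
    return [math.exp(-0.5 * ((x - center) / sigma) ** 2) for x in xs]

def deblend_two_stars(pixels, xs, c1, c2, sigma=1.0):
    """Fit pixels as f1*PSF(c1) + f2*PSF(c2) by solving the 2x2
    normal equations of the least-squares problem in closed form."""
    g1 = gaussian_psf(xs, c1, sigma)
    g2 = gaussian_psf(xs, c2, sigma)
    a11 = sum(a * a for a in g1)
    a12 = sum(a * b for a, b in zip(g1, g2))
    a22 = sum(b * b for b in g2)
    b1 = sum(a * d for a, d in zip(g1, pixels))
    b2 = sum(b * d for b, d in zip(g2, pixels))
    det = a11 * a22 - a12 * a12
    f1 = (a22 * b1 - a12 * b2) / det
    f2 = (a11 * b2 - a12 * b1) / det
    return f1, f2
```

Given two blended stars one PSF-width apart, the fit cleanly recovers both fluxes; the real survey does this for billions of sources in two dimensions, with noise and a varying PSF.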
Experts: Milky Way Too Large for Its “Cosmological Wall”
The history of astronomy has been all about recognizing that our place in the universe isn’t all that special. We’ve gone from the center of all existence to just another planet orbiting an average star in one of billions and billions of galaxies. However, a new simulation hints that there might be something special about the Milky Way after all.
Yepun, one of the four Unit Telescopes of the Very Large Telescope (VLT) at the European Southern Observatory, studies the center of the Milky Way. Yepun’s laser beam creates an artificial “guide star” to calibrate the telescope’s adaptive optics. Image: ESO/Yuri Beletsky
The model suggests that the Milky Way is far larger than it should be, based on the scale of the “cosmological wall”: an incomprehensibly huge semi-planar structure occupied by the Milky Way and other galaxies in the Local Group.
Scientists Detect Atomic Hydrogen in Most Distant Galaxy Ever
An international team of astronomers announced the discovery of cold atomic hydrogen more than eight billion light-years from Earth. Cooler than ionized plasma but warmer than molecular hydrogen gas, atomic hydrogen is the raw fuel of coalescing stars. The researchers used gravitational lensing to spot the telltale, but deeply redshifted, 21cm line.
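For context on “deeply redshifted”: the 21cm line is the hyperfine transition of neutral hydrogen, with a rest wavelength of about 21.1 cm, and cosmological expansion stretches it by a factor of (1 + z). The redshift below is illustrative, not the study’s measured value:

```python
def observed_21cm_wavelength(z, rest_cm=21.106):
    """Observed wavelength of the neutral-hydrogen hyperfine line after
    cosmological redshift stretches it by a factor of (1 + z)."""
    return rest_cm * (1.0 + z)

# At z = 1, the line lands near 42 cm, well into the radio band
stretched = observed_21cm_wavelength(1.0)
```

The stretching is what makes such distant detections hard: the signal moves to longer wavelengths where it is fainter and competes with more radio interference.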
Webb Spies Centaur Chariklo’s Delicate Rings
Named for the daughter of Apollo, Chariklo is a centaur: a small icy body that orbits the sun between Saturn and Uranus. It’s the first of its kind ever found with a confirmed ring system. The thing really is tiny, only about 160 miles in diameter. But a new report from Webb shows even a body that small can sustain two slender rings, for a time.
In a remarkable stroke of scientific luck, the telescope was pointed just right to catch Chariklo as it passed in front of a star. When it did, the star’s light fluttered in a way that betrayed the presence of the rings.
Chariklo has two thin rings — the first rings ever detected (in 2013) around a small Solar System object. When Webb observed the occultation, scientists measured dips in the brightness of the star. These dips corresponded exactly as predicted to the shadows of Chariklo’s rings.
Nothing less than delighted, the astronomers report that Chariklo’s rings are about two and four miles wide, respectively. But the centaur actually has something in common with the Chamaeleon I cloud: Chariklo’s surface is covered in exotic phases of water ice that only Webb can see.
Principal investigator Dean Hines added, “Because high-energy particles transform ice from crystalline into amorphous states, detection of crystalline ice indicates that the Chariklo system experiences continuous micro-collisions that either expose pristine material or trigger crystallization processes.” It’ll be up to the JWST to find out more.
Software Glitch Brings JWST Down for Maintenance
Unfortunately, observations of Chariklo and other celestial bodies will have to wait a while. The JWST hit a software glitch this week. Per NASA, the telescope’s Near Infrared Imager and Slitless Spectrograph (NIRISS) “experienced a communications delay within the instrument, causing its flight software to time out,” effectively gridlocking the instrument’s software.
The instrument is unavailable for science observations while NASA and the Canadian Space Agency perform a root-cause analysis to find and fix the problem. But NASA emphasizes that the telescope itself is fine: there’s no damage and no indication of any danger. If it’s a software problem, it may well have a software fix.
Perseverance Files First Weather Report
Now that it’s been on Mars for a while, the Perseverance rover has filed an authoritative report on Martian weather. The number one takeaway: It’s cold on the Red Planet! The average surface temperature is -67C.
It’s also windy on Mars. Since Mars has an atmosphere, it has surface weather. It also has an axial tilt, so it has seasons, just like Earth. Dust storms can envelop Mars’ entire northern hemisphere.
Plumes of darker, subsurface dust waft to the surface when the sun warms Martian sands beneath transparent sheets of ice. Mars’ shifting winds then blow these plumes of dust into V-shaped patterns. Astronomers are using the plumes to learn more about Mars’ weather and surface climate. Image: NASA
Perseverance is covered in a suite of sensors that constantly monitor wind speed and direction, atmospheric pressure, temperature, humidity, and dust. Together, they make the rover’s Mars Environmental Dynamics Analyzer (MEDA).
Here, you can see the MEDA sensors extending from the rover’s mast below the iconic ChemCam.
“The dust devils are more abundant at Jezero than elsewhere on Mars and can be very large, forming whirlwinds more than 100 meters in diameter. With MEDA we have been able to characterize not only their general aspects (size and abundance) but also to unravel how these whirlwinds function,” says Ricardo Hueso, of the MEDA team.
Perseverance has captured numerous dust devils as they sweep through Jezero Crater. However, to get that data, MEDA’s exposed sensors also face damage from the harsh radiation environment, extreme temperature swings, and the ever-present Martian dust. A dust devil in January of last year kicked up enough debris that it damaged one of MEDA’s wind instruments. Still, the rover perseveres.
NASA’s Bittersweet 2023 Day of Remembrance
Every year, NASA holds a memorial for staff, astronauts, and alumni who have died. 2023’s Day of Remembrance holds a somber significance, as Feb. 1 is the 20th anniversary of the Columbia disaster. Unfortunately, this year’s fallen also included Apollo 7 pilot Walt Cunningham, who passed earlier this month. Cunningham was the last surviving member of the Apollo 7 crew.
As in years past, NASA staff gathered this week at space centers and labs around the country, to honor the sacrifices of those who have given their lives in pursuit of exploration and discovery. But they did it in a way only NASA could do. They held nationwide town-hall safety meetings, to reflect on and improve NASA’s aerospace safety culture.
Ask not for whom the safety alarm tolls; it tolls for thee. NASA safety-culture town hall meeting at its Washington headquarters after the Arlington memorial service. Image: NASA/Keegan Barber via NASA HQ Flickr
What a fitting way to honor lives lost, while still reaching for the stars. Town-hall safety culture meetings. We love you guys. Never change.
Psyche Mission Now Targeting October 2023 Launch
Steady as she goes: After a year’s delay and a missed launch window, NASA’s Psyche mission team is getting the spacecraft in shape to launch this year. In a blog post, the agency said, “After a one-year delay to complete critical testing, the Psyche project is targeting an October 2023 launch on a SpaceX Falcon Heavy rocket.”
When it launches, Psyche will carry a technology demo for NASA’s shiny new Deep Space Optical Communications (DSOC) network. DSOC systems will use lasers for high-bandwidth communications between Earth and the Moon, Mars, and beyond. Beyond a deluge of scientific data, NASA expects that the network will be able to handle high-def images and video.
Skywatchers Corner
Comet C/2022 E3 (ZTF) is a long-period comet that last visited Earth in the time of the Neanderthals. Now it’s back for another close approach. And although we didn’t know this when we found it last year, it turns out the comet’s tail glows pale green, like a luna moth under a streetlight.
The robin’s-egg glow of Comet C/2022 E3 (ZTF)’s coma shines between its twin tails. Image: Dan Bartlett/NASA
At first, astronomers thought it might require binoculars to catch a glimpse of the thing. However, as ExtremeTech’s Adrianna Nine writes, the comet is now visible to the naked eye in places across much of the Northern Hemisphere.
Our verdant visitor will continue its brightening trend while it sails toward Earth. It will make its closest approach to us on February 2: perhaps too soon for a Valentine’s Day spectacular, but right on time for Imbolc, Candlemas, and Groundhog Day.
When it comes to treating cancer, groups of synergistic drugs are often more effective than standalone drugs. But coordinating the delivery of multiple drugs is easier said than done. Drugs’ molecular properties tend to differ, making it difficult to ensure that pharmaceuticals make it to their destinations without losing effectiveness along the way. An all-new multidrug nanoparticle might be the solution. A team of researchers at MIT has created a “molecular bottlebrush” capable of delivering any number of drugs at the same time.
Drug-loaded nanoparticles—or ultrafine particles ranging from one to 100 nanometers in diameter—prevent treatments from being released prematurely, which ensures that the drug reaches its destination before beginning to do its job. This means nanoparticles carrying cancer treatments can collect at the tumor site, facilitating the most effective treatment possible. There is, of course, one caveat: Only a few cancer-treating nanoparticles have been approved by the FDA, and only one of those is capable of carrying more than one drug.
MIT’s molecular bottlebrush, detailed Thursday in the journal Nature Nanotechnology, challenges that. Chemists start by inactivating drug molecules by binding and mixing them with polymers. The result is a central “backbone” with several spokes. All it takes to activate the inactivated drugs sitting along the backbone is a break in one of those spokes. This unique design is what enables the new nanoparticle to carry (and thus deliver) multiple drugs at a time.
(Image: Detappe et al/Nature Nanotechnology/MIT)
The team tested the molecular bottlebrush in mice with multiple myeloma, a type of cancer that targets the body’s plasma cells. They loaded the nanoparticle with just one drug: bortezomib. On its own, bortezomib usually gets stuck in the body’s red blood cells; by hitching a ride on the bottlebrush, however, bortezomib accumulated in the targeted plasma cells.
The researchers then experimented with multidrug combinations. They tested three-drug bottlebrush arrangements on two mouse models of multiple myeloma and found that the combinations slowed or stopped tumor growth far more effectively than the same drugs delivered sans bottlebrush. The team even found that bortezomib on its own, which is currently approved only for blood cancers and not solid tumors, was highly effective at inhibiting tumor growth at high doses.
Through their startup Window Therapeutics, the researchers hope to develop their nanoparticle to the point that it can be tested through clinical trials.