Apple’s Studio Display’s Poor Webcam Quality Is Not A Software Bug After All
When Apple announced the Studio Display, it promised “sensational” webcam quality. However, as customers got their hands on the product, they noticed that the images captured by the built-in camera were not good. Apple is now rolling out beta software that promises to fix some of these issues – but the thing is, the Studio Display’s poor webcam quality is not a software bug after all.

The complaints
According to pretty much every Studio Display owner, the webcam images are pretty bad compared to the front camera on other Apple devices. In most cases, the images look blurry, are washed out, and have a lot of noise.
In his review for The Verge, Nilay Patel wrote that the Studio Display’s camera looks “awful in good light, and downright miserable in low light.” Joanna Stern at The Wall Street Journal likened the camera performance to that of an “old BlackBerry.” Gizmodo had similar complaints, saying that the Studio Display’s webcam is “noisy” and “not great.”
Soon after the first Studio Display reviews criticizing its 12-megapixel webcam were published on the web, Apple told the press that it was working on a software update to improve the quality of the image captured by the built-in camera.
Nearly two months after the Studio Display was announced, Apple today released a beta firmware update to developers that brings enhancements to the image processing of the display’s built-in webcam.
Right now, the update is only available to those running the latest beta of macOS Monterey, and it’s unclear when the update will be released to the public. However, some Studio Display users have already installed the firmware update to see what it actually changes. And it turns out, the update doesn’t change much.
As noted by Jason Snell, Apple has made some adjustments to make the Center Stage cropping less aggressive. At the same time, James Thomson also noted that there’s much less noise in the webcam images after the update, as well as a bit more contrast, but the quality is still “quite washed out” compared to other webcams.
Comparing the 15.5 (1st pic) and 15.4 (2nd pic) firmware for the Studio Display camera. There’s a _lot_ less noise, and a touch more contrast, but it’s still quite washed out compared to the iMac Pro camera (3rd pic, taken last month). — James Thomson (@jamesthomson) April 26, 2023
The update doesn’t seem to miraculously improve the quality of the Studio Display’s webcam, and there’s a reason for that.

It’s all about the ultra-wide lens
Apple proudly says that the Studio Display has a 12-megapixel camera, which should be enough for sharp images. After all, the iPhone and other Apple devices also have 12-megapixel front-facing cameras. But why is the Studio Display webcam so different in terms of image quality?
While most Apple devices have a regular wide front camera, Studio Display has an ultra-wide lens. This is because it has Center Stage, a feature that uses machine learning to always center the image on a person during a video call or video recording. Since this camera has no optical zoom, Center Stage digitally crops the image to center the people in the frame.
So while an iPhone is capable of taking a real 12-megapixel selfie, Center Stage cameras capture images at 12 megapixels using the ultra-wide lens and then digitally crop them to look like a regular photo or video. This process results in less-sharp images.
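The resolution cost of that digital crop can be roughly estimated with a simple pinhole-camera model. The sketch below is illustrative only: the field-of-view figures are assumptions (roughly 122 degrees for an ultra-wide Center Stage camera, cropped down to about the 81-degree view of a typical wide front camera), not Apple’s published specifications.

```python
import math

def effective_megapixels(sensor_mp, wide_fov_deg, crop_fov_deg):
    """Megapixels left after cropping an ultra-wide frame down to a
    narrower field of view, using a simple pinhole-camera model."""
    # Linear scale of the crop: ratio of the half-angle tangents.
    scale = math.tan(math.radians(crop_fov_deg / 2)) / math.tan(math.radians(wide_fov_deg / 2))
    # Pixel count shrinks with the square of the linear crop factor.
    return sensor_mp * scale ** 2

# Assumed figures: a 12 MP sensor behind a ~122-degree ultra-wide lens,
# cropped to roughly an 81-degree "regular" field of view.
print(round(effective_megapixels(12, 122, 81), 1))  # ≈ 2.7 under these assumptions
```

Under those assumed angles, only a few of the 12 megapixels survive the crop, which is consistent with why Center Stage images look noticeably softer than a true wide-lens selfie.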
For instance, my third generation iPad Air has a seven-megapixel front-facing camera. When I compare it to my iPad mini 6 (which has Center Stage), the old iPad’s images look sharper.
The thing is, the ultra-wide lens is 12MP at its full size. It basically zooms in on you with digital cropping to make the image look like a regular photo, so you’re losing quality. Not to mention that the ultra-wide lens has a smaller aperture, so it gets less light. — Filipe Espósito (@filipeesposito) April 26, 2023
As another example, I took the same picture using the wide and ultra-wide rear lens on my iPhone 13 Pro Max.
Both lenses have 12-megapixel resolution, but I then cropped the photo captured by the ultra-wide lens to match the framing of the photo from the wide lens, simulating what happens with photos taken by a Center Stage camera. The result, as you can see below, is a much lower-quality photo.

Is there a solution?
Unfortunately, no matter what Apple does in terms of software updates, there’s nothing that will dramatically improve the Studio Display webcam.
There are only two possible ways to solve this problem: use a higher-resolution sensor, so that the cropped image is still at least 12 megapixels, or use a larger sensor that captures more light, which would help reduce noise in the image.
However, as you may have guessed, both solutions require a hardware upgrade, which means that owners of the first generation Studio Display will have to deal with the webcam the way it is.
FTC: We use income earning auto affiliate links. More.
For annulated sea snakes, seeing the wonderful world of color wasn’t always possible. These venomous sea snakes that roam Australia and Asia’s oceans once lost their color vision, but a new study into their genomes reveals that they have potentially regained their ability to see a wider palette of colors over the last 100 million years. The findings were published July 12 in the journal Genome Biology and Evolution, published by Oxford University Press.
[Related: A guide to all the places with no snakes.]
For animals, normal color vision is mostly determined by genes called visual opsins. Multiple losses of opsin genes have occurred as tetrapods—a group including amphibians, reptiles, and mammals—have evolved. The emergence of new opsin genes is significantly more rare than losing them. A 2023 study found that some semi-aquatic snake species in the genus Helicops found in South America are the only known snakes to regain these opsin genes.
This ancestral snake species lived on land and would later give rise to all snake species, including sea snakes. With their genes for color vision gone, these snakes could perceive only a very limited range of colors. However, that likely started to change as some elapid descendants adapted to new habitats: within the last 25 million years, two elapid lineages have moved from terrestrial to marine environments.
With the fully sequenced genome of the annulated sea snake in hand, the team behind this new study – from the University of Adelaide in Australia, the University of Plymouth in the United Kingdom, and the Vietnamese Academy of Science and Technology – looked at visual opsin genes in five ecologically distinct species of elapid snakes. Elapids are a family of about 300 venomous snakes that includes mambas, cobras, and the annulated sea snake. Looking at this family more broadly offered an opportunity to investigate the molecular evolution of vision genes.
The team found that the annulated sea snake now has four intact copies of the opsin gene SWS1. Two of these genes are sensitive to ultraviolet light that has shorter wavelengths, while the other two genes have evolved a new sensitivity to the longer wavelengths of light that dominate ocean habitats.
“Only one [of these genes] was expected. To our knowledge, every other ~4000 snake species in the world (except a couple of Helicops species) have just one of these genes. The most interesting part is that two of these genes allow for perception of UV light, while the other two allow for the perception of blue light. This is expected to dramatically increase their sensitivity to colors which could be very useful in bright-light marine environments,” says Rosetto.
The authors believe that this sensitivity means that the snakes could have color discrimination that allows them to distinguish predators from prey, as well as potential snake mates against the more colorful background in the ocean.
[Related: How cats and dogs see the world.]
This significantly differs from the evolution of opsins in mammals like bats, dolphins, and whales during their own ecological transitions. These mammals saw more opsin losses as they adapted to dim-light and aquatic environments.
We learned today about Apple’s satellite project: a team working on ways to establish direct two-way connections between iPhones and satellites.
If that sounds like crazy science-fiction, it’s actually not. The technology to do it exists today and has been proven to work with today’s phones. You shouldn’t, however, expect to have ubiquitous access from anywhere on the planet, nor for satellite connections to replace your existing mobile data plan.
The technology has significant limitations…
A company called Lynk (originally Ubiquitilink) proved the tech works by creating what it called ‘the first cell tower in space.’ It created a prototype satellite that was assembled on the International Space Station and subsequently attached to the nose of the Cygnus resupply spacecraft for a live test back in February. It worked, as TechCrunch reported at the time.
The theory became a reality earlier this year after Ubiquitilink launched their prototype satellites. They successfully made a two-way 2G connection between an ordinary ground device and the satellite, proving that the signal not only gets there and back, but that its Doppler and delay distortions can be rectified on the fly.
“Our first tests demonstrated that we offset the Doppler shift and time delay. Everything else is leveraging commercial software,” Miller said, though he quickly added: “To be clear, there’s plenty more work to be done, but it isn’t anything that’s new technology. It’s good solid hardcore engineering, building nanosats and that sort of thing.”
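The Doppler offset Miller mentions is substantial at orbital speeds. As a rough sketch (the carrier frequency and orbital velocity below are illustrative assumptions – a 900 MHz GSM-band signal and the ~7.6 km/s typical of a satellite a few hundred kilometres up – not figures from Lynk):

```python
C = 299_792_458  # speed of light, m/s

def doppler_shift_hz(carrier_hz, radial_velocity_ms):
    """Classical Doppler shift for a transmitter moving directly
    toward (or away from) the receiver."""
    return carrier_hz * radial_velocity_ms / C

# Assumed figures: ~7.6 km/s orbital velocity, 900 MHz GSM-band carrier.
shift = doppler_shift_hz(900e6, 7_600)
print(f"{shift / 1e3:.1f} kHz")  # worst-case shift as the satellite approaches
```

A shift on the order of tens of kilohertz is far beyond what terrestrial cell protocols are designed to tolerate, which is why correcting it on the fly was the key thing Lynk had to prove.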
If it sounds incredible that one of today’s iPhones can transmit into space, especially when there are still mobile dead-spots around on ordinary mobile networks, Lynk says it’s really not. Remove ground obstacles from the equation by beaming directly to and from space, and stick to low-frequency signals, and they can travel a long way.
“That’s the great thing — everybody’s instinct indicates [that it’s impossible],” said Ubiquitilink founder Charles Miller. “But if you look at the fundamentals of the RF [radio frequency] link, it’s easier than you think.”
The issue, he explained, isn’t really that the phone lacks power. The limits of reception and wireless networks are defined much more by architecture and geology than plain physics. When an RF transmitter, even a small one, has a clear shot straight up, it can travel very far indeed.
There are, however, some important caveats that would apply to Apple’s satellite project.
First, you can’t communicate with satellites in geosynchronous orbit – they’re simply too far away. The maximum workable range is around 300 miles, which is extremely low in satellite terms. At that altitude, a satellite can’t remain fixed above one point on the Earth: it has to orbit much faster than the Earth rotates, which means coverage from any one satellite won’t last long.
You’ll have no signal for 55 minutes, then signal for five.
So you’d need at least a thousand satellites to ensure there will always be at least one within range. That would be a massive undertaking, even for Apple.
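The short coverage windows follow directly from orbital mechanics: a satellite that low circles the Earth in roughly an hour and a half, so it is above any given spot only briefly. A quick sketch using Kepler’s third law, with the ~300-mile (about 480 km) altitude from the article as the assumed input:

```python
import math

MU_EARTH = 3.986_004_418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371e3            # mean Earth radius, m

def orbital_period_minutes(altitude_m):
    """Period of a circular orbit from Kepler's third law: T = 2*pi*sqrt(a^3/mu)."""
    a = R_EARTH + altitude_m  # semi-major axis = Earth radius + altitude
    return 2 * math.pi * math.sqrt(a ** 3 / MU_EARTH) / 60

# Assumed altitude: ~300 miles (about 480 km), per the article's figure.
print(round(orbital_period_minutes(480e3)))  # minutes per orbit, ≈ 94
```

With each satellite completing an orbit in about 94 minutes and serving any one location for only a few of those, continuous coverage requires a large constellation – hence the thousand-satellite figure.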
Second, low-frequency signals mean low bandwidth. What Lynk has demonstrated so far is 2G communication, meaning that it’s suitable for things like text messages but not much more than that. The company does talk grandly about 3G, LTE, and 5G being subsequent stages, but that’s all just talk so far – and it’s hard to see how those kinds of speeds could be achieved over that kind of range through the atmosphere.
Third, it’s unlikely that Apple will sell you a data plan based on low-Earth satellite connections. This is tech which is most likely to be sold through existing mobile carriers as an additional roaming option in areas of the planet that are not served by conventional base stations.
If you want to understand more about how the tech works, the full TechCrunch piece is worth reading, and Lynk has links to other coverage on its website.
In mid-February, a team at the Institut Pasteur in France uploaded a SARS-CoV-2 genome with a strange family tree to a global epidemiology database. As one scientist explained on a forum used to identify new variants, the virus, which had infected an older man in northern France, appeared to share its “body” with Delta, but had borrowed the genetic information for most of its spike proteins from Omicron. Based on a standard naming convention, the new COVID type, if it sticks around, will likely be dubbed “XD.” The public has been calling it “Deltacron.”
SARS-CoV-2 has produced hybrids before, mostly involving the Alpha variant. But none of those ended up driving widespread outbreaks, and instead disappeared in the face of newly evolved strains.
[Related: A deep dive on the evolution of COVID and its variants]
Organisms swap genetic code with surprising regularity—influenza viruses, for example, have a modular structure that allows them to swap whole segments with one another. If a person catches two strains of COVID at the same time (rare, but possible when millions are falling ill), the pathogens can go through a process called “recombination” as they replicate, leaving bits of themselves inside the other.
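A toy model makes the mechanism concrete. In template switching, the viral polymerase starts copying one genome and jumps to the other partway through, yielding a chimera. The sketch below is purely illustrative – the strings stand in for genomes and are not real sequence data:

```python
import random

def recombine(genome_a, genome_b, seed=None):
    """Toy model of recombination by template switching: copying begins
    on genome_a and jumps to genome_b at a random crossover point."""
    rng = random.Random(seed)
    cut = rng.randrange(1, min(len(genome_a), len(genome_b)))
    return genome_a[:cut] + genome_b[cut:]

# Hypothetical stand-ins: one strain's "body", another's "spike" region.
delta = "DDDDDDDDDD"
omicron = "OOOOOOOOOO"
hybrid = recombine(delta, omicron, seed=1)
print(hybrid)  # a Delta prefix followed by an Omicron suffix
```

In the real virus the crossover is not uniform at random and most chimeras are non-viable, but the basic cut-and-join structure is the same: the French “XD” sequence carries a Delta backbone with an Omicron-derived spike region.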
Most of the time, that kind of Freaky Friday swap will kill the virus. But every so often, the recombinant has enough pieces intact to survive and continue spreading, which appears to have happened in the French “XD” case that’s now making headlines.

Why you’ve probably heard of “Deltacron” before
Earlier this year, Bloomberg and other news outlets trumpeted the possibility that a “Deltacron” strain had emerged in Cyprus. Those headlines were debunked, even if hybrid COVID variants remained possible.
The coverage was based on an interview aired by a local TV station. Leondios Kostrikis, a microbiologist at the University of Cyprus, said his lab had identified cases of the Delta variant with a few stray mutations that looked like those seen on Omicron.
Other scientists, including one from the World Health Organization, were quick to say that although a recombination was possible, this didn’t appear to be such a case. The issue was with the genetic data, which many outside observers believed showed traces of cross-contamination between samples. In other words, the Cyprus lab might have sequenced individual Omicron and Delta viruses at once, not a hybrid of the two.
In a January interview with Nature, Kostrikis said his words had been misconstrued, and that he didn’t think the viruses were a true hybrid. He explained that the Delta- and Omicron-like traits on the variant he’d sequenced might have resulted from convergent evolution—in the same way that bats and birds both have wings. He also argued that his samples hadn’t been contaminated, and withdrew them from an open-source database while waiting for other experts to confirm his findings.
[Related: Why everything eventually becomes a crab]
But there was another, simpler explanation than cross-contamination: sometimes viruses just happen to share a handful of overlapping mutations. Sequences with features of multiple variants “get uploaded all the time,” Thomas Peacock, who researches viral evolution at Imperial College London, told Nature at the time. “But, generally, people don’t have to debunk them because there isn’t a load of international press all over them.”

Is this fake “Deltacron” again?
Scientists are much more confident that the new strain from France is a real recombinant. The genetic sequencing data is cleaner, and the Institut Pasteur team reports that it has cultured the virus in a lab, demonstrating that it isn’t a product of cross-contamination.
But just because a new variant combines genetic material from two strains doesn’t mean that it will combine their worst features in real life. Delta and Omicron are very different viruses: They invade human cells in distinct ways, and appear to have developed divergent strategies for evading the human immune system. It’s not at all clear that mixing and matching two independently evolved sets of mutations will improve the virus, rather than just creating a chaotic mishmash. Omicron surprised virologists in part because it used an odd set of adaptations to “unlock” a key receptor on human cells to invade them. Individually, each mutation could have made the virus worse at its job—but together, they ended up being extremely effective.
Since the “Deltacron” case was diagnosed last month, researchers have identified roughly 30 similar cases spread across France, the Netherlands, and Denmark. (It’s not clear if all of those are descended from the same infection, however.) That’s enough to suggest that the recombinant is capable of spreading. Still, with Europe currently seeing a new uptick in COVID cases, it would take many more than 30 “Deltacron” cases to demonstrate that the hybrid presents a bigger threat than the virus that’s already in front of us.
The revelation that a major FaceTime bug can effectively turn your Apple devices into a hot mic, allowing a caller to hear or even see you before you pick up, would be a massive embarrassment no matter which company was involved. It’s an absolutely crazy security fail.
But when that company is Apple – which has been ceaselessly pushing privacy of late – it becomes so cringeworthy we’re going to have to invent a whole new scale just to measure it …
I mean, I get it. Bugs happen. No-one intends them, but coding is complex, and software engineers are human. It’s just a fact of life that some bugs will make it through, and that this will include security vulnerabilities.
Software testing is also complex, given the massive number of variables involved. This particular FaceTime bug occurs only when someone does something completely illogical and unexpected: adds themselves to a call they initiated. I appreciate this would have been a tricky scenario to anticipate and include in testing.
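To see why this case is testable at all, consider a deliberately simplified sketch. The class and test below are entirely hypothetical – invented names and behavior, not Apple’s actual code – but they show the shape of the edge-case test that would have caught the bug: the initiator re-adding themselves must not activate audio before anyone answers.

```python
# Hypothetical minimal model of a group call (not Apple's actual code).
class GroupCall:
    def __init__(self, initiator):
        self.initiator = initiator
        self.participants = {initiator}
        self.mic_live = False  # audio must stay off until a callee answers

    def add(self, person):
        # Correct behavior: adding any participant, even the initiator
        # again, must never open the microphone before someone answers.
        self.participants.add(person)

def test_initiator_adding_themselves_does_not_open_mic():
    call = GroupCall("alice")
    call.add("alice")  # the "illogical" step: initiator re-adds themselves
    assert not call.mic_live

test_initiator_adding_themselves_does_not_open_mic()
print("ok")
```

The point is not that this one-liner test is hard to write – it isn’t – but that someone has to think to write it, which is exactly what the next paragraph argues Apple was obliged to do.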
But when you are Apple – a company that has talked of little other than privacy over the past few months – you don’t get a pass on this. And if you think I’m holding Apple to too high a standard, let’s take a look at some examples.

FaceTime Bug vs. Privacy
October 2, Tim Cook talks privacy to Vice.
I’m not a pro-regulation guy, but when the free market doesn’t produce a result great for society, you have to ask yourself what we need to do. We’ve got to figure out a way to take it to the next level and change some things.
The way we go into product design is we challenge ourselves to collect as little as possible. We challenge ourselves to make it not identifiable. We don’t read your email, your messages. You are not our product. It’s not the business we’re in.
October 23, Cook gives a keynote address at the International Conference of Data Protection and Privacy Commissioners in Brussels.
We at Apple can—and do—provide the very best to our users while treating their most personal data like the precious cargo that it is. And if we can do it, then everyone can do it.
October 24, Cook says many companies can’t be trusted on privacy, and federal regulation is needed.
In this case, it’s clear that the amount of things that can be collected about you, without your knowledge, maybe with your consent – although it’s a 70-page legal piece of paper, just isn’t reasonable. These things can be used for such nefarious things, we’ve seen examples of this over the last several years and we think it’s time now to take this thing and put it under control, because if we don’t, the problem gets so large that it may be impossible to fix
November 18, Cook talks privacy with HBO.
Generally speaking, I am not a big fan of regulation. I’m a big believer in the free market. But we have to admit when the free market is not working. And it hasn’t worked here.
January 5, an Apple billboard in Vegas claims ‘What happens on your iPhone, stays on your iPhone.’
January 24, Cook writes an op-ed for Time in which he says that ‘data breaches seem out of control.’
Consumers shouldn’t have to tolerate another year of companies irresponsibly amassing huge user profiles, data breaches that seem out of control and the vanishing ability to control our own digital lives.

Apple Standards
The standard to which I’m holding Apple today is one the company set for itself, very loudly and very frequently.
Difficult or not, the testing work to prevent a security vulnerability of this magnitude has to be done. Every variable has to be tested, whether it’s someone adding themselves to a call they made, adding contacts in reverse alphabetic order or asking Siri to initiate a call while standing on your head in a west-facing room on a Thursday evening.
Apple has responded by disabling group FaceTime calls. That’s a responsible course of action. And I have no doubt that it will quickly release an update to fix the bug.
But this FaceTime bug is an absolutely massive fail. Apple either needs to be able to overhaul its software development and testing regime such that it can be certain nothing of this seriousness can ever occur again, or it needs to cease throwing quite so many stones from what turns out to be a glass house.
Despite the well-documented chaos wrought by sea level rise, hurricanes, and climate change, Americans keep returning to imperiled coastal areas. In fact, a new study in the journal Nature Sustainability indicates residents may be living larger than they were before the storm.
In an analysis of satellite images of five American coastal communities before and after a catastrophic weather event, the authors found “the same pattern at all five locations: since the last major hurricane, larger residential buildings have tended to replace smaller ones,” according to the study.
Eli Lazarus, a lecturer in geomorphology at the University of Southampton and lead author on the study, says that the findings have important repercussions for taxpayers. It may also fill in a few more pixels in our big picture understanding of the way Americans respond to disasters.
To estimate the mean change in real estate, Lazarus and his team gathered satellite data, from sources like Google Earth, of five hurricane-prone places: Mantoloking, New Jersey; Hatteras and Frisco, North Carolina; Santa Rosa Island, Florida; Dauphin Island, Alabama; and Bolivar, Texas. They looked at images taken before the most recent hurricane and compared them to satellite data gathered post-recovery.
Even with conservative study inclusion criteria (any structure that experienced a 15 percent or smaller change in size was excluded, Lazarus says, because with “satellite imagery, there’s tilt, the sun can glare in places, and you have to be careful with what you’re digitizing”), the results were striking. The study found that rebuilds were between 19 and 50 percent larger than the original structure. New construction increased in mean size between 14 percent and 55 percent compared to the buildings that stood before a given storm.
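The study’s exclusion rule is easy to express directly. The sketch below is illustrative only – the footprint numbers are made up, not the study’s data – and simply shows how a ±15 percent error band filters out changes that could be digitizing noise:

```python
# Sketch of the study's exclusion rule: footprint-area changes of 15
# percent or less are dropped as potential digitizing error.
def size_change_pct(before_sqft, after_sqft):
    return (after_sqft - before_sqft) / before_sqft * 100

# Made-up (before, after) building footprints in square feet.
footprints = [(1200, 1250), (1500, 2100), (1000, 1600)]
kept = [(b, a) for b, a in footprints if abs(size_change_pct(b, a)) > 15]
print(kept)  # → [(1500, 2100), (1000, 1600)]: only changes outside ±15% survive
```

Because the filter discards modest growth along with measurement noise, the 19-to-50-percent increases reported above are, if anything, conservative.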
Lazarus admits that “people can renovate their houses for all manner of reasons,” and houses everywhere in the United States have been growing larger, irrespective of disaster. But he thinks his team identified an interesting and troubling post-disaster trend.
There’s also concern that such disasters may be displacing poor and middle-class homeowners, allowing developers to swoop in after a catastrophe and build a wealthy renter or buyer’s dream McMansion from the ashes. In a blog post accompanying the study, Lazarus cited several such events, documented by newspapers around the country. “The one that really continues to hold my attention is the New York Times piece on the Jersey shore,” he says, citing a story about developers who were able to buy bigger lots at depressed prices, permanently changing the community.
Not every disaster-prone place is eagerly repopulated. After Hurricane Sandy, residents of Staten Island’s Oakwood Beach neighborhood decided to sell their land to the government. They hope to use the money to rebuild a life where 20-foot storm surges can’t find them. And earlier this year, the Louisiana state government began formalizing plans to buy out homes in southern marshland. There, coastal flooding agitates and endangers residents—and restoring natural ecosystems has the potential to minimize natural disasters for people farther inland.
But these are unusual incidents, well-publicized precisely because of their rarity. “The political rhetoric you hear after an event, always, is, ‘We’ll be back. We’ll make it better than ever,’” Lazarus says. That line of thinking is likely to continue, but the evidence suggests it shouldn’t.