ContentSproute


Starlink cuts Standard dish price with subscription promo making it 80% cheaper than the Mini kit

The standard Starlink dish is now discounted in a new promo. (Image source: SpaceX)

At launch, the Starlink Mini Kit was far more expensive than the Standard dish, and those days are back now that Starlink has raised the Mini dish price and introduced a Standard Kit promo. SpaceX barely waited for the free Starlink dish promo to expire before introducing a new discount that makes the Standard Kit 80% cheaper than the Starlink Mini. The standard Starlink dish now costs just $99 with an annual subscription to the $120/month Residential or $80/month Residential Lite satellite Internet plan in areas where the promo is available. The Mini Kit, meanwhile, is currently discounted on Amazon but has reverted to its previous, higher pricing over at Starlink.

The Mini dish spent a few months priced lower than the Standard Kit, but demand for it has been strong enough that SpaceX has now ended that promo, even as it cut the Standard dish price by $250 in certain areas of the US. Those areas coincide with states and regions where Starlink doesn’t charge steep congestion fees, which can add as much as a thousand dollars on top of the Starlink Standard Kit price. Future Starlink subscribers in parts of the Northwest, for instance, must pay not only the full $349 dish price plus taxes, shipping and handling, but also a $1,000 surcharge. To further discourage new subscriptions in congested areas, Starlink now makes new customers wait up to six weeks after clicking the buy button before it delivers their satellite Internet kits.

The opposite is true for vast areas in the middle of the US, where subscriber numbers are manageable. While Starlink is no longer giving dishes away for free there with a 12-month subscription, as it did during its previous promo, its new deal brings the price of the Standard Kit down to $99 with an annual Residential plan subscription.
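The article never states the Mini Kit's current price outright, but the quoted figures let us check the claim's internal consistency. The sketch below derives the implied Mini price from the "80% cheaper" claim; the $495 result is an inference from the quoted discount, not a price confirmed by the article:

```python
# Prices quoted in the article.
standard_promo = 99     # USD, Standard Kit with an annual Residential plan
standard_regular = 349  # USD, full Standard Kit price in congested areas

# "80% cheaper than the Mini" means the promo price is 20% of the Mini's price,
# which implies the Mini Kit is back near its earlier, higher pricing.
implied_mini_price = round(standard_promo / (1 - 0.80))
print(implied_mini_price)  # 495

# The promo also undercuts the regular Standard Kit price by exactly
# the $250 cut the article cites for certain areas.
print(standard_regular - standard_promo)  # 250
```

The two numbers line up: a $99 promo price is both 80% below an (implied) roughly $495 Mini and $250 below the $349 regular Standard Kit price.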
Daniel Zlatev – Senior Tech Writer – 1806 articles published on Notebookcheck since 2021. Wooed by tech since the industrial espionage of Apple computers and the era of pixelated Nintendos, Daniel opened a gaming club back when personal computers and consoles were still an expensive rarity. Nowadays his fascination is less with specs and speed than with the lifestyle that the computers in our pockets, houses, and cars have shoehorned us into, from the infinite scroll and its privacy hazards to authenticating every bit and move of our existence. Daniel Zlatev, 2025-07-24 (Update: 2025-07-24)


10 Years of Insta360: STAR Market Debut, a Drone Surprise, and the Launch of Flow 2 Pro

Key Takeaways

A milestone year for Insta360: With its 10th anniversary, a STAR Market IPO, and rumors swirling around the Antigravity drone brand, Insta360 is signaling bigger ambitions beyond action cams.

Flow 2 Pro goes beyond basic stabilization: Insta360’s new smartphone gimbal blends AI tracking, smooth motion, and plug-and-play usability in a lightweight, portable design.

Ecosystem-first thinking: Flow 2 Pro, Mic Air, and the X5 form a tightly connected lineup, pointing to Insta360’s bigger play: a unified content creation platform built on smart, modular tools.

July 2025 marks a milestone year for Insta360. The company has just turned 10 years old, and it’s celebrating with more than just cake and candles: new gear, a bolder vision, and a clear message that it’s no longer just about action cams. Its latest release, the Flow 2 Pro, is more than just a smartphone stabilizer – it’s a smart, AI-powered tool that integrates into a growing creator ecosystem. Add to that the new Mic Air, a wireless microphone the size of a mint tin, and it’s obvious: Insta360 isn’t just launching gadgets, it’s building a toolkit – a full suite of tightly integrated products designed for modern, mobile-first creators. And behind this launch is a company that’s evolving fast, because while Flow 2 Pro is the news, Insta360’s bigger story might be what’s happening behind the scenes.

IPO Buzz, Billion-Dollar Valuation, and an Expanding Vision

On June 11, Insta360 went public on the Shanghai STAR Market, and the reception was anything but quiet. With a first-day jump of 285%, the stock soared from its IPO price of ¥47.27 to over ¥182, pushing the company’s market cap above ¥70B – roughly $10B. That kind of debut isn’t just good press – it’s a sign. Insta360 has momentum, and it’s making moves far beyond 360° cameras. A recent PR push proudly highlights that Insta360 has held the #1 spot in the U.S.
360° camera market for five years straight, and they’re not slowing down. With plans for a new R&D center in Shenzhen, a manufacturing facility under construction, and fresh funding from the IPO, Insta360 is doubling down on innovation. They’re also quietly expanding their universe. Along with cameras and gimbals, Insta360 recently helped incubate Antigravity, a new consumer drone brand developed with third-party partners. While Antigravity functions on its own, its ties to the Insta360 ecosystem suggest a bigger goal: smooth, AI-driven content tools for land, air, and handheld filming. From a Shenzhen startup to a global imaging innovator, Insta360’s journey is no longer about finding a niche. It’s about owning the entire creative workflow from shoot to share – and the Flow 2 Pro is just the latest piece in that process.

Flow 2 Pro: The AI-Powered Smartphone Stabilizer That Feels Like the Rampage

Insta360’s latest device, the Flow 2 Pro, offers more than just smooth tracking – it’s like giving your phone a jetpack. Thanks to AI-powered subject tracking, it can lock onto faces or objects with uncanny accuracy. But what makes it truly fun is how it handles movement: it smooths out acceleration, tilts, and sudden shifts so effectively that your phone almost seems to glide. During my hands-on testing, I realized it was more ‘your phone demonstrating AI choreography’ than ‘you filming.’ If you stumble while walking, the Flow 2 Pro automatically corrects your shake – no fancy tripod needed. But if you do need one, it even features a built-in tripod in the grip, so you’re always prepared for stationary shots as well. Compared to earlier gimbals that rely on rigid mechanical axes, this one feels alive. You point, and it follows. Simple logic, but executed with flair. It’s also compact and lightweight, so it’s easy to throw in a bag and take anywhere, whether you’re heading out for a casual vlog or a last-minute shoot on the road.
Insta360 also leans on usability here. There’s no fiddling with balance or forgotten app settings. Out of the box: AI tracks. Tap to start. This feels like the next step in personal-film gear – a great tool for creators who’d rather talk to the camera than tweak a dial.

Everyday Hero with a Few Limits

On the downside, battery life is about 10 hours according to the specs – respectable but not amazing. And if you’re filming fast sports or wild movement, expect occasional tracking delays. For example, I tested both the Flow 2 Pro and the Insta360 X5 at the UCI Mountain Bike World Series event. While the Flow 2 Pro occasionally lost track of riders in sharp turns or jumps, the X5 handled the chaos like a pro – its wider field of view and advanced stabilization kept the action smooth and the subject in frame, even at top speed. But for everyday use – vlogs, filming your latest cooking experiment, or capturing clips from a bumpy tuk-tuk ride on vacation – it performs well.

Flow 2 Pro vs. the Rest: Why It’s More Than Just Another Gimbal

Yes, there are cheaper gimbals out there. The DJI Osmo Mobile SE, for example, costs under $100 and does the job for basic stabilization. Hohem’s iSteady M6 costs over $200 for a bundle with AI tracking. The Flow 2 Pro is priced at $159.99, providing a good mix of affordability and advanced features. Not to mention that it’s sleek, simple to set up, and plays well with your native camera and social apps. What makes the Flow 2 Pro stand out is its combination of intelligent tracking, sleek design, and easy plug-and-play use. You don’t need to fuss with calibration either. The AI locks onto your subject quickly – no need to stand still or constantly reframe. Insta360 built its reputation on smart 360° video, and this feels like a natural evolution. It’s less ‘gimbal’ and more ‘AI stick’ – a stabilizer that thinks for you.
And because it’s part of a bigger Insta360 ecosystem, the Flow 2 Pro plays nicely with Insta360 accessories such


iPadOS 26 preview: A long-awaited multi-tasking update pays off (so far)

I’m not going to beat around the bush: iPadOS 26 and its new multitasking features are a game-changer for Apple’s tablets. Pretty much ever since Steve Jobs introduced the iPad 15 years ago (!), Apple has tried to straddle two worlds. In one, the iPad is a super-simple, easy-to-use tablet with a gorgeous display and tons of good apps from the App Store for gaming, entertainment and light work. The other world is one where the iPad replaces your traditional computer, letting people do the serious work that’s been typically reserved for a Mac or Windows PC. iPadOS has too often served as a hindrance to the latter goal over the years, particularly as the iPad Pro has gotten more powerful. The Stage Manager multitasking experience Apple introduced with iPadOS 16 in 2022 was a major step towards making the iPad’s software suitable for power users — but it was rather buggy at launch and not as flexible as iPad power users were hoping for. The calls to just put macOS on the iPad grew louder. But this year, Apple took a different approach: it brought crucial macOS features like the familiar “stoplight” window controls, the menu bar at the top of the screen and vastly improved window management tools to iPadOS 26. The result is an iPad experience where you can easily jump between multiple windowed apps set up just how you like and one where you go full screen to focus on your content. It’s a massive refinement over the old Stage Manager experience and one that I think will finally satisfy iPad die-hards who want to push their tablets to the limit. Before diving into the details, a quick word on betas and stability. As usual, Apple’s public betas feel pretty stable and capable, but that doesn’t mean you won’t run into weirdness here and there. App crashes, particularly with third-party apps, happened far more on this beta than with iPadOS 18. 
I’m confident those things will be ironed out as more developers update their apps for the new OS, but you’ll also run into things like UI inconsistencies and occasional stutters and jerkiness when jumping between apps. As we always say, think hard about what you’re willing to put up with to try a beta, even the relatively stable public beta. After all, the final version of iPadOS 26 will be out in just a few months. (Ed. note: Apple just released the public betas for iOS 26, iPadOS 26, macOS 26 and watchOS 26. This means, as Nate stated above, you can run the preview for yourself, if you are willing to risk potentially buggy software. As usual, we highly recommend backing up all your data before running any beta, and you can follow our guide on how to install Apple’s public betas to do so.)

Multitasking

Time for the nitty-gritty. When you update to iPadOS 26, you’ll be asked if you want to enable multitasking or have apps run in full screen mode only. You can use multitasking mode or full screen only, with no in between – and when Apple says full screen only, it means it. Past versions of iPadOS offered either Stage Manager or a basic, two-app split screen view with a third app available in a Slide Over window. The latter option is gone now, though you can still easily set up two or three apps side by side with iPadOS 26’s window tiling features. I think that’s a smart move, as plenty of people who use an iPad probably never use these multi-app features, and a “multitasking on or off” toggle keeps things simple. When you turn on multitasking mode, apps still open in full screen first – but you can grab any corner of the window to resize it or touch the top of the app and drag it around the screen. You could already do this with Stage Manager, so what really sets this new mode apart is how it interacts with other windows.
Swiping up from the bottom of the display reveals the usual Home Screen view, but with your apps tucked to the side as a visual cue that you can add another app to that group. And, as before, you can move, resize and stack that app window wherever you want. As I’m writing this, I have nine separate app windows open on my iPad, and getting around them feels more Mac-like than ever before. I can swipe up and hold from the bottom of the Home Screen and see every open app in a smaller window, which makes finding the specific thing I want a lot easier; I can also just command-tab through them. Apps can be minimized down to the dock and when I want the app back, it’ll pop open in the same size window and same placement as before. Swiping up from the bottom of the screen twice minimizes everything I have open to start fresh — but again, if I reopen those apps, they’ll go back to exactly where I had them set up before. I realize it sounds kind of silly to make a big deal of this, but it’s hard to overstate how much this improves the iPad multitasking experience. With Stage Manager, I was never quite sure where an app would open or if it would be full screen or windowed. It can be useful for setting up multiple groups of apps, but adding and removing apps from that view was not terribly intuitive. Just opening everything in one space is a lot more intuitive. And if you want to have various different spaces with specific apps, you can still turn on Stage Manager. It’s a lot easier to add and remove apps from various different groups than it used to be; minimizing a window puts it into its own space that you can add more apps to (or just use it on its own). There are a few other new components that make multitasking work


macOS 26 beta preview: Spotlight’s time to shine

I’ve learned not to expect much from macOS updates – not through sheer cynicism, but from the obvious reality that Apple pays far more attention to iOS and iPadOS than its desktop platform. I get it: the Mac is Apple’s past, while smartphones and tablets are its profitable present and future. But still, I think Mac users deserve more than just widgets, or the ability to merely mirror their iPhones (a feature that’s not only genuinely useful, but also cements how crucial iPhones are to Apple and its users today). Now with macOS 26, Apple is finally showing a bit more love to its laptops and desktops. After testing the macOS Tahoe 26 developer beta for a few weeks, it definitely feels like a more substantial update than the last few versions. The revamped Spotlight alone will likely delight Mac diehards, since it makes it easier to find apps and perform all manner of tasks without requiring your fingers to leave the keyboard. Add in a lovely visual refresh thanks to Apple’s Liquid Glass design, as well as enhanced iPhone continuity features, and you have an operating system that feels like a genuine step forward for the Mac faithful.

Spotlight becomes more than a search engine

I’ve never been a heavy Spotlight user, aside from the few times I’ve needed to quickly look for an app or file. But in macOS Tahoe 26, it’s suddenly a lot more useful.
Now Spotlight can also help you find specific files; search cloud file services and websites; run Apple shortcuts and automations; and even run basic commands, like looking up your recent screenshots when you type “/screenshot.” Spotlight has become more than just a search engine for your Mac – it’s practically a super-powered command line. (And notably, it has no real equivalent on Windows, so once again Mac power users will be able to gloat about the supremacy of macOS.)

Over the course of my testing, tapping the command key and space bar to trigger Spotlight practically became second nature. I’d bring it up to find files, as usual, but I noticed that it was better at unearthing what I was looking for than before. Spotlight also replaced my usual practice of typing search strings into Safari or Chrome’s address bar. It takes just a few seconds to bring up Spotlight, type “YouTube,” hit tab to trigger the search box and type in the video I’m looking for. This intelligent site searching also works for Amazon and IMDb – hopefully, Apple will add more commonly used sites over time (or perhaps just the ability to map a site’s internal search engine to Spotlight). If you’re often buried under tons of tabs in your web browser, Spotlight can also quickly search through them. That helped me avoid getting distracted by social media and Slack conversations: I could just stay in my productivity flow, since I didn’t have to sift through multiple app windows and tabs. The ability to trigger actions from Spotlight was similarly useful – it’s a cinch to pop it up, start typing “Send Message” and jot out a quick text to my wife. Spotlight also learns your most common commands over time, so now I just have to type “se” for the Send Message action to pop up. I’m sure for a certain type of Mac power user, Spotlight will give them far fewer reasons to ever touch their mouse or trackpad.
A more refined user interface with Liquid Glass

While Spotlight is the most powerful upgrade in macOS Tahoe 26, you’ll notice the spiffier Liquid Glass interface first. As with iOS 26, it basically amounts to more transparencies and visual flourishes spread throughout the OS. The menu bar at the top of the screen is now fully transparent, instead of looking foggy like before. It’s not much, but it does make your Mac’s screen seem a little bigger (or perhaps that was just extra noticeable on the cramped 13-inch MacBook Air I’ve been testing on). Similarly, widgets and the Control Center dropdown have more glass-like visual elements that make them look a bit more modern. There’s no real practical advantage, but to paraphrase a classic Marge Simpson quote, I just think it’s neat. Apple Silicon-equipped hardware has more than enough graphics power to spare, so these visual upgrades also don’t hinder performance at all. I didn’t notice any slowdown during my testing, and according to Activity Monitor, there didn’t seem to be a big hit to CPU or GPU usage.

Better iPhone integration

Even though you’ve been able to make phone calls on Macs for a while now through FaceTime, it’s taken until macOS Tahoe 26 for Apple to debut a dedicated Phone app. The app itself is nothing special – it gives you a quick glance at your contacts and recent calls, all in a compact Liquid Glass window – but at least it’s a more logical place for phone calls. Even better than the standalone app, though, is the addition of iPhone Live Activities appearing in the macOS Tahoe 26 menu bar. That makes it easier to keep track of an inbound Uber or DoorDash order without whipping out your phone. And if you need to tap into a specific activity, macOS will also automatically launch the app from your phone via iPhone Mirroring. It’s the sort of usability feature you’d expect from Apple, and notably it’s also not easily replicable on Windows. (And sure, you can also view it as a


iOS 26 beta preview: Liquid Glass is better than you think

At WWDC 2025, Apple revealed a major visual shake-up for iOS (not to mention the rest of the company’s operating systems). Aesthetically, this is the biggest change since the shift away from the stitching, textures and skeuomorphic design of the iOS 6 era. It also comes with significantly fewer AI and Siri updates this time around. However, it’s the smaller touches that make iOS 26 seem like a notable improvement over its predecessor. I’ve been running the iOS 26 developer beta for the last two weeks, and here’s how Apple’s new Liquid Glass design – and iOS 26 broadly – stacks up.

Liquid Glass changes everything

iOS 26 looks new and modern. And for once, how Apple describes it – Liquid Glass – makes sense: it’s a lot of layers of transparent elements overlapping and, in places, the animations are quite… liquidy. Menus and buttons respond to your touch, with some of them coalescing around your finger and sometimes separating out into new menus. Liquid Glass encompasses the entire design of iOS. The home and lock screens have been redesigned once again, featuring a new skyscraping clock font that stretches out from the background of your photos, with ever-so-slight transparency. There’s also a new 3D effect that infuses your photos with a bit of spatial magic, offering a touch of Vision Pro for iPhone users. The experience in the first few builds of the iOS 26 beta was jarring and messy, especially with transparent icons and notifications, due to those overlapping elements making things almost illegible.
Updates across subsequent releases have addressed this issue by making floating elements more opaque. There is also a toggle within the Accessibility tab in Settings to reduce transparency further, but I hope Apple offers a slider so that users can choose exactly how “liquid” they want their “glass” to be. If you own other Apple products, you’ll come to appreciate the design parity across your Mac, iPad and Apple Watch. One noticeable change I’d been waiting for is the iOS search bar’s relocation to the bottom of the screen. I first noticed it within Settings, but it reappears in Music, Podcasts, Photos and pretty much everywhere you might need to find specific files or menu items now. If, like me, you’re an iPhone Pro or Plus user, you may have struggled to reach those search bars when they were at the top of the screen. It’s a welcome improvement.

Visual Intelligence

With iOS 26 on iPhones powerful enough to run Apple Intelligence, the company is bringing Visual Intelligence over to your screenshots. (Previously it was limited to Camera.) Once you’ve grabbed a shot by pressing the power and volume up buttons, you’ll get a preview of your image, surrounded by suggested actions that Apple Intelligence deduced would be relevant based on the contents of your screenshot. Managing Editor Cherlynn Low did a deep dive on what Visual Intelligence is capable of. From a screenshot, you can transfer information to other apps without having to switch or select them manually. This means I can easily screenshot tickets and emails, for example, to add to my calendar. Apple Intelligence can even identify types of plants, food and cars. If there are multiple people or objects in your screenshot, you can highlight what you want to focus on by circling it. There aren’t many third-party app options at this point, but that’s often the case with a beta build.
These are features that Android users have had courtesy of Gemini for a year or two, but at least now we get something similar on iPhones. One quick tip: make sure to tap the markup button (the little pencil-tip icon) to see Visual Intelligence in your screenshots. I initially thought my beta build was missing the feature, but it was just hidden behind the markup menu. More broadly, Apple Intelligence continues to work well, but doesn’t stand out in any particular way. We’re still waiting for Siri to receive its promised upgrades. Still, iOS 26 appears to have improved the performance of many features that use the iPhone’s onboard machine learning models. Since the first developer build, voice memos and voice notes are not only much faster, but also more accurate, especially with accents that the system previously struggled with. Apple Intelligence’s Writing Tools – which I mainly use for summarizing meetings, conference calls and even lengthy PDFs – no longer choke on more substantial reading. On iOS 18, they would struggle with voice notes longer than 10 minutes, trying to detangle or structure the contents of a meeting. I haven’t had that issue with iOS 26 so far.

Van life vlogger, bald or running for Congress? (Image by Mat Smith for Engadget)

Genmoji and Image Playground both offer up different results through the update. Image Playground can now generate pictures using ChatGPT. I’ll be honest, I hadn’t used the app since I tested it on iOS 18, but the upgrades mean it has more utility when I might want to generate AI artwork, which can occasionally reach photorealistic levels. One useful addition is ChatGPT’s “any style” option, meaning you can try to specify the style you have in mind, which can skirt a little closer to contentious mimicry – especially if you want, say, a frivolous image of you splashing in a puddle, Studio Ghibli style. Apple also tweaked Genmoji to add deeper customization options, but these AI-generated avatars still don’t look like me.
I liked the original Genmoji that launched last year, which had


Apple’s iOS 26, iPadOS 26, macOS Tahoe 26 and watchOS 26 public betas are ready to download

You can now take Apple’s 2026-branded software for a spin. The first public betas for iOS 26, iPadOS 26, macOS 26, watchOS 26 and tvOS 26 are now available, and we have directions on updating your devices if you’re feeling brave. The two most obvious changes serve to unify Apple’s platforms. First, we have Apple’s biggest cosmetic overhaul to date: Liquid Glass is the company’s name for the shiny, translucent redesign heading to its software this fall. The other significant change is in the numbering. Apple traded its old chronological system for a year-based one. Since 2026 is when the software will spend the bulk of its time in the spotlight, “26” it is.

iOS 26 brings new personalized backgrounds and polls to Messages. Live Translation is another new arrival, making it easier to communicate in Messages, FaceTime and Phone. In addition, Visual Intelligence inches forward: it now lets you interact with content on your iPhone’s screen. There are also new screening tools to decide whether a conversation is worth your time. The Phone app even includes Hold Assist, which listens to the Muzak so you don’t have to. Check out our preview of iOS 26 for more.

Arguably, Apple’s most significant update this year is iPadOS 26. The new software makes Apple’s tablet more of a workhorse. The iPad finally has desktop-like window management and Menu Bar dropdown entries. It even includes the Preview app and Exposé, both familiar to Mac users. The update should do a lot to calm the fury over the iPad Pro’s wasted productivity potential. We got into the details of the iPadOS 26 public beta and found the new multitasking features to be a big deal.

Meanwhile, Apple’s Mac software adopts the “26” branding without ditching California landmarks. macOS Tahoe 26 adds the Phone app and Live Activities from the iPhone. The update also introduces a more advanced Spotlight that allows you to take actions directly from the launcher. Here’s our first look at macOS Tahoe 26.
Finally, watchOS 26 adds Workout Buddy, a virtual fitness coach. The AI-powered feature learns from your fitness history to “identify meaningful insights in real time.” A text-to-speech model then communicates those to you verbally: “You’re crushing it – closing that move ring for six straight days!” Although the public betas are less risky than installing a developer beta on day one, remember that this is still pre-release software. Only go this route if you’re comfortable with the inherent risks, which could include buggy apps and unpredictable battery life. It also can’t hurt to make a local backup of your device before taking the plunge.


How to install the iOS 26 public beta

The latest version of iOS will arrive officially this fall, but you don’t need to wait to start testing the software on your iPhone, thanks to Apple’s public beta rollout. Here’s everything you need to know about setting up the iOS 26 beta, along with the respective betas for iPadOS 26 and watchOS 26, which Apple also revealed in its WWDC 2025 keynote. Before we get started: no, you haven’t accidentally slept through eight versions of major Apple OS updates. In case you missed the news, from now on all of the company’s various operating systems will be named after years to keep everything aligned and easy to follow. So rather than iOS 19, we’re getting iOS 26 this year; the number refers to the year after each update rolls out, presumably because we’ll be using the software for longer in 2026 than in what remains of this year once the full version is in the wild. It’s also important to keep in mind that any beta is software in a pre-release state, meaning it’s far more likely you’ll encounter bugs, crashes and other issues with apps and in general use, which Apple and third-party developers will attempt to fix before the final version rolls out to users worldwide. Install any beta at your own risk, and think carefully before doing so on the device you use every day. It’s also very important that you back up any device you want to test software on before you download it.

iOS 26 supported devices

iOS 26 is supported on a wide range of iPhones – but not all of them.
You’ll need one of the following models:

iPhone SE (second generation or later)
iPhone 11
iPhone 11 Pro
iPhone 11 Pro Max
iPhone 12
iPhone 12 mini
iPhone 12 Pro
iPhone 12 Pro Max
iPhone 13
iPhone 13 mini
iPhone 13 Pro
iPhone 13 Pro Max
iPhone 14
iPhone 14 Plus
iPhone 14 Pro
iPhone 14 Pro Max
iPhone 15
iPhone 15 Plus
iPhone 15 Pro
iPhone 15 Pro Max
iPhone 16e
iPhone 16
iPhone 16 Plus
iPhone 16 Pro
iPhone 16 Pro Max

If your iPhone isn’t listed above, that probably means it’s too old to run iOS 26, so you’ll need to upgrade to one of the listed models.

How to install the iOS 26 public beta

Installing betas used to be a fiddly process, but it’s very easy these days. If it’s your first time installing an iOS public beta, you’ll need to first visit the Apple Beta Software Program website and sign up using your Apple credentials. After that, navigate to Settings > General > Software Update on your compatible iPhone, and choose “iOS 26 public beta”. You should then see the option to download and install the beta software. You can read about our first experiences with iOS 26 here.

iPadOS 26 supported devices

Here are the supported models for the iPadOS 26 beta:

iPad Pro (M4)
iPad Pro 12.9-inch (3rd generation or later)
iPad Pro 11-inch (1st generation and later)
iPad Air (M3)
iPad Air (M2)
iPad Air (3rd generation and later)
iPad (A16)
iPad (8th generation and later)
iPad Mini (A17 Pro)
iPad Mini (5th generation and later)

How to install the iPadOS 26 public beta

Like with iOS above, you’ll need to first visit the Apple Beta Software Program website and sign up using your Apple credentials if you’ve never taken part in one before. After that, navigate to Settings > General > Software Update on your supported iPad, and choose “iPadOS 26 public beta”. You should then see the option to download and install the beta software in the Software Update screen. You can read about our experiences with iPadOS 26 here.
watchOS beta: Use caution

While Apple Watch users can also participate in beta programs in the same way as iOS and iPadOS testers, doing so carries greater risk. That’s because if you’re not enjoying the experience and decide you want to downgrade to watchOS 11, well, you can’t. Apple doesn’t allow it. And if you downgrade your watch’s paired iPhone to iOS 18, your Apple Watch won’t work correctly with your phone until the phone is back on the new software version. You can, however, leave your Apple Watch on watchOS 11 while your phone is on the iOS 26 beta. Be extremely sure, then, that you’re comfortable with the possibility of waiting out a few months with an Apple Watch plagued by issues before downloading the watchOS 26 beta. For most people, it likely isn’t worth the hassle.

watchOS 26 supported devices

You’ll need one of these models to run the watchOS 26 beta:

Apple Watch SE (2nd generation)
Apple Watch Series 6
Apple Watch Series 7
Apple Watch Series 8
Apple Watch Series 9
Apple Watch Series 10
Apple Watch Ultra
Apple Watch Ultra 2

How to install the watchOS 26 public beta

If you’ve assessed the risks for your Apple Watch and still choose to install the watchOS 26 beta, you’ll first need to have already updated your paired iPhone to the iOS 26 beta (see above). After that, make sure your Apple Watch is paired to your iOS 26-running iPhone and open the Watch app on your iPhone. Then navigate to General > Software Update and choose the watchOS 26 public beta. After doing that, you should be able to download the beta software.

How to install the iOS 26 public beta

Major Quantum Computing Advance Made Obsolete by Teenager (2018)

A teenager from Texas has taken quantum computing down a notch. In a paper posted online earlier this month, 18-year-old Ewin Tang proved that ordinary computers can solve an important computing problem with performance potentially comparable to that of a quantum computer. In its most practical form, the “recommendation problem” relates to how services like Amazon and Netflix determine which products you might like to try. Computer scientists had considered it one of the best examples of a problem that’s exponentially faster to solve on quantum computers — making it an important validation of the power of these futuristic machines. Now Tang has stripped that validation away. “This was one of the most definitive examples of a quantum speedup, and it’s no longer there,” said Tang, who graduated from the University of Texas, Austin, in the spring and will begin a Ph.D. at the University of Washington in the fall. In 2014, at age 14 and after skipping the fourth through sixth grades, Tang enrolled at UT Austin and majored in mathematics and computer science. In the spring of 2017 Tang took a class on quantum information taught by Scott Aaronson, a prominent researcher in quantum computing. Aaronson recognized Tang as an unusually talented student and offered to advise an independent research project. Aaronson gave Tang a handful of problems to choose from, including the recommendation problem. Tang chose it somewhat reluctantly. “I was hesitant because it seemed like a hard problem when I looked at it, but it was the easiest of the problems he gave me,” Tang said. The recommendation problem asks an algorithm to suggest products that a given user will like. Consider the case of Netflix. It knows what films you’ve watched. It knows what all of its other millions of users have watched. Given this information, what are you likely to want to watch next?
You can think of this data as being arranged in a giant grid, or matrix, with movies listed across the top, users listed down the side, and values at points in the grid quantifying whether, or to what extent, each user likes each film. A good algorithm would generate recommendations by quickly and accurately recognizing similarities between movies and users and filling in the blanks in the matrix. In 2016 the computer scientists Iordanis Kerenidis and Anupam Prakash published a quantum algorithm that solved the recommendation problem exponentially faster than any known classical algorithm. They achieved this quantum speedup in part by simplifying the problem: Instead of filling out the entire matrix and identifying the single best product to recommend, they developed a way of sorting users into a small number of categories — do they like blockbusters or indie films? — and sampling the existing data in order to generate a recommendation that was simply good enough. At the time of Kerenidis and Prakash’s work, there were only a few examples of problems that quantum computers seemed to be able to solve exponentially faster than classical computers. Most of those examples were specialized — they were narrow problems designed to play to the strengths of quantum computers (these include the “forrelation” problem Quanta covered earlier this year). Kerenidis and Prakash’s result was exciting because it provided a real-world problem people cared about where quantum computers outperformed classical ones. “To my sense it was one of the first examples in machine learning and big data where we showed quantum computers can do something that we still don’t know how to do classically,” said Kerenidis, a computer scientist at the Research Institute on the Foundations of Computer Science in Paris. 
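The grid-filling idea can be sketched with an ordinary classical low-rank approximation. The toy example below is not the Kerenidis–Prakash algorithm or Tang’s; the ratings matrix, the chosen rank, and the crude mean-imputation of blanks are all invented for illustration:

```python
import numpy as np

# Toy user-by-movie grid: rows are users, columns are movies,
# 0 marks an unrated ("blank") entry that must be predicted.
# Users 0-2 like the first two movies; users 3-4 like the last two.
R = np.array([
    [5, 0, 0, 1],
    [5, 4, 1, 1],
    [4, 5, 1, 1],
    [1, 1, 5, 4],
    [1, 1, 4, 5],
], dtype=float)

def recommend(R, user, rank=2):
    """Fill the blanks via a rank-`rank` SVD approximation and
    return the index of the user's best unrated movie."""
    filled = R.copy()
    filled[R == 0] = R[R > 0].mean()        # crude imputation of blanks
    U, s, Vt = np.linalg.svd(filled, full_matrices=False)
    approx = (U[:, :rank] * s[:rank]) @ Vt[:rank]  # low-rank reconstruction
    unrated = np.flatnonzero(R[user] == 0)
    return unrated[np.argmax(approx[user, unrated])]

print(recommend(R, user=0))  # → 1: user 0 matches the tastes of users 1 and 2
```

The low-rank reconstruction smooths the grid so a user’s blanks get filled in from the ratings of similar users — the same “good enough” simplification the article describes, where users fall into a small number of taste categories.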
Kerenidis and Prakash proved that a quantum computer could solve the recommendation problem exponentially faster than any known algorithm, but they didn’t prove that a fast classical algorithm couldn’t exist. So when Aaronson began working with Tang in 2017, that was the question he posed — prove there is no fast classical recommendation algorithm, and thereby confirm Kerenidis and Prakash’s quantum speedup is real. “That seemed to me like an important ‘t’ to cross to complete this story,” said Aaronson, who believed at the time that no fast classical algorithm existed. Tang set to work in the fall of 2017, intending for the recommendation problem to serve as a senior thesis. For several months Tang struggled to prove that a fast classical algorithm was impossible. As time went on, Tang started to think that maybe such an algorithm was possible after all. “I started believing there is a fast classical algorithm, but I couldn’t really prove it to myself because Scott seemed to think there wasn’t one, and he was the authority,” Tang said. Finally, with the senior thesis deadline bearing down, Tang wrote to Aaronson and admitted a growing suspicion: “Tang wrote to me saying, actually, ‘I think there is a fast classical algorithm,’” Aaronson said. Throughout the spring Tang wrote up the results and worked with Aaronson to clarify some steps in the proof. The fast classical algorithm Tang found was directly inspired by the fast quantum algorithm Kerenidis and Prakash had found two years earlier. Tang showed that the kind of quantum sampling techniques they used in their algorithm could be replicated in a classical setting. Like Kerenidis and Prakash’s algorithm, Tang’s algorithm ran in polylogarithmic time — meaning the computational time scaled with the logarithm of characteristics like the number of users and products in the data set — and was exponentially faster than any previously known classical algorithm. 
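The “quantum sampling techniques” that Tang replicated classically boil down to length-squared (ℓ2-norm) sampling: drawing random pieces of the data with probability proportional to their squared magnitude, which mimics the distribution you get by measuring a quantum state. Here is a minimal sketch of just that primitive, with an invented stand-in matrix; it is the building block, not Tang’s full algorithm:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Invented stand-in for a large user-preference matrix.
A = rng.normal(size=(1000, 50))

# Length-squared sampling: pick a row with probability proportional
# to its squared norm, the same distribution a measurement of a
# quantum state encoding the rows would produce.
row_norms_sq = (A ** 2).sum(axis=1)
probs = row_norms_sq / row_norms_sq.sum()
rows = rng.choice(A.shape[0], size=25, p=probs)

print(rows.shape)  # 25 sampled row indices, biased toward heavy rows
```

Roughly speaking, a handful of such samples can stand in for the whole matrix — a small sampled submatrix approximates the big one’s singular structure — which is how the running time ends up scaling with the logarithm of the matrix dimensions rather than the dimensions themselves.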
Once Tang had completed the algorithm, Aaronson wanted to be sure it was correct before releasing it publicly. “I was still nervous that once Tang put the paper online, if it’s wrong, the first big paper of [Tang’s] career would go splat,” Aaronson said. Aaronson had been planning to attend a quantum computing workshop at the University of California, Berkeley, in June. Many of the biggest names in the field were going to be there, including Kerenidis and Prakash. Aaronson invited Tang to come


OpenAI prepares to launch GPT-5 in August

Earlier this year, I heard that Microsoft engineers were preparing server capacity for OpenAI’s next-generation GPT-5 model, arriving as soon as late May. After some additional testing and delays, sources familiar with OpenAI’s plans tell me that GPT-5 is now expected to launch as early as next month. OpenAI CEO Sam Altman recently revealed on X that “we are releasing GPT-5 soon” and even teased some of its capabilities in a podcast appearance with Theo Von earlier this week. Altman decided to let GPT-5 take a stab at a question he didn’t understand. “I put it in the model, this is GPT-5, and it answered it perfectly,” Altman said. He described it as a “here it is moment,” adding that he “felt useless relative to the AI” because he felt like he should have been able to answer the question but GPT-5 answered it instantly. “It was a weird feeling.” GPT-5 had already been spotted in the wild before Altman’s appearance on This Past Weekend, fueling speculation that the next-generation GPT model was imminent. I understand OpenAI is planning to launch GPT-5 in early August, complete with mini and nano versions that will also be available through its API. I reached out to OpenAI to comment on the launch of GPT-5 in August, but the company did not respond in time for publication. Altman referred to GPT-5 as “a system that integrates a lot of our technology” earlier this year, because it will include the o3 reasoning capabilities instead of shipping those in a separate model. It’s part of OpenAI’s ongoing efforts to simplify and combine its large language models to make a more capable system that can eventually be declared artificial general intelligence, or AGI. The declaration of AGI is particularly important to OpenAI, because achieving it will force Microsoft to relinquish its rights to OpenAI revenue and its future AI models. 
Microsoft and OpenAI have been renegotiating their partnership recently, as OpenAI needs Microsoft’s approval to convert part of its business to a for-profit company. It’s unlikely that GPT-5 will meet the AGI threshold that’s reportedly linked to OpenAI’s profits. Altman previously said that GPT-5 won’t have a “gold level of capability for many months” after launch. Unifying its o-series and GPT-series models will also reduce the friction of having to know which model to pick for each task in ChatGPT. I understand that the main combined reasoning version of GPT-5 will be available through ChatGPT and OpenAI’s API, the mini version will likewise be available in both, and the nano version is expected to be API-only. While GPT-5 looks likely to debut in early August, OpenAI’s planned release dates often shift in response to development challenges, server capacity issues, or even rival AI model announcements and leaks. Earlier this month, I warned about the possibility of a delay to the open language model that OpenAI is also preparing to launch, and Altman confirmed my reporting just days after my Notepad issue by announcing a delay “to run additional safety tests and review high-risk areas.” I’m still hearing that this open language model is imminent and that OpenAI is trying to ship it before the end of July — ahead of GPT-5’s release. Sources describe the model as “similar to o3 mini,” complete with reasoning capabilities. It will be the first open-weight model OpenAI has released since GPT-2 in 2019, and it will be available on Azure, Hugging Face, and other large cloud providers.

Microsoft is in the security hot seat again

Microsoft made security its top priority last year, following years of security issues and mounting criticism after a scathing report from the US Cyber Safety Review Board. The company has been working to improve its “inadequate” security culture ever since.
But this week, we were reminded of Microsoft’s challenges once again. A major security flaw in Microsoft’s on-premises versions of SharePoint allowed hacking groups to exploit a zero-day vulnerability and breach more than 50 organizations — including the US nuclear weapons agency. Security researchers discovered the vulnerability was being exploited on July 18th, and Microsoft issued an alert a day later. Microsoft engineers then spent all weekend working on patches and released updates for SharePoint Subscription Edition and SharePoint 2019 late on July 20th. A patch for SharePoint 2016 servers was released on the morning of July 22nd. The previously unpatched flaw appears to have originated from a combination of two bugs that were presented at the Pwn2Own hacking contest in May. Microsoft has linked the attacks to two hacking groups that are affiliated with the Chinese government, but the company hasn’t disclosed exactly how hackers were able to bypass its patches to create a zero-day exploit. The security flaw was only exploitable through on-premises versions of SharePoint, so the Microsoft 365 version of SharePoint Online was unaffected. This certainly limited the scale of damage, but the targeted nature of these attacks will be hugely concerning for Microsoft and the company’s customers. It’s also likely to accelerate a move away from these older versions of SharePoint, which are in the extended support phase until July 2026. Complicating the concern around Microsoft’s security practices is a new report from ProPublica that warns of a little-known Microsoft program that could expose the US Defense Department to Chinese hackers. Microsoft has been using engineers in China to help maintain the department’s computer systems, with digital escorts that reportedly lack the technical expertise to properly police foreign engineers. 
It’s a troubling development after the Office of the Director of National Intelligence called China the “most active and persistent cyber threat to US Government, private-sector, and critical infrastructure networks.” On the same day the SharePoint exploit was discovered, Microsoft’s head of communications, Frank Shaw, responded to the ProPublica report and announced changes to “assure that no China-based engineering teams are providing technical assistance for DoD Government cloud and related services.” Sources tell me

