Best Apple Watch Apps: Don’t Bother With Third-Party Options

There are plenty of apps for the Apple Watch, but Apple’s native apps are still among the best.

The Apple Watch Series 8 is an iterative upgrade over the Series 7, but with each new generation the watch gets more capable, especially when it comes to tracking your health and fitness. If you want to take advantage of the best Apple Watch apps, we have some pretty straightforward advice: Skip the App Store and stick with the watch’s native apps.

Companies including Amazon, eBay, Target, Slack and TripAdvisor have dropped support for their Apple Watch apps, but those services are better suited to our phones, tablets and laptops anyway. What matters are the built-in Activity, Messages and Phone apps — the things we want on hand for a quick and convenient glance, regardless of which Apple Watch version we’re currently sporting.

“The watch is really about convenience,” said Ray Wang, principal analyst and founder of Constellation Research. “You’re not going to spend so much screen time on your watch. So I think the secret of building a good Apple Watch app is to think of it as an accessory in addition to something. Very few people use it as a standalone unless it’s for fitness or health or some kind of monitoring.”

Read more: Set Up Your New Apple Watch in Just a Few Taps

When the Apple Watch launched in 2015, it had 3,000 apps available to download. Today, there are more than 20,000 apps — 44 of which are built into the wearable. While watches weren’t an in-demand accessory in general back in 2015, the Apple Watch proved to be a useful tool for checking messages, the weather and reminders, Wang added — all of which are already built into the device.

Here are several native Apple Watch apps that you may not already be using.

  1. Sleep
    The Apple Watch was late to the game when it came to sleep tracking — a crucial wellness feature that rivals like Fitbit have offered for years. While Apple’s Sleep app may not be as comprehensive as the sleep monitoring available on other devices, it’s still a great way to keep track of your slumber and settle into a regular bedtime routine. When you wear your Apple Watch overnight, it’ll tell you how much time you spent asleep while in bed, as well as your sleeping respiratory rate. The latter feature is a newer addition that Apple launched with WatchOS 8 in September.
  2. Wallet
    The Apple Watch is designed to make it so that you don’t have to reach for your phone as often, and the Wallet app is one of the best examples. It allows you to store things like credit cards, boarding passes and movie tickets on your wrist once you’ve added them to the Wallet app on your phone. That means you won’t have to dig into your purse or pocket to make a quick purchase or board your flight. Apple is also expanding what the Wallet app can do in WatchOS 8, which introduces the ability to add home keys and identification cards to your watch.
  3. Messages
    The Messages app is one of the most basic and fundamental Apple Watch apps, but it’s also among the most useful. As the name implies, Messages allows you to read and respond to text messages directly from your wrist. Your phone is still the best tool for sending long text messages, but the Apple Watch can come in handy for sending short, time-sensitive replies when you don’t have a moment to reach for your phone. If you have an Apple Watch Series 7 or later, you’ll be able to respond to texts using the QWERTY keyboard, which is much easier than using the Scribble function.
  4. Noise
    If you have an Apple Watch Series 4 or later, you can use the Noise app to measure the ambient sound in your environment. If the decibel level has risen to a point where your hearing could be affected, the app can notify you with a tap on your wrist.

Read more: Apple Watch Series 7 Review: A Slightly Better Smartwatch Than Last Year’s

  5. Cycle Tracking
    You can use the Cycle Tracking app to log details about your menstrual cycle, including flow information and symptoms such as headaches or cramps. Using that data, the app can alert you when it predicts your next period or fertile window is about to start.
  6. ECG
    If you have an Apple Watch Series 4 or later, you have an electrical heart sensor that works with the ECG app to take an electrocardiogram (sometimes called an EKG by cardiologists). You’ll also need an iPhone 6S or later, and both the phone and the watch need to be on the latest versions of iOS and WatchOS, respectively. The feature also isn’t available in all regions.
  7. News
    The News app will help you keep up with current events on the fly, showing you stories that it selects based on your interests. However, it’s not available in all areas.
  8. Mindfulness
    The Apple Watch has long offered breathing exercises. But WatchOS 8’s Mindfulness app, which replaced the Breathe app, adds a new option to the Apple Watch’s relaxation repertoire: reflections that prompt you to pause and think about special moments in your life. You’re still able to access Breathe sessions from this app, but the new Reflect option just gives you another way to take a break from your day.
  9. Remote
    If you have an Apple TV, you can use your watch as another remote control — assuming both devices are connected to the same Wi-Fi network. Use the Remote app to swipe around on the watch face and move through the Apple TV menu options, and play or pause shows.
  10. Camera
    You can’t take a picture with your watch itself. But with the Camera app, your watch can act as a remote control for your iPhone’s camera. Use it to help take selfies or start recording on your phone across the room, so you can finally get everyone in that big group shot.
  11. Walkie-Talkie
    The Walkie-Talkie app lets you use your watch to chat with another person wearing an Apple Watch: You press a button to talk, and release it to listen to the reply. The app isn’t available in all regions, and both participants need connectivity, whether through a Bluetooth connection to the iPhone, Wi-Fi or cellular. You also have to accept an invitation to connect with someone through the app — they can’t just start talking to you.
  12. Voice Memos
    Like on the iPhone, you can use the Voice Memos app on your Apple Watch to record personal notes and things to remember while on the go. The voice memos you record on the watch will automatically sync to any other iOS devices where you’re signed in with the same Apple ID.

The future of native Apple Watch apps
The collection of native Apple Watch apps is likely far from complete. We saw the addition of the Sleep app and the Blood Oxygen app last year with the WatchOS 7 software update and the Apple Watch Series 6, respectively. And if reports are to be believed, Apple has broader ambitions in the health and wellness space that we could see realized in the years to come. The company is reportedly working on blood pressure and thermometer tools for the Apple Watch, according to The Wall Street Journal. Apple is also working on a blood-sugar sensor that could help diabetics manage their glucose levels, Bloomberg reported last year, although that report says the functionality likely won’t be commercially available for several years.

AI as Lawyer: It’s Starting as a Stunt, but There’s a Real Need

People already have a hard enough time getting help from lawyers. Advocates say AI could change that.

Next month, AI will enter the courtroom, and the US legal system may never be the same.

An artificial intelligence chatbot, technology programmed to respond to questions and hold a conversation, is expected to advise two individuals fighting speeding tickets in courtrooms in undisclosed cities. Each will wear a wireless earpiece that relays what the judge says to the chatbot, which is run by DoNotPay, a company that typically helps people fight traffic tickets through the mail. The earpiece will then play the chatbot’s suggested responses to the judge’s questions, which the individuals can choose to repeat in court.

It’s a stunt. But it also has the potential to change how people interact with the law, and to bring many more changes over time. DoNotPay CEO Josh Browder says expensive legal fees have historically kept people from hiring traditional lawyers to fight for them in traffic court, which typically involves fines that can reach into the hundreds of dollars.

So, his team wondered whether an AI chatbot, trained to understand and argue the law, could intervene.

“Most people can’t afford legal representation,” Browder said in an interview. Using the AI in a real court situation “will be a proof of concept for courts to allow technology in the courtroom.”

Regardless of whether Browder is successful — he says he will be — his company’s actions mark the first of what are likely to be many more efforts to bring AI further into our daily lives.

Modern life is already filled with the technology. Some people wake up to a song chosen by AI-powered alarms. Their news feed is often curated by a computer program, too, one that’s taught to pick items they’ll find most interesting or that they’ll be most likely to comment on and share via social media. AI chooses what photos to show us on our phones, it asks us if it should add a meeting to our calendars based on emails we receive, and it reminds us to text a birthday greeting to our loved ones.

But advocates say AI’s ability to sort information, spot patterns and quickly pull up data means that in a short time, it could become a “copilot” for our daily lives. Already, coders on Microsoft-owned GitHub are using AI to help them create apps and solve technical problems. Social media managers are relying on AI to help determine the best time to post a new item. Even we here at CNET are experimenting with whether AI can help write explainer-type stories about the ever-changing world of finance.

So, it can seem like only a matter of time before AI finds its way into research-heavy industries like the law as well. And considering that 80% of low-income Americans don’t have access to legal help, while 40% to 60% of the middle class still struggle to get such assistance, there’s clearly demand. AI could help meet that need, but lawyers shouldn’t feel like new technology is going to take business away from them, says Andrew Perlman, dean of the law school at Suffolk University. It’s simply a matter of scale.

“There is no way that the legal profession is going to be able to deliver all of the legal services that people need,” Perlman said.

Turning to AI
DoNotPay began its latest AI experiment back in 2021, when businesses were given early access to GPT-3, the AI tool from the startup OpenAI that underpins ChatGPT, the chatbot that went viral for its ability to answer questions, write essays and even create new computer programs. In December, Browder pitched his idea via a tweet: Have someone wear an Apple AirPod into traffic court so that the AI could hear what’s happening through the microphone and feed responses through the earbud.

Aside from people jeering him for the stunt, Browder knew he’d have other challenges. Many states and districts limit legal advisors to those who are licensed to practice law, a clear hurdle that UC Irvine School of Law professor Emily Taylor Poppe said may cause trouble for DoNotPay’s AI.

“Because the AI would be providing information in real time, and because it would involve applying relevant law to specific facts, it is hard to see how it could avoid being seen as the provision of legal advice,” Poppe said. Essentially, the AI would be legally considered a lawyer acting without a law license.

AI tools raise privacy concerns too. The computer program technically needs to record audio to interpret what it hears, a move that’s not allowed in many courts. Lawyers are also expected to follow ethics rules that forbid them from sharing confidential information about clients. Can a chatbot, designed to share information, follow the same protocols?

Perlman says many of these concerns can be answered if these tools are created with care. If successful, he argues, these technologies could also help with the mountains of paperwork lawyers encounter on a daily basis.

Ultimately, he argues, chatbots may turn out to be as helpful as Google and other research tools are today, saving lawyers from having to physically wade through law libraries to find information stored on bookshelves.

“Lawyers trying to deliver legal services without technology are going to be inadequate and insufficient to meeting the public’s legal needs,” Perlman said. Ultimately, he believes, AI can do more good than harm.

The two cases DoNotPay is participating in will likely shape much of that conversation. Browder declined to say where the proceedings will take place, citing safety concerns.

Neither DoNotPay nor the defendants plan to inform the judges or anyone in court that an AI is being used or that audio is being recorded, a fact that raises ethics concerns. This in itself resulted in pushback on Twitter when Browder asked for traffic ticket volunteers in December. But Browder says the courts that DoNotPay chose are likely to be more lenient if they find out.

The future of law
After these traffic ticket fights, DoNotPay plans to create a video presentation designed to advocate in favor of the technology, ultimately with the goal of changing law and policy to allow AI in courtrooms.

States and legal organizations, meanwhile, are already debating these questions. In 2020, a California task force dedicated to exploring ways to expand access to legal services recommended allowing select unlicensed practitioners to represent clients, among other reforms. The American Bar Association told judges using AI tools to be mindful of biases instilled in the tools themselves. UNESCO, the international organization dedicated to preserving culture, has a free online course covering the basics of what AI can offer legal systems.

For his part, Browder says AI chatbots will become so popular over the next couple of years that courts will have no choice but to allow them. Perhaps AI tools will eventually have a seat at the table, rather than having to whisper in our ears.

“Six months ago, you couldn’t even imagine that an AI could respond in these detailed ways,” Browder said. “No one has imagined, in any law, what this could be like in real life.”

New Apple Music, TV and Devices Apps Now Available on Windows

Microsoft is making it easier to use certain Apple services on Windows.

Last year, during its Oct. 12 Surface event, Microsoft announced that Apple Music and Apple TV would soon be coming to the Microsoft Store, as replacements for Windows alternatives that just weren’t up to par — and that day is now here.

Apple Music, Apple TV and a third app known as Apple Devices (which lets you manage your Apple devices) are now available for you to download, as long as you’re running Windows 11. We’ll briefly discuss what each of these new Apple applications can do for you on Windows, and how you can install them right now.

If you want to learn more about Windows 11, check out the best Windows 11 features and the upgraded Windows 11 features we love the best.

How to download Apple Music, Apple TV and Apple Devices for Windows 11
As long as you’re running Windows 11 version 22621.0 or higher, you can download any of the three apps to your computer. If you’re still running Windows 10 or something older, check out our guide on how to download and install Windows 11.

Now all you have to do is either click the links below or manually search for the apps in the Microsoft Store:

Apple Music (replacement for iTunes): Stream music, listen to podcasts and more from the Apple Music service. You must be a paid subscriber.
Apple TV (replacement for Apple TV web player): Watch Apple TV Plus, movies and more. You must be a paid subscriber as well.
Apple Devices (replacement for iTunes): Manage your Apple devices, including your iPhone, iPad, iPod and iPod Touch. You can sync music, movies and TV shows, as well as update, back up and restore your devices.

If you download Apple Music, Apple TV or Apple Devices (or all three), you’ll no longer be able to use iTunes. The only way to get iTunes back up and running is to uninstall whichever of the three apps you downloaded.

Also, all three Apple apps on Windows are currently previews, which means that not all features may work as expected.

My Favorite Hidden iPhone Shortcut to Turn On the Flashlight (and More)

This simple pro iPhone tip will save you time and fumbling.

My iPhone’s flashlight isn’t just a tool I casually fire up when something accidentally rolls under the couch; it’s a feature I use daily to light up the way to the bathroom in the middle of the night, scan my backyard when animals make weird sounds and… OK, yeah, find something I’ve lost under my couch. And since I use the iPhone flashlight so often, I’ve turned on a tool deep in the iOS settings menu that makes it faster to light up the torch — no more fumbling with the lock screen for the flashlight icon or unlocking the phone first.

I don’t exaggerate when I say this hidden iPhone feature has changed the flashlight for me.

Back Tap for the iPhone is an accessibility feature that Apple introduced with iOS 14. It lets you quickly perform certain actions — say, taking a screenshot or launching your camera — by simply tapping the back of your phone. Essentially, it turns the entire back of your iPhone into a button.

This is an important benefit for all kinds of people, and for me, enabling Back Tap has let me turn it into a customizable button to quickly trigger the iPhone flashlight. I’ll tell you exactly how to set it up for yourself, and you can of course customize Back Tap to trigger other actions.

Also, if you want to learn more about other iPhone and iOS features, check out these 10 next-level iOS 16 features and how to find the “secret” iPhone trackpad.

How to set up Back Tap on iPhone
Whether you want Back Tap to turn on your flashlight, open your camera or launch a different iPhone app, the path through your iPhone settings begins the same way.

On your compatible iPhone (iPhone 8 or later), launch the Settings application and go to Accessibility > Touch > Back Tap. Now you have the option to launch your action (in this case, your flashlight) with either two or three taps. Although two taps is obviously faster, I would suggest three taps because if you fidget with your phone, it’s easy to accidentally trigger the accessibility feature.

Once you choose a tap option, select the Flashlight option — or a different action if you prefer. You’ll see over 30 options to choose from, ranging from system options like Siri or taking a screenshot to accessibility-specific functions like opening a magnifier or turning on real-time live captions. You can also set up Back Tap to open the Control Center, go back home, mute your audio, turn the volume up or down, and run any shortcuts you’ve downloaded or created.

You’ll know you’ve successfully selected your choice when a blue checkmark appears to the right of the action. You could actually set up two shortcuts this way — one that’s triggered by two taps and one that’s triggered by three taps to the iPhone’s back cover.

Once you exit the Settings application, you can try out the newly enabled Back Tap feature by tapping the back of your iPhone — in my case, to turn on the flashlight. To turn off the flashlight, you can tap on the back of your iPhone as well, but you can also just turn it off from your lock screen if that’s easier.

For more great iPhone tips, here’s how to keep your iPhone screen from dimming all the time and how to cancel all those subscriptions you don’t want or need.

Apple Reportedly Plans to Use Own Screens on Mobile Devices

The push would reflect the company’s effort to be less reliant on other companies.

Apple plans to begin using its own custom displays on mobile devices starting in 2024, Bloomberg reported Tuesday.

The push, intended to bring more production in-house, is expected to begin with the Apple Watch by the end of the year, according to the report, which cited people with knowledge of the matter. The displays will also appear on other devices such as the iPhone, according to the report.

Apple’s display endeavor would dovetail with the company’s efforts to make itself less reliant on components provided by third parties — in this case Samsung, which is also a key competitor in the phone market.

This isn’t the first time Apple has gone about developing its own components to reduce costs. The iPhone maker has spent years making its own 5G modem after it purchased the business from Intel in 2019 for $1 billion in order to not rely on chips made by Qualcomm.

Apple didn’t immediately respond to a request for comment.

Apple Reportedly Working on Own Bluetooth, Wi-Fi Chip

The iPhone maker is also working on its own 5G chip that might be in phones in 2024.

Apple will make its own Bluetooth/Wi-Fi chip for its iPhones to replace third-party components, according to a Bloomberg report Monday.

Currently, iPhones include chips from Broadcom to handle Bluetooth and Wi-Fi functions, and by making its own component, Apple could save itself some money. The company could start including the new chip in its phones by 2025, Bloomberg reported.

This is not the first time Apple has gone about developing its own components to reduce costs. The iPhone maker has spent years making its own 5G modem after it purchased the business from Intel in 2019 for $1 billion in order to not rely on chips made by Qualcomm. Following some delays, Apple’s 5G chip could make its way into iPhones starting in late 2024 or early 2025 instead of later this year, according to the Bloomberg report.

Apple and Broadcom didn’t immediately respond to a request for comment.

Apple’s AR/VR Headset: What Could Be Coming in 2023

The company’s next big product should arrive next year. Here’s what we expect.

Apple has been integrating augmented reality into its devices for years, but the company looks like it will leap right into the territory of Meta, Microsoft and Magic Leap with a long-expected mixed-reality headset in 2023.

The target date for this AR/VR headset keeps sliding, with the latest report, in early December from noted analyst Ming-Chi Kuo, suggesting an arrival in the second half of 2023. With an announcement event that could happen as soon as January, we’re at the point where every Apple event feels like the one where the company could finally pull the covers off this device. Bloomberg’s Mark Gurman reported in early January that he’s heard the company is aiming to unveil the headset in the spring, ahead of the annual Worldwide Developers Conference in June.

2023 looks like a year full of virtual reality headsets that we originally expected in 2022, including the PlayStation VR 2 and Meta Quest 3. Apple has already laid down plenty of AR clues, hinting at what its mixed-reality future could hold and has been active in AR on its own iPhones and iPads for years.

As far as what its device could be like, odds are strong that the headset could work from a similar playbook as Meta’s recent high-end headset, the Quest Pro, with a focus on work, mixed reality and eye tracking onboard.

Is its name Reality Pro? Is the software called xrOS?
The latest report from noted Apple reporter Mark Gurman at Bloomberg suggests the operating system for this headset could be called “xrOS,” but that may not indicate the name of the headset itself. Recent trademark filings reported by Bloomberg showed the name “Reality” showing up a lot: Reality One, Reality Pro and Reality Processor. Apple’s existing AR software framework for iOS is named RealityKit, and previous reports suggested that “Reality OS” could be the name for the new headset’s ecosystem.

No one really expected the Apple Watch’s name (remember iWatch?), so to some degree, names don’t matter at this point. But it does indicate that Apple’s moving forward on a product and software, for sure.

One of several headsets?
The headset has been cooking for a long while. Reports have been going around for several years, including a story broken by former CNET Managing Editor Shara Tibken in 2018. Apple’s been building more advanced AR tools into its iPhones and iPads for years, setting the stage for something more.

Whatever the headset might become, it’s looking a lot more real lately. A detailed report from The Information earlier this year discussed likely specs, which include what Bloomberg’s Mark Gurman says is Apple’s latest M2 chip. According to another report from Bloomberg earlier this year, Apple’s board of directors has already seen a demonstration of the mixed-reality headset.

The expected arrival of this headset has kept sliding for years. Kuo previously predicted that Apple’s VR-AR headset would arrive in the fourth quarter of 2022 with Wi-Fi 6 and 6E support. But this VR-type headset could be the start of several lines of products, similar again to how Meta has been targeting future AR glasses. Kuo has previously predicted that Apple smart glasses may arrive in 2025.

Apple could take a dual headset approach, leading the way with a high-end AR-VR headset that may be more like what Meta has done with the Quest Pro, according to Bloomberg’s Gurman. Gurman also suggests a focus on gaming, media and communication on this first-wave headset. In terms of communication, Gurman believes FaceTime using the rumored headset could rely on Memoji and SharePlay: Instead of seeing the person you’re talking to, you’d see a 3D version of their personalized Memoji avatar.

Eventually, Apple’s plans for this headset could become larger. The company’s “goal is to replace the ‌iPhone‌ with AR in 10 years,” Kuo explained in a note to investors, seen by MacRumors. The device could be relatively lightweight, about 300 to 400 grams (roughly 10.5 to 14 ounces), according to Kuo. That’s lighter than Meta’s Oculus Quest 2. However, it’s larger than a normal pair of glasses, with early renders of its possible design looking a lot more like futuristic ski goggles.

Read more: The Metaverse is Just Getting Started: Here’s What You Need to Know

The headset could be expensive, maybe as much as $2,000 or more, with 8K displays, eye tracking and cameras that can scan the world and blend AR and VR together, according to a report from The Information last year. That’s to be expected, considering the Quest Pro costs $1,500 and AR headsets like the Magic Leap 2 and HoloLens 2 are around $3,000.

It’s expected to feature advanced processors, likely based on Apple’s recent M2 chips, and work as a stand-alone device. But it could also connect with Apple’s other devices. That’s not a surprising move. In fact, most of the reports on Apple’s headset seem to line right up with how VR is evolving: lighter-weight, with added mixed-reality features via more advanced pass-through cameras. Much like the Quest Pro, this will likely be a bridge to future AR glasses efforts.

Previous reports on Apple’s AR/VR roadmap suggested internal disagreements, or a split strategy that could mean a VR headset first, and more normal-looking augmented reality smart glasses later. But recent reports seem to be settling down to tell the story of a particular type of advanced VR product leading the way. What’s increasingly clear is that the rest of the AR and VR landscape is facing a slower-than-expected road to AR glasses, too.

Apple has been in the wings all this time without any headset at all, although the company’s aspirations in AR have been clear and well-telegraphed on iPhones and iPads for years. Each year, Apple has made significant strides on iOS with its AR tools. It’s been debated how soon this hardware will emerge (this year, the year after or even further down the road) and whether Apple will proceed with just glasses or with a mixed-reality VR and AR headset, too.

I’ve worn more AR and VR headsets than I can even recall, and have been tracking the whole landscape for years. In a lot of ways, a future Apple AR headset’s logical flight path should be clear from just studying the pieces already laid out. Apple acquired VR media-streaming company NextVR in 2020 and it bought AR headset lens-maker Akonia Holographics in 2018.

I’ve had my own thoughts on what the long-rumored headset might be, and so far, the reports feel well-aligned to be just that. Much like the Apple Watch, which emerged among many other smartwatches and had a lot of features I’d seen in other forms before, Apple’s glasses probably won’t be a massive surprise if you’ve been paying attention to the AR and VR landscape lately.

Remember Google Glass? How about Snapchat’s Spectacles? Or the HoloLens or Magic Leap? Meta is working on AR glasses too, as well as Snap and also Niantic. The landscape got crowded fast.

Here’s where Apple is likely to go based on what’s been reported, and how the company could avoid the pitfalls of those earlier platforms.

Apple declined to comment on this story.

Launch date: Looks likely for 2023
New Apple products tend to be announced months before they arrive, maybe even earlier. The iPhone, Apple Watch, HomePod and iPad all followed this path.

The latest reports from Kuo point to possible delays for the release of the headset to the second half of 2023, but an event announcing the headset could happen as soon as January. That timeframe would make a lot of sense, giving time for developers to understand the concept well ahead of the hardware’s release, and even possibly allowing for Apple’s WWDC developer conference (usually in June) to go over specifics of the software.

Either way, developers would need a long head start to get used to developing for Apple’s headset, and making apps work and flow with whatever Apple’s design guidance will be. That’s going to require Apple giving a heads-up on its hardware well in advance of its actual arrival.

An Apple headset could be a lot like the Meta Quest, but higher end
There’s already one well-polished success story in VR, and the Quest 2 looks to be as good a model as any for where future headsets could aim. Gurman’s report makes a potential Apple VR headset sound a lot like Facebook’s stand-alone device, with controller-free hand tracking and spatial room awareness that could be achieved with Apple’s lidar sensor technology, introduced on the iPad Pro and iPhone 12 Pro.

Apple’s headset could end up serving a more limited professional or creative crowd. But it could also go for a mainstream focus on gaming or fitness. My experiences with the Oculus Quest’s fitness tools feel like a natural direction for Apple to head in, now that the Apple Watch is extending to subscription fitness training, pairing with TVs and other devices.

The Oculus Quest 2 (now officially the Meta Quest 2) can see through to the real world and extend some level of overlap of virtual objects like room boundaries, but Apple’s headset could explore passthrough augmented reality to a greater degree. I’ve seen impressive examples of this in headsets from companies such as Varjo. It could be a stepping stone for Apple to develop 3D augmented reality tech on smaller glasses designs down the road.

Right now, there aren’t any smart glasses manufacturers able to develop normal-looking glasses that can achieve advanced, spatially aware 3D overlays of holographic objects. Some devices like the Nreal Light have tried, with mixed success. Meta’s first smart glasses, Ray-Ban Stories, weren’t AR at all. Meta is working on ways to achieve that tech later on. Apple might take a similar approach with glasses, too.

The VR headset could be a ‘Pro’ device
Most existing reports suggest Apple’s VR headset would likely be so expensive — and powerful — that it will have to aim for a limited crowd rather than the mainstream. If so, it could target the same business and creative professionals that more advanced VR headsets like the Varjo XR-3 and Meta Quest Pro are already aiming for.

I tried Varjo’s hardware, and my experience with it could hint at what Apple’s headset might focus on. It has a much higher-resolution display (which Apple is apparently going to try to achieve), can blend AR and VR into mixed reality using its passthrough cameras, and is designed for pro-level creative tools. Apple could achieve something similar with its lidar sensors. The Quest Pro does something similar, but in a standalone device without as high-end a display.

Varjo’s headset, and most professional VR headsets, are tethered to PCs with a number of cables. Apple’s headset could work as a standalone device, like the Quest 2 and Quest Pro, and also work when connected to a Mac or iPad, much like the Quest 2 already does with Windows gaming PCs. Apple’s advantage could be making a pro headset that is a lot more lightweight and seamlessly standalone than any other current PC-ready gear. But what remains unknown is how many apps and tools Apple will be able to introduce to make its headset feel like a tool that’s truly useful for creators.

Controls: Hand tracking or a small wearable device?
The Information’s previous reports on Apple’s headset suggest a more pared-down control system than the elaborate and large game controller-like peripherals used by many VR headsets right now. Apple’s headset should work using hand tracking, much like many VR and AR headsets already enable. But Apple would likely need some sort of controller-type accessory for inputs, too. Cracking the control and input challenge seems to be one of the bigger hurdles Apple could face.

Recent patent filings point to a possible smart ring-type device that could handle air gestures and motion, and maybe even pair with other accessories. It’s also possible that Apple might lean on some of its own existing hardware to act as inputs, too.

Could that controller be an Apple Watch? Possibly, but the Apple Watch’s motion-control capabilities and touchscreen may not be enough for the deeper interactions an Apple headset would need. Maybe iPhones could pair and be used as controllers, too. That’s how Qualcomm is envisioning its next wave of phone-connected glasses.

Lenovo Reimagines Laptops at CES With Acrobatic Dual Screens

The ThinkBook Plus Twist could change the future of hybrid work with its OLED and color E Ink display.

Lenovo is always good for some interesting announcements at CES and it did not disappoint this year with not one, but two dual-screen laptops.

The more consumer-focused model is the Yoga Book 9i, which is essentially two 13.3-inch 2.8K OLED displays attached at the center with a 360-degree soundbar hinge like the one used in its Yoga 9i two-in-one. This allows the screens to be used in a variety of ways, such as one big vertical display (pictured above) or horizontally (pictured below), with windows able to move or flow between the two screens. A compact Bluetooth keyboard is included, as well as an origami-style folding stand to support the Yoga Book in either position.

You won’t always have room to use the Yoga Book 9i as dual displays, though. (It would be awkward on my train commute for sure.) To use it as a regular laptop with a single 13.3-inch screen, you can call up a full onscreen haptic keyboard on the bottom display — the one without the webcam above it — with a 10-finger gesture as if you’re starting to type on a keyboard.

If you’re not a fan of typing on glass — and I’m not — the included Bluetooth keyboard magnetically attaches to the top of the bottom display, and the area below it turns into a touchpad. You can also slide the keyboard to the bottom half of the screen and continue using the top half as extra display space.

A Lenovo-designed active pen is included, too, and there’s unique software for notetaking so you can take advantage of the two displays. For example, you can watch a video conference on one screen while you take notes on the other. And when you’re done working, the stand doubles as a carrying case to hold the pen and keyboard.

Powered by 13th-gen Intel Core processors, the Yoga Book 9i will start at $2,100 and is expected to be available starting June 2023. Availability for the UK and Australia wasn’t announced but the price converts to approximately £1,765 or AU$3,110.

The ThinkBook Plus has been around for a few years now and is a completely different take on a dual-screen laptop. The second-gen model was a traditional laptop with the exception of an E Ink display on its lid. The Gen 3 model switched things up by putting a secondary 8-inch display next to the keyboard. For the new ThinkBook Plus Twist, Lenovo went back to the Gen 2 design but this time used a color E Ink display and put the displays on a rotating hinge.

So the laptop has both a 13.3-inch 2.8K OLED display and a 12-inch E Ink touch display. When you rotate the E Ink screen to sit in front of the keyboard, it turns off the OLED and makes the E Ink screen the main display for Windows 11. Unfortunately, you can’t have the displays mirror each other, a feature that would be handy for commercial uses or giving presentations.

Both displays work with the bundled pen. The E Ink display is handy for jotting down a quick note, like a list of action items in a meeting, without opening the laptop. It can also be used for notetaking, checking your email and calendars, keeping an eye on notifications and so on. And with the display rotated, you can do all of that with the main OLED display, and much more.

So why make it rotate? When I asked Lenovo that question, the answer was more or less this: The E Ink screen lets you read and keep an eye on things without opening the laptop, while the OLED can flip around for tablet mode, which makes more sense for streaming video, drawing and anything else that benefits from a full-color screen. And because E Ink sips power, you can also use it to get simple office work done in Windows when your battery’s running too low to drive the OLED.

Motorola’s New Phone Is for People Who Really, Really Love Their Lenovo ThinkPad

The business-oriented ThinkPhone takes its design cues from Lenovo’s popular laptop line.

Lenovo’s popular ThinkPad laptop line is finally getting a mobile sidekick. The ThinkPhone by Motorola, announced at CES, will have a similar aesthetic to that of the ThinkPad computers, down to their signature red button. Motorola, which is owned by Lenovo, appears to have focused on three specific areas for the business-focused ThinkPhone: security, durability and productivity. Motorola has not said how much the device will cost.

Read more: Here are the must-see CES highlights, the wackiest products revealed and the most futuristic tech we’ve seen.

Among the ThinkPhone’s most interesting features is its customizable red key, which can be used to launch certain apps or features, such as the Walkie Talkie functionality in Microsoft Teams. It sounds similar to the programmable button on Samsung’s XCover6 Pro, a phone that was also developed for enterprise and industrial uses.

It’s a work-oriented device, so the ThinkPhone unsurprisingly includes a variety of productivity features that make it easier to connect the phone to your PC. Many of these features are already available through Motorola’s existing Ready For software, which you can find on previously launched phones like the Motorola Edge 20 and Edge 20 Plus. These include the ability to use the ThinkPhone as your laptop’s webcam for video calls, automatic connectivity to your Windows laptop via Wi-Fi when it’s nearby, drag-and-drop file transfers between the ThinkPhone and your Windows computer and a unified clipboard. The difference, however, is that some of these features can be accessed with the ThinkPhone’s red key, making it stand out from Motorola’s other devices.

Motorola is also positioning the ThinkPhone as ideal for storing sensitive work-related information. The device has a separate processor called Moto KeySafe, which isolates PINs, passwords and other sensitive data. Lenovo and Motorola’s ThinkShield and Moto Threat Defense software can also be found on the device. The ThinkPhone also has tools that allow IT departments to manage aspects like lock screen settings and network alerts.

As for durability, the ThinkPhone is MIL-STD-810H certified and is constructed from an aramid fiber that Motorola says is stronger than steel. It also has Gorilla Glass Victus and should be able to withstand drops from up to 1.25 meters. Like most modern phones, the ThinkPhone has IP68 water resistance.

Many of the ThinkPhone’s other specifications are similar to the ones found on standard flagship smartphones. The phone has a 6.6-inch display, Android 13 and runs on Qualcomm’s Snapdragon 8 Plus Gen 1 processor. There’s a 50-megapixel main camera and a 13-megapixel ultrawide camera along with a 32-megapixel selfie camera.

The announcement comes as there’s been more emphasis on cross-platform compatibility between smartphones and laptops throughout the industry. Apple’s MacOS Ventura update, for example, introduced the ability to use your iPhone as a Mac webcam and seamlessly move FaceTime calls between your iPhone and Mac computer. Google’s Phone Hub feature lets you do things like check your phone’s signal or battery status, sync notifications and access photos from the camera roll on your Chromebook.

The iPhone Has a New Siri Voice Command You’ll Want to Know About

This feature only works on iOS 16.

You can use your voice to do so much on your iPhone. Thanks to Siri, you can do really basic things like sending a text message or getting directions, or you can get more complicated and use your voice to pull up all the movie showtimes for your local theater — no hands needed.

Apple is always adding new commands to Siri, and with the somewhat recent release of iOS 16, there’s one particular addition I’m most excited about.

You can finally use your voice to restart your iPhone.

Anytime I notice a software issue with my iPhone, like applications automatically force-closing, a laggy operating system or unresponsive features, I reboot my device to hopefully fix these bugs. And many times it does.

Don’t miss: iOS 16.2 on Your iPhone: Every New Feature, Tool and More

However, until now the only ways to restart my iPhone were turning the phone off and on again or force-restarting it. Both of these options require the use of my hands and take several steps, but with iOS 16 it’s so much easier. If you’re having any issues and need to reboot your device, here’s how to do it with just your voice.

You should also check out these 10 hidden iOS 16 features for your iPhone and the complete guide you need to master your iPhone’s latest software update.

Restart your iPhone using this simple voice command
As long as you have the “Hey Siri” feature enabled, which constantly listens for the two-word phrase, you can say the following to restart your iPhone:

First, say “Hey Siri” to activate Siri.
Next, say “Restart iPhone.”
And last, say “Yes” when Siri prompts you to confirm.
Your iPhone will then restart. You’ll need to enter your passcode to unlock your screen.

You can also use this new feature on the iPad, but you’ll need to be running at least iPadOS 16.1.

If you don’t have “Hey Siri” enabled, you can go to Settings > Siri & Search and toggle on Listen for “Hey Siri.” If you don’t want your iPhone listening for this command all the time, you can always activate Siri by holding down the side button for a second, although this does defeat the whole hands-free aspect of restarting your iPhone.