DoorDash can import grocery lists from iOS’ Reminders app

Though I do love walking through a supermarket and picking out my own foods, I will admit that, come winter, I often turn to delivery apps to get my groceries. DoorDash, one of the many delivery apps on the market, has launched a new feature that could make this process even more seamless, allowing iOS users to import their grocery lists from Reminders into the app.

To take advantage of this, you can either copy your list from Reminders or import it directly in the DoorDash app. While you're shopping inside a store, a box should appear on the page asking "Got a grocery list?" From there, you can tap import and choose which list you want to sync based on its title and a preview of the items. DoorDash will then show you options based on your list: if you wrote "onions," for example, it will let you scroll through the different onions for sale, with your next item and its options listed below.

DoorDash is also unveiling other changes, such as letting you add items from multiple stores to an order before placing it. The company has offered DoubleDash since 2021, but that only allowed you to add items from nearby stores after placing the original order.

This article originally appeared on Engadget at https://www.engadget.com/apps/doordash-can-import-grocery-lists-from-ios-reminders-app-140020164.html?src=rss

WhatsApp custom lists are here to help you keep track of convos

WhatsApp just announced a custom lists feature to help users keep track of the neverending glut of conversations. This is basically a refined version of the chat filters feature that was released earlier this year.

The appropriately-named Lists tool allows people to filter chats via a myriad of custom categories. Users can create lists for family members, friends, coworkers, neighbors or just about anyone else. The platform says that these tools “help you focus on the conversations that are most important, when you need them.”

Just like the Favorites feature, users can add both group chats and one-on-one chats to any list. Look for the “+” icon in the filter bar at the top of the Chats tab to get started. WhatsApp is introducing this update today, but it’s a tiered rollout so it could take a few weeks to reach everyone.

This is just the latest update for the world’s favorite chat app. The platform announced that users can now add contacts from any device, and not just the primary smartphone affiliated with the account. WhatsApp recently hit 100 million users in the US, though that figure pales to the two billion users across the globe.

This article originally appeared on Engadget at https://www.engadget.com/apps/whatsapp-custom-lists-are-here-to-help-you-keep-track-of-convos-173525237.html?src=rss

Bridge Command lets you live out your starship fantasies

In 2016, I dragged my Engadget colleagues to preview Star Trek: Bridge Crew, a VR title letting you live out your fantasies of sitting on the bridge of a starship. Sadly, despite having two fans on the team, we failed miserably at the game, a wound I’ve been nursing ever since. When Bridge Command, London’s latest attraction, asked me if I wanted to try out its real-world equivalent, I leapt at the chance. After all, this wasn’t just me testing out a new sci-fi themed event, it was a shot at redemption.

Bridge Command sits in the space between an escape room, team-building exercise, live-action roleplay and immersive theater. It’s essentially a paid-for LARP where you play space captain on a custom-built starship set that cost £3 million (around $4 million) to construct. In order to survive and succeed, each player must work with their team, communicate and solve problems on the fly for the better part of two hours.

ASIDE: There are plenty of existing bridge simulator roleplaying games, and a small but vibrant community that supports them. Digital platforms like Thorium Nova, Artemis and EmptyEpsilon all enable folks to gather and play in teams. Bridge Command itself is built on top of EmptyEpsilon, albeit with some degree of customization.

Effort has been made to ensure Bridge Command isn’t a one-and-done experience, and creator Parabolic Theatre hopes to build a base of recurring fans. There are two different “ships” players can crew, the smaller UCS Havock and the far larger UCS Takanami, which do two different jobs in the fleet. In terms of capacity, both vessels can take up to 14 players at a time, but the ideal figure is around nine. There are four different mission types, too:

  • Exploration: Involving discovery and adversity.

  • Military: Space dogfighting.

  • Intrigue: Espionage and more subtle action.

  • Diplomacy: Making nice with alien races.

With two ships and four missions, you can play the game eight times and theoretically get a new experience every time. But creator Parabolic Theatre will look to develop the game’s running story over time, like a long-running D&D campaign. The game even tracks your performance as your career progresses, and you can receive promotions after a particularly successful mission.

I dragged a Trek-loving friend along to one of the previews, which set us on a Military mission on the UCS Havock. We were tasked with escorting a resupply run to a large warship out on assignment, a rather mundane job. It’s not much of a spoiler to suggest our gang of plucky underdogs might wind up in over their heads on a far grander mission. Or that they’ll need to take the under-equipped ship to go toe-to-toe with the baddies and win out against impossible odds.

Both “ships” are fully-realized starship sets, which are probably better-assembled than what you’ll see on most sci-fi series. They’re designed to withstand the regular punishment that can only occur when crews of friends come to play spaceships. But once you’re onboard, you’re essentially in a self-contained environment for the duration of the mission. And it’s a pretty impressive piece of set design.

The vibe is distinctly Star Voyage (Not Infringing Any Copyright, Promise!), with the Havock laid out like the USS Defiant, but with the paint job from Red Dwarf’s first two seasons. A trio of terminals line each side wall, with the captain’s chair on a raised dais in the middle. There’s a helm console up front that’s pointed directly at the imposing viewscreen that dominates the room. There’s a ready room off to one side of the bridge and a toilet on the other, while the corridor behind the bridge is the ship’s engineering bay, bunkroom and brig.

Everything from the terminals to the set itself is linked up, so if a subsystem takes damage, you won’t just see it grayed out on your screen. Built-in dry ice machines will emit “smoke” when something goes wrong or you take a nasty hit from an enemy vessel. If the lights had flashed at the same time, I’d have been tempted to start jostling myself around in my seat to add to the immersion.
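
Out of curiosity about how that kind of set integration might be wired up, here’s a minimal, purely illustrative Python sketch: a loop that polls a simulator for subsystem health and fires a smoke machine once a system drops below a damage threshold. Every name in it (the sim object, get_system_health, the smoke_relay driver) is hypothetical; none of this reflects Bridge Command’s or EmptyEpsilon’s actual interfaces.

```python
import time

DAMAGE_THRESHOLD = 0.5  # fire the effect when a system drops below 50 percent health


def run_effects_loop(sim, smoke_relay, systems=("engines", "weapons", "shields")):
    """Poll simulator state and trigger the smoke machine on heavy subsystem damage."""
    already_triggered = set()
    while True:
        for system in systems:
            health = sim.get_system_health(system)  # hypothetical simulator call, 0.0-1.0
            if health < DAMAGE_THRESHOLD and system not in already_triggered:
                smoke_relay.pulse(seconds=3)        # hypothetical relay driver
                already_triggered.add(system)
            elif health >= DAMAGE_THRESHOLD:
                already_triggered.discard(system)   # system repaired: re-arm the effect
        time.sleep(0.5)
```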

Image: The bridge of the (larger) UCS Takanami (Alex Brenner / Bridge Command).

There were seven of us in the party, including some other journalists and a few regulars attending one of their usual sessions. Your humble narrator took the helm, figuring that I’d played enough Star Trek: Tactical Assault and Star Trek: Bridge Commander to be useful. We had an acting captain, and folks manning the radar, communications, engineering, laser and torpedo stations.

If you’ve ever used a touchscreen, you won’t feel too out of your depth in whatever role you’re given. Not to mention the first half hour of the game is little more than a tutorial to ensure everyone is fluent in what they’ve got to do.

The helm station, for instance, offers a picture of the ship with a 360-degree coordinate ring around it. There are two sliders, one for impulse power and one for warp, and a small square that lets you make minor evasive maneuvers. That’s fine when the ocean-going liners you find in Star Trek are just heading from waypoint to waypoint, but pretty rubbish for combat. And I’m still annoyed you’re locked to a flat plane when space offers so much room for verticality.

Spoiler warning: The following three paragraphs outline my mission in greater detail.

The story begins while you’re putting on your military-issue space boilersuit, with a fictional newsreel playing in the background setting the scene. Once you’ve “transported” from the entrance to the space station, you’re then given a mission briefing and a send-off from the Earth president. Our mission, as outlined, was to escort a freighter on a resupply mission to a battle fleet which was dealing with pirates on the edge of the system.

A member of the Bridge Command team starts as our captain, giving us a tour of the ship and assigning roles for us to play. After we all get used to the basics in what might as well be called the tutorial stage, the captain then departs to help elsewhere. We’re then sent off to scout for incoming threats in nearby nebulas that, quelle surprise, are full of pirates. Naturally, the closer we get to the battle group, the harder the attacks we have to repel, forcing our chief engineer to race around repairing and repowering systems.

We limped to the battle group, repairing and re-armoring before we hatched a plan to play possum and lure out the pirates. The plan worked spectacularly well and, with our hull integrity at just three percent, we were able to take out the pirates’ command-and-control vessel. After being congratulated by the top brass, we were escorted back to the space station for a debrief and a drink in the bar.

End of Spoiler Warning: The following paragraphs do not contain any spoiler material.

It’s important to be aware of one’s own privilege and preferences when reviewing something like this. I found Bridge Command to be enormous fun, and if I lived in London, I suspect it would quickly become a hobby I indulged in with like-minded friends on a monthly basis. At £40 ($50) a session, the cost is a little steep but, even so, you could easily make this a long-running roleplaying game. And I’m sorely tempted to go a few more times when I can just to try and gain those promotions.

If there’s a downside (and it’s not even really a downside per se), it’s that there are phases of play where you’re not doing anything. Or, at least, nothing beyond being a present and useful member of the team while you wait for your colleagues to fulfill their parts of the mission. Given the need for clear oral communication and cooperation, there were plenty of times when the best thing I could do to help my team was shut up and wait.

Given that focus on communication, I suspect it might be a turn-off if you’re a little shy or soft-spoken. The game doesn’t work unless everyone’s talking to share information between consoles, so it’s nearly impossible to sit quietly in the corner. That’s not to say you need to bring any Big Theater Kid energy along, but I can imagine how this would feel like mandatory fun if you were dragged along by your friends or on a work team-building exercise. It’s a damn sight more fun and less painful than paintball, so maybe count your blessings there.

Bridge Command is located at St. George’s Wharf, next to Vauxhall tube station in London. It’s open for most of each day through to late evening, with tickets for a single session starting at £40 (around $50) at off-peak times.

This article originally appeared on Engadget at https://www.engadget.com/entertainment/bridge-command-lets-you-live-out-your-starship-fantasies-140046532.html?src=rss

This dazzling NASA image shows the biggest super star cluster in our galaxy

The James Webb Space Telescope continues to capture images of space that are clearer and more detailed than what we've seen before. One of the latest images it has taken is of a "super star cluster" called Westerlund 1, and it shows an abundant collection of heavenly bodies shining brightly like gemstones. Super star clusters are young clusters containing thousands of times the mass of our sun, all packed into a small area. Our galaxy used to produce more of these clusters billions of years ago, but it doesn't churn out as many stars anymore, and only a few super star clusters still exist in the Milky Way.

Westerlund 1 is the biggest remaining super star cluster in our galaxy, and it's also the closest to our planet. It's located 12,000 light-years away and packs massive stars totaling between 50,000 and 100,000 times the mass of our sun into a region that measures just six light-years across. Those stars include yellow hypergiants that are around a million times brighter than our sun. Since the stars populating the cluster have comparatively short lives, scientists believe it's only around 3.5 to 5 million years old. That's pretty young on a cosmic scale. As such, it's a valuable source of data that could help us better understand how massive stars form and eventually die. We won't be around to see it, but the cluster is expected to produce 1,500 supernovae in less than 40 million years.
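
To put those figures in perspective, here's a quick back-of-the-envelope Python calculation using only the numbers above: the cluster's quoted mass spread over a sphere three light-years in radius (half the six light-year width). It's a rough illustration, not a published density figure.

```python
import math

# Figures from the article: 50,000-100,000 solar masses within a region
# about six light-years across, treated here as a sphere of radius 3 light-years.
radius_ly = 3.0
volume_ly3 = (4 / 3) * math.pi * radius_ly**3  # ~113 cubic light-years

for mass_solar in (50_000, 100_000):
    density = mass_solar / volume_ly3
    print(f"{mass_solar:,} solar masses -> ~{density:.0f} solar masses per cubic light-year")

# The Sun's neighborhood holds far less than one solar mass per cubic light-year,
# so Westerlund 1 is denser by several orders of magnitude.
```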

Astronomers captured an image of the super star cluster as part of an ongoing survey of Westerlund 1 and another cluster called Westerlund 2 to study star formation and evolution. To take the image, they used Webb's Near-Infrared Camera (NIRCam), which was also recently used to capture a gravitationally lensed supernova that could help shed light on how fast our universe is expanding.

This article originally appeared on Engadget at https://www.engadget.com/science/space/this-dazzling-nasa-image-shows-the-biggest-super-star-cluster-in-our-galaxy-120053279.html?src=rss

NASA’s latest supernova image could tell us how fast the universe is expanding

The James Webb Space Telescope's Near-Infrared Camera (NIRCam) captured a curious sight in a region 3.6 billion light-years away from Earth: A supernova that appears three times, at three different periods during its explosion, in one image. More importantly, this image could help scientists better understand how fast the universe is expanding. 

A team of researchers chose to observe the galaxy cluster PLCK G165.7+67.0, also known as G165, for its high star formation rate, which also leads to a higher supernova rate. One image, which you can see above, captures what looks to be a streak of light with three distinct dots that appear brighter than the rest of it. As Dr. Brenda Frye from the University of Arizona explained, those dots correspond to an exploding white dwarf star. It is also gravitationally lensed: a cluster of galaxies sitting between us and the star acts as a lens, bending the supernova's light into multiple images. Frye likened it to a trifold mirror that shows a different image of the person sitting in front of it. It is also the most distant Type Ia supernova, the kind that occurs in a binary star system, observed to date.

Because of that cluster of galaxies in front of the supernova, light from the explosion traveled three different paths, each with a different length. That means the Webb telescope was able to capture three different periods of its explosion in one image: early in the event, midway through and near the end. Trifold supernova images are special, Frye said, because the "time delays, supernova distance, and gravitational lensing properties yield a value for the Hubble constant or H0 (pronounced H-naught)."

NASA describes the Hubble constant as the number that characterizes the present-day expansion rate of the universe, which, in turn, could tell us more about the universe's age and history. Scientists have yet to agree on its exact value, and the team is hoping that this supernova image could provide some clarity. "The supernova was named SN H0pe since it gives astronomers hope to better understand the universe's changing expansion rate," Frye said. 

Wendy Freedman from the University of Chicago led a team in 2001 that found a value of 72 kilometers per second per megaparsec. Other teams have put the Hubble constant between 69.8 and 74 kilometers per second per megaparsec. Meanwhile, this team reported a value of 75.4, plus 8.1 or minus 5.5. "Our team’s results are impactful: The Hubble constant value matches other measurements in the local universe, and is somewhat in tension with values obtained when the universe was young," Frye said. The supernova and the Hubble constant value derived from it need to be explored further, however, and the team expects future observations to "improve on the uncertainties" for a more accurate computation.
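
For a sense of what those numbers imply, the inverse of the Hubble constant gives a crude timescale for the age of the universe (it ignores how the expansion rate has changed over cosmic history). Here's a short, illustrative Python calculation using the values quoted above:

```python
# Naive "Hubble time" estimate: 1 / H0, ignoring changes in the expansion rate.
KM_PER_MPC = 3.0857e19      # kilometers in one megaparsec
SECONDS_PER_GYR = 3.156e16  # seconds in one billion years


def hubble_time_gyr(h0_km_s_mpc: float) -> float:
    """Convert a Hubble constant in km/s/Mpc into a rough age in billions of years."""
    return (KM_PER_MPC / h0_km_s_mpc) / SECONDS_PER_GYR


for h0 in (69.8, 72.0, 75.4):
    print(f"H0 = {h0} km/s/Mpc -> ~{hubble_time_gyr(h0):.1f} billion years")
# Prints roughly 14.0, 13.6 and 13.0 billion years respectively.
```

The widely cited age of about 13.8 billion years comes from folding measurements like these into a full cosmological model, but even this crude version shows why seemingly small disagreements over H0 matter.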

This article originally appeared on Engadget at https://www.engadget.com/science/space/nasas-latest-supernova-image-could-tell-us-how-fast-the-universe-is-expanding-130005672.html?src=rss

Pokémon Sleep now supports smartwatches for more accurate tracking

It’s certainly taken a while, but Pokémon Sleep now offers smartwatch support for sleep tracking. This is fantastic news because, look, smartphones are good at many things, but tracking sleep from underneath a pillow isn’t really one of them. Using a smartwatch should make for more accurate tracking which, in turn, will make Snorlax and his friends happy.

The app works with all of the major smartwatch models, including the Apple Watch, Galaxy Watch and Pixel Watch. It also integrates with certain Fitbit devices. The sleep data syncs with Apple Health and the Android Health Connect app, for later perusal.

We don’t know why smartwatch integration took so long, but it’s worth noting that the app itself took four years from the first announcement until an official release. It’s likely Pokémon Sleep has a small team, as it doesn’t have the same cultural footprint as its walking-based cousin Pokémon Go.

Also, Pokémon Sleep is now on Spotify. You read that right. All of the app’s music tracks are available for daytime (or nighttime) listening via a 34-song playlist. These include three songs that have yet to be added to the app, for the real Snorlax-heads out there. Each track includes a matching visual that showcases the sleep styles of a specific Pokémon. Gotta. Catch. Em. All.

The Spotify tracks are available to anyone, even those with a free account. Pokémon Sleep is available for free on the Apple App Store or Google Play Store.

This article originally appeared on Engadget at https://www.engadget.com/apps/pokemon-sleep-now-supports-smartwatches-for-more-accurate-tracking-152113397.html?src=rss

HTC Vive’s Focus Vision is a $999 stab at high-end VR and mixed reality

HTC Vive is following up its intriguing, yet expensive, XR Elite headset with something that's still quite pricey, the $999 Focus Vision. Built on the same platform as the standalone Vive Focus 3, the upgraded model adds a slew of new features like built-in eye tracking, 16MP stereo color front-facing cameras for mixed reality and automatic IPD adjustment (which makes it easier to share). And with the additional $149 DisplayPort wired streaming kit, gamers can also hook the Focus Vision up to their PCs for more intensive VR experiences.

Judging from the price and features alone, the Focus Vision isn't much of a mainstream consumer play from HTC Vive. But that's to be expected. While Meta has poured tens of billions into making its Quest headsets cheaper and more accessible, without any need to worry about profitability, HTC Vive has leaned towards making more expensive headsets better suited for business and government work. The Focus 3, for example, made its way to the International Space Station to help astronauts exercise and relax.

While the Vive XR Elite looked almost like a pair of over-sized glasses, the Focus Vision doesn't look much different from the Focus 3. It's clearly a standard VR headset, albeit a step above the Meta Quest 3, a device mostly made of cheaper plastic and other low-grade materials. There's plenty of cushioning along the front of the headset and the rear head strap, and there's more than enough room to fit large glasses.

Under the hood, the Vive Focus Vision features a 5K LCD display, delivering 2.5K resolution per eye, a 90Hz refresh rate and a wide 120-degree field of view. (HTC says it'll gain 120Hz support over DisplayPort later this year.) In addition to the two 16MP front-facing cameras, which are positioned like human eyes for distortion-free mixed reality, there's also an infrared flood light for hand tracking in low light, four external tracking cameras and the usual depth sensor.
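
As a rough way to translate those display specs into perceived sharpness, you can estimate angular resolution in pixels per degree from the per-eye pixel count and the field of view. The sketch below assumes "2.5K" means roughly 2,500 horizontal pixels per eye and that each eye sees most of the 120-degree field; both are simplifying assumptions rather than official HTC figures.

```python
# Back-of-the-envelope angular resolution estimate for a VR headset.
def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    return horizontal_pixels / horizontal_fov_deg


# Assumed values, not official specs: ~2,500 px per eye, ~120-degree horizontal FOV.
ppd = pixels_per_degree(2500, 120)
print(f"~{ppd:.0f} pixels per degree")  # roughly 21 ppd

# For context, "retina"-level sharpness is usually pegged at around 60 ppd,
# so small text is readable but individual pixels aren't completely invisible.
```

Since each eye's actual field of view is narrower than the combined figure, the real number is likely a bit higher; the point is simply how the spec-sheet numbers relate to what you see in the headset.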

Once again, HTC has stuck a removable battery pack in the headset's rear strap, but now there's also a small built-in battery offering an additional 20 minutes of standby charge. That means you can swap battery packs without shutting down the headset or breaking your VR immersion. That feature alone could be compelling to organizations where employees will have to wear the Focus Vision for hours on end. HTC claims the headset can last for two hours of continuous use.

With the Vive Focus Vision, HTC is also making a play for high-end VR gaming. While Meta's Quest headsets can connect to gaming PCs wirelessly and with USB-C cables, they're essentially delivering a compressed video feed of VR experiences from those systems. The Focus Vision's DisplayPort kit functions more like a standard PC VR headset: It gives you a direct connection to your computer's video card. You shouldn't see any of the lag or compression artifacts that you occasionally do with Meta Quest to PC connections.

As I expected, the Focus Vision feels very similar to the Focus 3. It's easy to put on and adjust, there's more than enough room for my glasses to fit, and the front and rear cushioning helps it rest comfortably on my noggin. Thanks to the rear battery pack, the headset also feels well-balanced on my head. Other headsets, even Apple's Vision Pro, can feel front-heavy and place pressure on your nose and eyes.

When it comes to the actual VR experience, the Focus Vision delivers what I'd expect from an expensive HTC Vive headset. The 5K display is sharp enough to read small text, and its large field of view makes wandering around locations in Nature Treks VR feel genuinely immersive. I haven't had much of a chance to try PC gaming just yet, but I'm looking forward to delving into that for our review. 

I'm still disappointed by the limited selection of apps in the VivePort store, but once again this isn't a device that needs to cater much to general VR users. Companies relying on the Focus Vision will either use existing enterprise apps or build something for themselves. And gamers likely won't spend much time outside of the wired DisplayPort connection, where they can access the full bounty of their SteamVR libraries.

The Vive Focus Vision is available for pre-order today for $999 ($1,299 for businesses with an additional warranty) until October 17. HTC will also throw in the DisplayPort kit free for early adopters, and there are also three game bundles to choose from. 

Update 9/18 10:55PM: HTC Vive has extended the pre-order window from September 30 to October 17. We've updated the post to reflect that.

This article originally appeared on Engadget at https://www.engadget.com/ar-vr/htc-vives-focus-vision-is-a-999-stab-at-high-end-vr-and-mixed-reality-120054049.html?src=rss

Google’s AI notebook can generate a podcast about your notes

Google's latest update for its AI-powered research tool NotebookLM can turn the materials you want to pore over into a podcast-like audio discussion. The new feature called Audio Overview takes information from documents you've uploaded and then generates a "deep dive" discussion between two AI hosts. In addition to summarizing your sources, Google says the hosts will be able to find links between different topics and even banter back and forth. Based on the example the company posted with its announcement, the AI hosts sounded human enough to listen to, though you could still determine that the voices were AI-generated from their inflections and odd pronunciations of certain words. 

Since the feature is still in its experimental stages, Google admits that it has its limitations. The hosts can only speak English at this time, and they sometimes say inaccurate information, which means you will have to double check your material and ensure you didn't just learn something that's not factual. You also can't interrupt the hosts while they're speaking yet, and it still takes several minutes for NotebookLM to generate an Audio Overview for notebooks with larger files. Biao Wang, Google Labs product manager, wrote in the feature's announcement post that his team is "excited to bring audio into NotebookLM" despite those limitations, since they "know some people learn and remember better by listening to conversations."

The company launched NotebookLM back in 2023 as a sort of digital assistant that you can ask questions about the documents you upload. In June this year, Google announced that NotebookLM had officially started running on Gemini 1.5 Pro, giving it new features and tools, and that it had expanded to over 200 countries and territories.

This article originally appeared on Engadget at https://www.engadget.com/ai/googles-ai-notebook-can-generate-a-podcast-about-your-notes-140004869.html?src=rss

Apple Intelligence for iPhone, iPad and Mac arrives in October

Apple Intelligence is coming next month. The company has revealed that its artificial intelligence platform is arriving on iPhones, iPads and Macs with the iOS 18.1, iPadOS 18.1 and macOS Sequoia 15.1 updates rolling out in October. It will only work on Apple's newer and more powerful devices, though, including the iPhone 15 Pro and the upcoming iPhone 16 models, as well as Macs and iPads running on M-series chips. In addition, the first batch of Apple Intelligence features will only be available in US English. Support for English in Australia, Canada, New Zealand, South Africa and the UK will arrive in December, while support for other languages, including Chinese, French, Japanese and Spanish, is coming next year.

One of the first Apple Intelligence features you'll be able to use is Writing Tools, which can rewrite, proofread and summarize text for you in Mail, Notes, Pages and even in third-party apps. The Memories feature will give you a way to easily create movies in Photos by typing a description of the kind of images you're looking for. You'll even be able to search for specific photos and videos using natural language. And if you want to quickly remove background objects from images without damaging the rest of the photo, you can use the Clean Up tool.

There's also a feature you can use to record, transcribe and summarize audio in Notes and Phone. If you initiate a recording while on a call, for instance, Apple Intelligence will generate a summary after it ends. A new Focus feature called "Reduce Interruptions" will surface only notifications that need immediate attention, while Priority Messages in Mail will put time-sensitive messages at the top based on the contents of those emails. You'll also see summaries of an email's most important information across your inbox and then use Smart Reply, which identifies questions and suggests quick responses, to fire off a quick message. 

Apple says its AI technology will make Siri more natural and more integrated into its platforms, as well. The voice assistant will apparently be able to understand your inquiries, even if you stutter or stumble over your words, and it can follow your train of thought even if you switch between text and voice. Apple says it's releasing more AI features over the next few months, including one that can generate an image using context when you circle an empty space and another that can create original emoji (or "Genmoji") based on a description you type.

This article originally appeared on Engadget at https://www.engadget.com/ai/apple-intelligence-for-iphone-ipad-and-mac-arrives-in-october-120502268.html?src=rss