Microsoft Accessibility Vision Guide

Create and Move Between Worksheets in Excel

A worksheet is a collection of cells within an Excel file. Worksheets allow you to organise and manipulate different sets of data. Each Excel workbook can contain multiple worksheets. For example, you could have a worksheet showing data for each month of the year all within the same Excel file.
When you create an Excel workbook, there is just one worksheet called Sheet1. The name of the worksheet appears on its sheet tab at the bottom of the Excel window.
To create a new worksheet, press Shift + F11.
To move between worksheets, press Control + PageDown (next sheet) or Control + PageUp (previous sheet).
Note: the position of your PageUp and PageDown keys may vary depending on your keyboard. On a standard desktop keyboard they are in the set of six keys above the Arrow keys, on the right-hand side.
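If you ever need to do the same thing programmatically, here is a minimal sketch using the third-party openpyxl library (my own example, not part of the original tip):

    # pip install openpyxl
    from openpyxl import Workbook

    wb = Workbook()                 # a new workbook starts with one sheet
    wb.active.title = "January"     # rename the default sheet
    for month in ("February", "March"):
        wb.create_sheet(month)      # add a worksheet (the keyboard route is Shift + F11)

    print(wb.sheetnames)            # ['January', 'February', 'March']
    ws = wb["February"]             # jump straight to a sheet by name
    ws["A1"] = "Sales"
    wb.save("months.xlsx")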

https://sharons-shortcuts.ie/create-and-move-between-worksheets-in-excel/

Tips & Tricks for Testing Accessibility with Assistive Technologies

There are many ways to perform accessibility testing, each with its own strengths and weaknesses. Testing with assistive technologies is a great way to get a clear understanding of how your system behaves for real users – assuming that the tester is able to use the assistive technologies they're testing with effectively. As a kickoff point, here are some valuable tips and tricks we've discovered during our own testing experiences.

Know when it’s time to test

Our first tip is to avoid testing with assistive technologies if you’re not ready to do so. At a minimum, you should ensure you’re passing automated accessibility testing before firing up a screen reader.

Assistive technology software, like all software, has bugs. It can sometimes be difficult to track down where in the technology stack an issue exists. Is it in the accessibility API? Is it in the browser? Is it in the assistive technology? Or is it in your website? For clarity's sake, make sure you've tackled the things an automated tool like Tenon can tell you before beginning any AT testing.

Test with a purpose in mind

It rarely makes sense to just turn on a screen reader and start tabbing around the page. This practice does not accurately represent the experience of real users who visit your site with a screen reader. Additionally, it isn’t a very efficient way to find accessibility problems. Therefore, you need to test with a specific purpose in mind.

For instance, using assistive technologies when performing a scripted use case test is great. Using a screen reader to find out whether images have text alternatives is not so great, because an automated tool like Tenon is a faster way to get that information.
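That kind of image check takes only a few lines to automate. Here is a minimal sketch using the requests and BeautifulSoup libraries (my own illustration; a dedicated tool like Tenon performs far more thorough checks, and example.com stands in for your site):

    # pip install requests beautifulsoup4
    import requests
    from bs4 import BeautifulSoup

    # Flag <img> elements with no alt attribute at all. An empty alt=""
    # is valid for decorative images, so only a missing attribute is reported.
    html = requests.get("https://example.com").text
    soup = BeautifulSoup(html, "html.parser")

    for img in soup.find_all("img"):
        if not img.has_attr("alt"):
            print("Missing text alternative:", img.get("src", "<no src>"))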

Checking for bugs

If you've done your part in making sure you've followed good semantic markup practices and are passing automated tests in Tenon, but AT testing still doesn't behave as expected, it could be an AT or browser bug. If you suspect this is the case, you'll want to track it down and report it properly.

Digital A11Y provides links to various issue trackers for assistive technologies and browsers. In any instance where something isn’t right and you think it should be, it doesn’t hurt to check and see if you’re experiencing a bug elsewhere in the technology stack.

Specifics with screen readers

Screen readers can be difficult to please, especially because they don't have any way of understanding context. There is only so much you can do to influence how a screen reader pronounces written text. Understand that mispronunciations on the screen reader's part are not accessibility violations.

As such, you don’t have to get hung up on correcting how a screen reader pronounces something. For instance, the first three digits of the Tenon phone number (443-489-6549) might get announced as “four-hundred-forty-three”, and that’s OK. Other things that might sound like they’re getting read aloud wrong are abbreviations, acronyms, and homonyms.

If your goal is to check that your visible content is going to be read out (separate from any speech viewer the screen reader provides), turn on any highlighting options the screen reader offers. VoiceOver on macOS and Narrator highlight the current item by default; NVDA has a highlight setting that users can enable in its preferences.

Understand your assistive technology

Learn about the assistive technology you're using before doing any real testing. For instance, try to fill out forms or navigate a discrete task end to end with Dragon speech recognition software. After going through enough practice forms to be comfortable with Dragon, and after Dragon has had time to learn your voice, you will be able to test more effectively.

You don’t need to test in every browser with every screen reader. In fact, some combinations work better than others, and some combinations don’t work at all.

Just because you can access an actionable element with a keyboard AND a screen reader doesn't always mean it is keyboard accessible. Sometimes a screen reader detects a click handler bound to an element and automatically fires that handler when you press the Enter key. This is a feature that works for the benefit of screen reader users. With the screen reader off, however, that workaround isn't available, so the element won't work for users who need keyboard accessibility but don't use a screen reader.

Winaero Tweaker 1.55 Released

Winaero Tweaker is a program that allows you to adjust many Windows settings that would otherwise require registry hacks. The last few versions added some settings in File Explorer options which Microsoft removed in some builds of Windows. It also makes it easier to adjust some Microsoft Edge settings to enhance Edge’s security.

A few notes and warnings: while the program is very accessible, there is information that appears on the screen as you move from one option to the next. This information explains what each option does, and there are also links to learn more about each feature. JAWS does not speak this information by default. To have it spoken automatically, press Insert+S until the screen echo setting is set to "all." I am unclear as to how to perform the equivalent change using NVDA.

Many of these settings are capable of disabling critical functions of your computer. I recommend changing them only if you understand what the setting is designed to do or change. Due to time constraints I am unable to provide any assistance on using this program and I take no responsibility if it causes any undesirable effects.

Allyant – Thursday, June 29, 2023 at 11:24 AM

How Do I Check If My Website Is WCAG Compliant (And Does It Need To Be)?

One of the most common questions we get from organizations interested in digital accessibility is simply, “Is my website compliant?”

While this may seem like a simple question, there are many additional layers to peel back when determining the short or long answer.

This article will break down these layers and outline some ways to determine the answer. Additionally, we will cover some ancillary topics, such as the relevance and potential impact of your website being WCAG (Web Content Accessibility Guidelines) compliant.

There are several ways to approach this question – whether just doing a simple spot check or really diving into making your site highly usable for all visitors and compliant with the WCAG success criteria.

Manual Review: The best way to check a website’s WCAG compliance status

The only authentic way to determine if your website complies with the WCAG standards is to do a manual review. In all cases, this should also include people with disabilities, specifically native screen reader users, as we detailed in a recent blog post.

If your organization is truly serious about providing equitable access to your consumers and ensuring you comply with the WCAG standard, you should deploy live-user testing as part of your accessibility project plan.

That said, you can perform more immediate tests in real-time to get a baseline understanding of your website’s current WCAG compliance level. This can also be a great way to expose your team to the importance of web accessibility.

3 Quick and easy ways to check whether a website is WCAG compliant 

Keyboard Testing: Website visitors with various physical disabilities will largely use keyboard navigation to engage with your brand through your website. You can do a quick test right now to understand how your site performs from this perspective. Visit your website's homepage and attempt to navigate through the experience using only keyboard commands: set your mouse aside and see whether you can easily navigate to a product and move it to your cart, or learn about your services and find a way to contact your organization, using primarily the Tab, Enter, and Arrow keys. Is it possible? You should see a clear and visible Skip to Main Content link, easy access to full navigation menus, and clear focus indicators on every element that could otherwise be clicked with a mouse. (A scripted version of this check is sketched just after this list.)
Video Content & Captioning: Most websites today have some form of video(s) to portray their services or highlight key features of products. Mute your computer or mobile device and turn on closed captioning. Are you able to consume the content? If you had a hearing impairment, could you obtain equal information through strong captions on the video or spoken content?
Automated Accessibility Testing: Through tools like our Allyant HUB or other free tools in the marketplace, such as WAVE from WebAIM or Google Lighthouse – you can run a quick spot check on your website and even see a score of your current standing. It is important to note that automated testing tools find about 30% of WCAG violations. However, running these tools can provide a quick and easy way to better understand your current compliance level or outline the importance of a web accessibility plan to your leadership or legal teams.
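If you want to make the keyboard check above repeatable, it can be scripted. Below is a rough sketch using Python and Selenium (my own illustration, not a tool from the article; it assumes Chrome and its WebDriver are installed, example.com stands in for your homepage, and the count of 15 tab stops is arbitrary). It only reports what receives focus; judging the visibility of the focus indicator still takes a human eye:

    # pip install selenium
    from selenium import webdriver
    from selenium.webdriver.common.keys import Keys

    driver = webdriver.Chrome()
    driver.get("https://example.com")

    # Press Tab repeatedly and log each element that receives keyboard
    # focus. Interactive elements that never appear in this sequence
    # often point to missing keyboard accessibility.
    for _ in range(15):
        driver.switch_to.active_element.send_keys(Keys.TAB)
        el = driver.switch_to.active_element
        print(el.tag_name, el.get_attribute("id") or "-", el.text[:40])

    driver.quit()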
Is WCAG Compliance the same as ADA Website Compliance?

One of the most common misconceptions around website compliance is that ADA Website Compliance is the standard that an organization must meet. While this technically is not the case, it’s also not entirely off base as the WCAG standards and the ADA are closely connected when organizations work to ensure their website or mobile application is accessible to users with disabilities.

Among many others, here are three key areas where WCAG compliance and ADA website compliance are closely connected (nearly exclusively so when we consider web accessibility legal settlements):

ADA Title III: The ADA is a U.S. civil rights law that prohibits discrimination against individuals with disabilities – this specifically includes public accommodations. Title III of the ADA outlines that places of public accommodation include both physical spaces (a bank, retail or grocery store, school, restaurant, and so on) and the digital realm. We can likely all agree that with the digital boom in recent years, this includes websites and online services, which should not deny access based on a consumer's disability. WCAG comes into play when aiming to comply digitally with Title III of the ADA, as it is the internationally recognized standard for website compliance.
Legal Interpretation: Many times, when individuals and organizations first dive into web accessibility following a demand letter or lawsuit being aimed at their website, they are surprised to hear the ADA does not explicitly mention web accessibility standards or WCAG 2.1 AA conformance. However, there have long been legal interpretations and court rulings that have established that websites and digital platforms can, in fact, be considered places of public accommodation. Therefore, under Title III of the ADA, as outlined above, they are subject to ADA requirements, making WCAG highly applicable for brands considering website accessibility as a form of risk mitigation.
Recognized Standards: Although the ADA does not currently outline specific technical standards for web accessibility, its primary purpose is to ensure that individuals with disabilities are not discriminated against. The Department of Justice (DOJ), which enforces the ADA, has referenced WCAG as a recognized standard for web accessibility in various legal settlements, consent decrees, and formal statements on this topic – including very recently in a letter to colleges and universities.
Does WCAG Compliance apply to my business’ website?

At the most baseline level, if your business has a website, the answer is yes!

As with any business decision, you have the choice to disregard WCAG Compliance and web accessibility. However, it’s important to acknowledge how significant the WCAG standards can be for your bottom line, depending on your industry vertical and the nature of your work.

Naturally, there are certainly higher-risk verticals when we think about web accessibility legal cases, which might make the business case for complying with WCAG much simpler.

The historical case law data shows that the most heavily targeted business verticals specifically offer purchase paths (such as retail and e-commerce) or digital services (such as financial institutions or hospitality). 

Additionally, as outlined above, organizations operating in the Title II space where Section 508 compliance applies are legally required to conform with WCAG 2.0 AA for all digital content they publish – including websites, PDFs, and other documents posted for consumption online.

More simply stated, no business is inherently immune from WCAG compliance. 

If you have a website or other digital property that allows consumers to engage with your brand, WCAG compliance is applicable. It is, by and large, the only widely recognized and accepted standard for website accessibility conformance. Declining to build a web accessibility plan, or to consider the implications of WCAG conformance in any fashion, inherently means your business is assuming at least some legal risk.

Additionally, providing equitable access to your business and services is simply the right thing to do!

Is WCAG a Global Standard?

The great news for many brands – including our long list of international customers who offer their websites in various languages across the globe – is that WCAG applies regardless of where you are offering your website to consumers. 

Without a doubt, WCAG is acknowledged as the worldwide benchmark for website accessibility. This is mainly because the World Wide Web Consortium (W3C) developed and maintains these standards. For background, this is an international community that focuses on creating web standards more broadly (not just for accessibility).

Over many years, WCAG has gained widespread acceptance and adoption worldwide. Governments, organizations, and web developers across different countries refer to the WCAG success criteria nearly exclusively as a reference point for ensuring websites and digital properties are accessible to people with disabilities. 

More recently, many countries (or states and provinces) across the globe have adopted WCAG as the basis for their accessibility standards and regulations – including regulations such as the EU Web Accessibility Directive, the AODA in Canada, and of course, Section 508 Compliance in the United States.

So, is my website WCAG Compliant?

As outlined above, we strongly recommend organizations do a quick self-assessment when kicking off their web accessibility journey! If nothing else, it can introduce your design, development, and digital marketing teams to the WCAG standards and the types of violations you might need to work to resolve as you focus on driving compliance with your website through an expert audit.

However, we are always happy to help by giving our expert opinion. Our team here at Allyant would be happy to provide you with a free quick assessment of your website’s current compliance level and discuss how we can get you on your way to equitable access in no time. 

Simply chat with our team of experts to get this started, or fill out the contact form below!

50 things you need to know about Apple Vision Pro

Size! Price! Apps! Battery! Release window! Creepy eyes! More!

It’s the most talked about piece of tech hardware in many years, and there’s still a lot to learn about it. The Apple Vision Pro is the most ambitious device Apple has ever made: a head-mounted wearable computer that it believes will change the way we interact with technology forever.

Fueled by a huge amount of processing power, sensors, an all-new OS and a truly striking industrial design, it's still a long way off from release. But that only makes us more eager to find out what this mixed-reality device is capable of.

From immersive ‘Environments’ to creepy digital ‘Personas’, from a hands-free control method to a potential date with Mickey Mouse, here are 50 facts, figures and features you need to know about Apple Vision Pro.

Hardware

Apple is calling it a Spatial Computer. It believes the Vision Pro will usher in a new era for computing, a ‘spatial’ one, where our workspaces aren’t limited to desks and screens, but anywhere in the world around us — provided we’re wearing something like the Apple Vision Pro. Through its lenses, applications are superimposed onto your surroundings, similar to augmented reality iPhone applications.

Apple Vision Pro is powered by not one but two chips in a dual-chip array. Mac fans will be familiar with the superb computing performance of the M2 chip that the Vision Pro will employ, but it'll be paired with a brand-new R1 chip, whose purpose is to handle all the data coming in from the sensors, cameras and mics.

It uses a pair of micro-OLED displays in its gasket, with 23 million pixels between them. That’s the same as having a 4K TV for each of your eyes.

Though Apple Vision Pro may appear like one, it’s not a VR headset. Not in the traditional sense at least, even though its hardware is conceptually similar. Apple is focussing on mixed-reality use cases, where applications and the real-world merge. Unlike most VR headsets for instance, Apple Vision Pro does not support ‘room-scale’ experiences — yes, applications can take up your entire field of view, but developer documentation shows that if you move more than 1.5 meters from the origin point of an experience, you’re going to have the plug pulled on immersion. The Meta Quest line, by comparison, lets you move freely around an area you define up to 15m x 15m.

There are a ton of cameras built in. There are at least ten external cameras: two main forward-facing cameras, four pointing downward, two TrueDepth cameras for establishing depth, and two pointing sideways to capture the periphery of your view and room. There are also at least four IR cameras for working in the dark areas inside the headset, as well as some on the outside.

There are a lot of sensors in the Apple Vision Pro too. There is a LiDAR sensor for measuring distance and 3D mapping of spaces, and accelerometers and gyroscopes to track movement. And of course there will be sensors for tracking things like chip temperatures, as standard.

That front piece? That’s a single piece of curved laminated glass that joins up with an aluminium alloy frame. It’s designed to adhere to common curves of the human head to make it as comfortable as possible to wear.

Behind the front piece is a Light Seal. This helps the Vision Pro sit comfortably on your nose, and blocks out ambient light that could reduce the clarity of the displays. It’ll be available in different sizes for maximum comfort.

A dedicated Top button lets you take spatial photos and videos, using the onboard 3D camera array.

The Apple Vision Pro will make use of an external battery pack. That’s not as bad an idea as it sounds! You’ll get two hours of use on average from the battery, and you can plug the headset into a wall socket for continuous use, too.

A Digital Crown, like the one on the Apple Watch, is on the Apple Vision Pro too. It serves two purposes: it acts as a home button to take you back to the Vision Pro's main menu and apps screen, and it dials in the level of immersion when you're in 'Environments'. In other words, turn it to tune more or less of your real-world surroundings in or out.

The head band has an adjustable fit. Not only is its fabric mesh stretchy and breathable, but a side Fit Dial lets you tighten it just right, no matter what your head shape may be.

Perhaps the most striking feature of the Vision Pro design is its external EyeSight display. This forms part of the frontispiece of the headset and uses internal cameras to capture video of your eyes and the area around them, then shows that on the external screen to anyone looking at you. EyeSight turns on when someone approaches you.

Apple Vision Pro does not have a controller. You’re the controller! You’ll be able to navigate apps and the interface just by using your hands to create gestures, and your voice to issue commands.

Apple Vision Pro offers eye-tracking to help make these hands-free controls work accurately. LEDs and infrared cameras beam tracked light patterns onto your eyes to ascertain where your gaze is sitting. That helps Vision Pro recognize which elements of its interface you're looking to interact with.

Wear glasses? You’ll be able to get prescription lens inserts for Apple Vision Pro. However, these will be sold separately, and made (initially at least) exclusively by ZEISS, a premium lens manufacturer. So don’t expect them to be cheap. They’ll magnetically attach to the lenses inside the headset.

It's not just about Spatial Computing, but Spatial Audio, too. Apple has popped speakers into the side strap of the Vision Pro, sitting over but not in your ears. This helps you hear your real-world surroundings without losing out on too much audio. During setup, the headset will also map your ear geometry to help replicate a surround-sound experience.
Software

The headset runs on the new visionOS operating system. From its icons to its fonts and control points, it builds on what users will be familiar with from macOS and iOS, but presents them in new ways that interact with the world around the user.

You have full freedom to move apps wherever you want in visionOS. They’re not locked to a set size, or a set position in a space. Instead you can tailor them to suit your workflow, needs and surroundings — the apps will even react to the ambient lighting of the positions you place them in, casting natural shadows in the digital space.

Get used to pinching. Putting together your thumb and forefinger will be the primary way to make selections in visionOS, while flicks will let you scroll through content.

Worried about tired arms from waving them around all day? Don't be! You can control Apple Vision Pro even if your hands are in your lap. Its onboard sensors and camera arrays are sensitive enough to track your intentions even if your hands aren't raised right in front of the headset.

Can’t use your hands, or just want to give them a rest altogether? Siri will be on board to open and close apps and media, while hovering your gaze over a mic icon in any app will activate a dictation mode.

There are already quite a few confirmed, familiar first-party apps that you can use on Apple Vision Pro. These include Apple TV, Apple Music, Photos, Messages, Mail, Safari, Freeform, Keynote, Mindfulness and Notes.

Lots of these apps can be viewed in those immersive 'Environments' mentioned earlier. A key one will be the Cinema Environment, which will blow Apple TV screenings up to as large as 100 feet wide.

Bespoke entertainment content is already being made for Apple Vision Pro. Spatial Video support for Apple TV means that the upcoming Godzilla TV show, currently titled Monarch: Legacy of the Monsters, is being filmed with Spatial Video support in mind.

Apple is looking to revive 3D video with Apple Vision Pro, too. It’ll be pushing Apple Immersive Videos, which use 180-degree 3D 8K recordings, alongside Spatial Audio, to put you in the heart of an action scene.

This is Apple’s first 3D camera too. You’ll be able to record and playback your own spatial videos and photos — and those being filmed will know you’re doing so as the headset’s front screen changes to a recording mode.

One of the key apps for Apple Vision Pro will be FaceTime. You’ll be able to make calls with the headset, and have your friends’ faces hover in panes alongside anything else you’re doing with Vision Pro at the same time.

But if you’re wearing a headset, what will your friends see when you’re on a FaceTime call? Your digital Persona, that’s what. Like a detailed memoji, the Apple Vision Pro will scan your face and create a bespoke, quite-uncanny 3D representation of you that will be beamed to your friends. It’s a bit like sending a video game character version of yourself to represent you on calls.

SharePlay will work in FaceTime too. If you want to work simultaneously on a SharePlay-supported app or project with Vision Pro on, you’ll be able to do so.

It may have taken years to get Final Cut Pro for iPad, but you won’t have to wait so long for the Spatial Computing version. Apple already has plans for Final Cut Pro for Apple Vision Pro, and it should be ready in time for launch, too.

Of course, it won’t just be about first-party apps. A visionOS App Store will launch alongside the headset, offering third-party developers the opportunity to create apps and services for the device, too.

There's already a visionOS SDK available, and as a result we've already had a sneak peek at some of the potential third-party apps heading to Vision Pro. Some are great, like a gaming museum, and some are silly fun, like a giant calculator.

Of the third-party developers announced for Apple Vision Pro so far, the biggest announcement was that Disney will be creating apps for Vision Pro. At the headset’s launch, Disney showed off a sizzle reel that included transporting viewers to locations from Star Wars, having Mickey Mouse run around your living room, and streaming the Disney+ catalogue in the headset.

On top of that, iPhone and iPad apps will be compatible with Vision Pro. Though they'll likely be flat-plane versions of existing apps, Apple is keen to make it easy for developers to get their existing apps working on the headset so that users feel right at home straight away.

Among those supported iPhone and iPad apps will be Apple Arcade games. You'll be able to pair a Bluetooth controller with Apple Vision Pro and play those titles on a giant screen inside the headset.

Vision Pro will support Apple’s Continuity features to allow you to work inside the headset, alongside your Mac. It’ll let you add multiple virtual displays to a MacBook, turning a single screen device into a multi-display workstation.

And though Vision Pro is a hands-free, controllerless device, there's nothing stopping you from pairing a Bluetooth keyboard, mouse or trackpad with the headset. Handy for those aforementioned Continuity features with a Mac.

3D objects will be a big thing in visionOS, naturally, given the 3D nature of the interface. But developers can potentially go the extra mile by letting you take 3D objects out of an app and place them in the real-world around you, letting you scale them as desired.

Visual search will be part of visionOS. Working a bit like Visual Lookup, it’ll be able to identify items or detect and read aloud text that your gaze centers on. It sounds like a great accessibility feature, as well as a great learning tool.
Price, release and other features

Apple wants to make Vision Pro as secure as any of its other devices. It’ll offer an Optic ID security system that scans your iris for its unique features to authenticate access to your sensitive data. That Optic ID data is encrypted, and never leaves your headset.

Privacy is important. Though Apple Vision Pro has cameras constantly tracking your environment, that information is processed at a system level, meaning that third-party applications don’t get to see inside your living room. Likewise, eye-tracking data, such as where your eye lingers on a Safari page, will not be shared — just any selection taps.

Want to take Vision Pro on the go? A dedicated Travel Mode will be available, which will ask you what sort of transportation you’re using, and scale back some ‘awareness features’ in line with your activity. No one wants to accidentally gatecrash the cockpit of a plane, right?

How about sharing your Apple Vision Pro? A Guest Mode will also be included, and while its details are not yet clear, it’s assumed that you’ll be able to register several Optic ID security locks and get individual ‘workspaces’ for each user. But the details here still need to be confirmed.

Don’t run! Apple Vision Pro has a speed limit. Early SDK users are seeing that when testing high-speed motion, a ‘Moving at Unsafe Speed’ warning message appears and temporarily hides virtual content.

Apple Vision Pro will make use of your iCloud account to make sure that everything you do on the headset is synced with your more traditional devices, like the iPhone and Mac.

It’s expensive. Apple Vision Pro will cost $3,499 when it launches at some point in 2024.

Try before you buy. Apple intends to shuffle its Apple Store retail locations around to include testing and showcase areas for Apple Vision Pro. Much like with the launch of Apple Watch, Apple reportedly wants people to come get fitted for the device to ensure they get the right band fit, optical inserts and more. That might mean you can’t get it at other retailers in that initial launch wave.

Can’t afford $3,499? Then patience, grasshopper. Rumor has it Apple already has plans to launch a cheaper version of Apple Vision Pro by the end of 2025.

But if you're outside the US, note that it's not getting a global release. Not initially, anyway. The US is the first market to get the Apple Vision Pro, with other territories to follow at an as-yet-undetermined time.
https://www.imore.com/gaming/virtual-reality/50-things-you-need-to-know-about-apple-vision-pro

Introducing XploreNinja, a navigation app powered by BlindSquare for Android

BlindSquare – Friday, June 30, 2023 at 5:50 PM

Introducing XploreNinja, a navigation app powered by BlindSquare for Android devices

In today's fast-paced world, technology has become an integral part of our lives, shaping the way we communicate, work, and navigate our surroundings. For individuals with vision loss, technology has opened up new possibilities and opportunities, empowering them to explore the world independently. One such groundbreaking innovation is XploreNinja – BlindSquare's latest app release – a remarkable mapping system designed to help blind Android device users navigate safely and confidently; a new way to experience the world.

Introducing XploreNinja, Powered by BlindSquare:

• What is XploreNinja?
The XploreNinja Android app is a totally new platform, first released in January 2023, giving the BlindSquare developers access to new powers and flexible integrations. It is available first for Android devices; an iOS version is planned and will be released as feature layers increase.

• What XploreNinja is not.
While it is powered by BlindSquare's development team and uses BlindSquare customized datasets (whether publicly sourced or custom developed), it is not BlindSquare for Android, as features and functionality vary.

• What is BlindSquare (iOS) working on now?
Development of new features and expanding datasets continues with BlindSquare (iOS) as the flagship product.

• Datasets and special features.
The XploreNinja app leverages customized datasets created for BlindSquare (iOS), serving geographies where public data is either poor or not currently available. Examples of these datasets include parks (urban/rural), college/university campus sites, transit services, public pedestrian malls, and more. The XploreNinja app, in a park setting as an example, can provide automatic (or optional) tracking via audible prompts from point A to B (such as from the parking lot, to a park bench by the pond, to the nearest facilities).
BlindSquare's Enterprise Products include a service known as CLS (Customized Location Services), a dataset of points of interest both indoors and outdoors for blind users, which is maintained and authored using a map interface (such as OpenStreetMap, Google Maps, Google Satellite View, Google Street View) or field-collected or client-supplied data. The XploreNinja app unlocks access to this data on Android devices.
Example use cases and press coverage, including Parks, Transportation, Pedestrian Malls and more.

BlindSquare's new app, XploreNinja, represents a significant milestone in the field of assistive technology. By combining advanced mapping capabilities, continuous optimizations, and a user-centric design, the XploreNinja app empowers blind individuals to navigate their environments more safely, confidently, and independently. As technology continues to evolve, apps like XploreNinja serve as an example of how innovation can break barriers and create opportunities for individuals with vision loss. Through innovations like the XploreNinja app, BlindSquare is driving inclusivity and accessibility, enabling individuals with vision loss to embrace their surroundings and live life on their terms.


Patently Apple – Friday, June 30, 2023, 11:44 AM

Apple Patent Reveals advancing Siri for Controlling Apps on Apple Devices including their upcoming Vision Pro Headset

Yesterday the US Patent & Trademark Office published a patent application from Apple that relates to enabling Siri to understand a new set of commands for controlling applications like a word processor and more. The invention relates to the upcoming Apple Vision Pro, the iPhone and more.

Apple’s Siri may require training to be able to interact with the applications or process the commands to perform one or more tasks. This can be cumbersome and time intensive, creating barriers for developers who wish to integrate their applications with the digital assistant and for users who seek a greater level of access to different tasks with the digital assistant.

Addressing this issue extends to Apple's coming Vision Pro Spatial Computing Headset. Apple notes in patent FIG. 1B below that system #100 includes two (or more) devices in communication; the connection between them may be wired or wireless. The first device #100b (e.g., a base station device) includes processor(s), while the second device #100c (e.g., a head-mounted device) includes various components, such as processor(s), RF circuitry, memory, image sensor(s), orientation sensor(s), microphone(s), location sensor(s), speaker(s), display(s) and more.

Apple covers AR/VR/MR in eight paragraphs to emphasize that the invention relating to Siri controlling applications definitely extends to their upcoming Apple Vision Pro.  

In Apple’s patent FIG. 7 below the Apple Vision Pro may produce a VR environment including one or more virtual objects that Siri may interact with based on user input. In some examples, the headset may generate or receive a view of the virtual environment, including the one or more virtual objects. For example, as shown in FIG. 7, the headset may receive view #700 including a virtual painting #702 and a virtual couch #703.

While interacting with view #700, Siri may receive a spoken input #701 “make the couch blue” that it doesn’t recognize.  Accordingly, Siri determines whether the command matches an action, sub-action, or at least a portion of the metadata of a link model to determine which action should be performed.

The patent also relates to Siri working with apps on other devices like an iPhone as pictured in patent FIGS. 4 and 6 below. These are examples of input commands to be mapped and executed.

As shown in FIG. 4 above, the system may receive the spoken input #404 “bold the word ‘Hey!’.” Siri may process the spoken input to determine that the command is “bold” but may not understand what the command “bold” means or what action to perform based on that command. Accordingly, the system and Siri may determine the action to perform for the command “bold” by accessing a “link interface,” as noted in patent FIG. 3 below.
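In spirit, that "link interface" lookup resembles a registry mapping spoken command verbs onto actions an app has exposed. Here is a toy Python sketch of the general idea only (my own illustration; the names and the crude phrase parsing are hypothetical, and Apple's actual link models are far richer):

    # Toy illustration: an app registers actions; the assistant maps an
    # unrecognized spoken verb onto a registered action by name.
    link_model = {
        "bold":   lambda text: f"<b>{text}</b>",
        "italic": lambda text: f"<i>{text}</i>",
    }

    def handle(spoken: str) -> str:
        verb, _, target = spoken.partition(" the word ")
        action = link_model.get(verb.lower())
        if action is None:
            return f"No action linked to the command '{verb}'."
        return action(target.strip(" '\"."))

    print(handle("bold the word 'Hey!'"))   # -> <b>Hey!</b>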

Finer details can be found in Apple's 46-page patent application number 20230206912, titled "Digital Assistant Control of Applications."

Some of the Team Members on this Apple Project

Cédric Bray: AR/VR Engineering Manager  
Helmut Garstenauer: Senior AR/VR Software Engineer
Tim Oriol: Senior Software Engineering Manager, Technology Development Group
Kurt Piersol: Lead Engineer
Jessica Peck: Conversation Designer
Luca Simonelli: Software Engineering Manager
Nathan Taylor: App Intents Engineering Manager
https://www.patentlyapple.com/2023/06/apple-patent-reveals-advancing-siri-for-controlling-apps-on-apple-devices-including-their-upcoming-vision-pro-headset.html

Australian Story produced an episode where half the interviewees were blind. This is what we learnt

Mon 19 Jun 2023 at 3:58pm

One producer, one camera operator and two blind men rushing through Auckland Airport. There was only a short window before our flight from Auckland to Samoa, and we pretty much had to run to make the tight turnaround.

Airports can be overwhelming for the most seasoned travellers. Add in blindness and it’s a whole different story. Plus, we were filming the mad dash for an upcoming Australian Story. I had underestimated what would be required.

Seventeen years ago, Jamie Teh and Mick Curran invented the world’s most popular free screen-reader software. Called NVDA (Non-Visual Desktop Access), it’s the only screen reader in the world made by blind people for blind people.

It’s now used by 275,000 people in 175 countries and has been translated into 50 languages. We were travelling with Jamie and Mick to document the impact their software is having in Samoa.

It was fascinating to watch them navigate their way through the airport using a form of echolocation, clicking their fingers to listen for sounds echoing off the walls and furniture to determine where they are in the space.

In the end we got through security, tracked down the correct gate and finally boarded the plane. It wasn’t until the adrenaline had subsided and I was seated next to Jamie and Mick that I could catch my breath and realise what we had just achieved.

“Are you guys OK?” I asked.

The notoriously good-humoured pair just laughed. As Jamie says, they take their blindness in their stride and just “get on with it”.

Kristine Taylor and camera operator Anthony Sines sit down for an interview with NVDA co-founder Jamie Teh (left). (Australian Story: Kate Wilson)

When we commissioned the story we knew there could be some additional things to consider. Like many sighted people, I had not met many people with limited vision, so it was a big learning experience for me.

Mick and Jamie are seasoned travellers, but it takes them a little extra planning to get from point A to point B, such as getting an escort at the airport from the taxi to the check-in counter and to the correct gate.

There’s so much that you take for granted as a sighted person, tiny details that you pick up and process, that a person who is vision-impaired doesn’t.

Mick Curran (centre) and Jamie Teh (right) read a braille menu as they sit down for lunch with Ari Hazelman from the Samoa Blind Persons Association. (Australian Story)

When Jamie and Mick, camera operator Marc Smith and I arrived at our hotel in Samoa, one of the things we did was guide Mick and Jamie to their room and then orientate them within it. When they are in a new space they need to orientate themselves and become familiar with it, including doing little things like identifying which bottle is the shampoo and which is the conditioner, and where they can find the TV remote, kettle and so on.

I said to Mick and Jamie when we first started filming that they would need to let me know what I was doing right and, more importantly, what I was doing wrong. They were not shy in pointing out the latter. Anyone who has watched the program will have noticed their sense of humour on the trip — we had a lot of laughs, much of it at our expense.

Something we did get wrong was the first time we filmed with Jamie I took him by the hand to guide him. This is not the preferred way in the blind community, so I found out. Instead, I offered my elbow or arm. It was a small thing but made a big difference once I knew the correct way to guide.

It was a memorable experience filming with Mick and Jamie, Taylor says. (Australian Story)

Audio description rolls out across the ABC

Although more than half the interview subjects in the story were blind, the interviews were structured and set up much the same as any other — lighting, two cameras, a producer and a quiet space to film. We faced one another in the interview as we normally would.

In fact, nothing much differed with Mick and Jamie — it was like any other Australian Story interview. However, for any sequences outside the home, we would ensure we orientated them in the space first.

Jamie Teh in interview with Australian Story. (Australian Story: Kate Wilson)

One thing Mick and Jamie had told me was that when blind or vision-impaired people got together in a room, they did not always feel it was necessary to face each other, so they might end up chatting away enthusiastically while facing in all different directions.

In this case I had to ask them to direct their responses to where our voices were coming from, but that's something they are used to doing when talking to sighted people.

We knew that there would be a lot of people who are vision-impaired and blind who would be interested in this story and we wanted to make sure those people could engage fully with the episode.

Jamie Teh (left) and Mick Curran are sought after for their experience with accessibility for vision-impaired people. (Australian Story: Kate Wilson)

In consultation with ABC’s senior channel manager, Charlie Cox, digital producer Megan Mackander and I quickly discovered the best way to make content accessible for vision-impaired viewers was audio description (AD). This is a narration that describes important visual elements of a television program, movie or performance between lines of dialogue. As with closed captions, it can be turned on or off as needed.

Jamie Teh explains how NVDA screen reader software works to create synthetic speech. (Australian Story)

About 14 hours of content is audio described each week at the ABC. It’s something the ABC is rolling out across ABC iview and TV broadcast for select programs such as Australian Story, Back Roads, and popular children’s shows such as Bluey and Play School. The ABC is looking to expand this further in the future.

We've had some great feedback on the program. "Inspiring", "talented", "clever" and "change-makers" were just some of the words popping up in our social media comments. Some viewers were even calling for the duo to receive an Australian of the Year nomination.

For me, making this program was a truly memorable and educational experience. Jamie and Mick were kind teachers and such fun to be with.

Watch Australian Story’s Blind Leading the Blind on ABC iview (AD option available) or ABC News In-Depth YouTube (or watch the AD version here)

Sharon’s Shortcuts – Monday, June 26, 2023 at 5:24 AM

Calculate Formulas in Excel Spreadsheets

You can get Excel to recalculate formulas and functions in all open workbooks, or just in one worksheet in a workbook.
Press F9 to recalculate formulas in all open Excel workbooks.
Press Shift + F9 to recalculate formulas in the active worksheet.
Now, usually, Excel automatically updates the results of formulas and functions whenever something changes. So you only need these shortcuts if the Calculation Option is set to Manual.
So why would anyone change the calculation option from Automatic to Manual? Well, when working with very large files, the constant updating, whenever changes are made, can be slow. Therefore, people will sometimes switch to Manual mode while working through changes on worksheets that have a lot of data.
The Calculation Option is on the Formulas ribbon (Alt + M, then X).
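If you drive Excel from a script on Windows, the same switches are exposed through COM automation. Here is a minimal sketch using the pywin32 package (my own example, not part of Sharon's tip; it assumes Excel is installed, and the numeric constants are Excel's standard enumeration values):

    # pip install pywin32 (Windows only)
    import win32com.client

    excel = win32com.client.Dispatch("Excel.Application")
    excel.Visible = True
    wb = excel.Workbooks.Add()
    ws = wb.Worksheets(1)

    excel.Calculation = -4135        # xlCalculationManual
    ws.Range("A1").Value = 2
    ws.Range("A2").Formula = "=A1*10"
    ws.Calculate()                   # active worksheet only, like Shift + F9
    excel.Calculate()                # all open workbooks, like F9
    excel.Calculation = -4105        # back to xlCalculationAutomatic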
