

How AI is widening accessibility gaps for Indigenous communities


The global arena of AI design and implementation continues to diversify, with flourishing interest in user-friendly generative AI leading its growth. Still incubating: AI's role in furthering accessible spaces, both online and off.

Meggan Van Harten is a strategic leader and partner at Design de Plume, an Indigenous-owned, women-led creative agency based in Canada. Design de Plume's business and design ethos incorporates what Van Harten calls a "kaleidoscope" of various world lenses: Indigeneity, inclusivity, diversity, equity, sustainability, and accessibility. The company was started following Van Harten's early career as a graphic designer, after she and her cofounders sought to add a perspective on digital design missing from the mainstream. 

Design de Plume integrates these principles directly into the framework of services and solutions like site design, such as its recent work with the Indigenous Protected and Conserved Areas (IPCAs) Knowledge Basket, a digital information-sharing platform for Indigenous-led conservation pathways in Canada. The company was awarded the 2023 DNA Paris Design Award for the project.



In her work as a strategist and public speaker, Van Harten keeps an eye trained on Indigenous accessibility, specifically. The Ontario-based designer says that excluding the cultural and language models of Indigenous Peoples from accessibility standards and compliance has left huge gaps in making communications accessible for Indigenous communities.

And even with the AI boom ushering in a new wave of so-called technological innovation and automation, the gap isn't getting any smaller — in fact, it appears to be widening.


As collaborators in the design of digital technologies and spaces, Van Harten and her colleagues urge more people to assess the entire system of accessibility in order to ask how assistive technologies, web services, and even AI-powered tools (like live transcriptions and image generation) can be more meaningful for Indigenous groups, and inclusive of their needs.

In conversation about this work and mission, Van Harten spoke to Mashable's Chase DiBenedetto about the failures in AI creation, the need for evolving accessibility standards, and how incorporating Indigenous perspectives should be among the first steps in technological innovation. 

Mashable: What does the "incorporation of Indigeneity" look like in your work?

Van Harten: There's this idea of "Two-Eyed Seeing," using the best of Western principles and Indigenous knowledge all at the same time. You're using both your eyes to accomplish this task. It's very much a part of the culture of Design de Plume, but also the culture of doing good design: taking the best from both worlds. I like the term "Open Cup" or "Empty Cup" — letting other people fill you with knowledge so that you can give a good solution at the end of the day. This isn't a new concept. It feels new to corporate America, but it's really not. It's really foundational to the Indigenous way of knowing. 

For instance, the way we conduct our engagement sessions with our clients. We always talk about it being more of a circle — no one is in charge or above each other, each voice should be represented equally — that way, we're able to create a more inclusive environment and focus on this idea that we are project-solving together. As an agency, we're not coming in with design trends or design tools that have worked in the past. That might not be true for every First Nation group or every Indigenous group that you encounter. You have to be really open to becoming a safe, open space for people to collaborate with you.

You've found that Indigenous groups are often excluded from accessible design, especially within the development of new AI tools. 

In terms of AI, it's actually been extremely difficult. There's so much bias in technology, in the way that it's developed. Indigenous People aren't consulted when it comes to solutioning. What I mean by that is that if they are brought in, they're brought in at the end of the project and asked, "Does this pass?" That's not good engagement. You need to involve those people from the start, explain to them the problem, get them involved in problem-solving, and compensate them for those efforts. Captioning, for instance, and automatic speech recognition in particular, have been extremely difficult.

Can you elaborate on how live captioning, or automatic speech recognition more broadly, is failing these communities?

For example, I just downloaded a transcript of a live presentation I did, and it changed a word from "Anishinaabe," which is a large group of Indigenous People, to "Honest Nabi." So it distorted the language completely. I mentioned a group of people, and it couldn't even understand that. Even if there's a direct English translation…the way that Indigenous languages work is that they're extremely intentional, each letter and combination of letters has submeanings. There's so many intentional nuances to that specific word that are missed, especially in a case of live captioning.


[In a recent Mashable article about 3PlayMedia's 2023 State of Automatic Speech Recognition report, the accessibility services company] said that automatic speech recognition is about 90 percent accurate but needs to be more like 99 percent. But when it comes to Indigenous languages, it's literally 0 percent accurate. This is the harsh reality that we're living in. The way that we're building out technology, design systems, all of these things, is contributing to a culture of further [cultural] genocide and erasure of languages.

Is the concept of universal design, or the idea that products and spaces should be created with full accessibility in mind from the beginning, a solution?

There's so much in terms of the development of standards, compliance laws, etc. But the foundation doesn't exist for Indigenous People. In the accessibility space, people quite frequently mention that ADA compliance (or, where I'm from, the AODA) is the ground floor, not the ceiling. For Indigenous People, it's just a wide-open gap. They just fall into this vortex. There's no systems for accessible language in their languages, and there's no way to develop tools that are based in two languages at the same time.

What's so frustrating is that we have the ideas there. The tools are good, in a way. It's good that we have captions. It's terrible that it can't understand, as I'm spelling it out to the system, what I'm actually trying to say.


I feel that with the rise of generative AI, and the investments made toward its constant innovation, this problem should have been considered by now. 

Absolutely. Why can't I talk to the AI in advance? If I am going to present in this language, let me teach you how to say it. But I can't do that. Right now the best solution is live transcriptions and live [human] captioning. That's great, but many organizations don't have the funding to be able to offer live captions. So the warning I have to give them is that the [automatic] captions aren't going to make any sense. 

Is there a structural reason for this complete disconnect?

There's a really big focus on monolingual accessibility support, so you have a document in English, or a document in French, and it will read it out to you in English or in French. But what happens within circles of Indigenous languages where, through genocide and discrimination, part of the language has been lost? Even if you develop a system and say, "We can now read this page fully in English, and this page in this other Indigenous language," you might still miss the concept completely, because there's no good way to embed both languages in the same document. There's no way to do that in any accessibility technology right now, because the focus is on a Western way of thinking, which is that everything is separate.

Have you found this single-focus issue in other popular AI use cases, like image generation or chatbots?

Image generation software has been so interesting in the design space. You hear a lot about this thing that can "Photoshop this so well. It looks so good." But it's pure nonsense, especially when it comes to representing Indigenous culture and principles.

Do you have an example?

In several First Nations cultures in Canada, we have something called the Medicine Wheel. There's a lot of meaning in that symbol, and it's meant to be a healing tool. I've used image generation software, and prompted "Indigenous medicine wheel." That's it. That's my prompt. It's not complicated. If you put that into Google, you'll get a legitimate Medicine Wheel. 

But what comes back from the image generation software is literally garbage. I've seen ones where it's a plate, with many different pieces on the plate. I thought maybe it was interpreting it as food. Food can be medicine, right? Food can be healing. But when I zoomed in on the image, I was like, "These are snail shells. And that's a cigarette." Just random pieces of texture that it's pulled to make it a "cool-looking" image, but fundamentally lost the intentionality and Indigenous cultural principles.



I've also seen it for alt text generation tools… I think it would actually be really approachable for people if they could use things like alt text generators, to at least have a good starting base. But you can't have that starting base if there's no foundation of intentionality and inclusion.

So am I worried that AI is coming for my job? No, not at all.

It does seem that alongside this fervor for AI innovation, there's this simultaneous culture of fear created around the capabilities of AI. 

If we're saying AI is creating a fear culture, it's creating this fear culture because it's not representing people.


You talk about getting to a place where an AI assistant fully "comprehends" Indigenous languages or culture while providing a service. What does that entail?

I think comprehension is a collaborative skill. One of the things that's really cool about AI now is being able to converse with it, to help it and to grow it. So having that tool readily available in more systems would help with comprehension. You could tell it, "Hey, what you're saying is actually harmful and perpetuates a stereotype. We should stop saying that." I would love to see it more broadly applied to these systems. 

But also, before releasing a product and just expecting the world to fix it for you by interacting with it, involve people in the beginning stages of it to help influence it. Involve a diverse set of people to help talk with that AI, so that it can understand that there's a lot of garbage on the internet that it's going to pick up, or that people will fuel into it. 

In actuality, could a fully inclusive, and comprehensive, tool like this ever exist? Or is this something that shouldn't be relinquished to technology or AI-based services? Are human-in-the-loop solutions enough?

I think if you build technology in a more inclusive way, and you're willing to also take a destructive path forward and rethink the way that you've approached it before, it's possible. 

One of the major areas that they need to be talking about — as the government, or as corporations, or as these big beasts in tech — is who are they willing to exclude? Indigenous People are [one of] the fastest growing populations in North America. So by not developing with these tools in mind, for Indigeneity and inclusivity, we're leaving out this huge group of people. It's not on Indigenous People to solve every problem. It's on the folks that have the funds to actually make it happen, and to be willing to rethink their processes to move forward in a good way.

A computer can't fake empathy. It can't understand empathy. And the same thing actually applies to people. If they're not empathetic, then they won't understand why this is meaningful, why we're talking about this particular issue. But that human level of empathy is how we're actually going to address this, and how we're going to solve it.

Topics: Artificial Intelligence, Social Good, Identities, Accessibility
