4,396 hits in 5.1 sec

Bias in Data-driven AI Systems – An Introductory Survey [article]

Eirini Ntoutsi, Pavlos Fafalios, Ujwal Gadiraju, Vasileios Iosifidis, Wolfgang Nejdl, Maria-Esther Vidal, Salvatore Ruggieri, Franco Turini, Symeon Papadopoulos, Emmanouil Krasanakis, Ioannis Kompatsiaris, Katharina Kinder-Kurlanda, Claudia Wagner (+9 others)
2020-01-14 · arXiv · pre-print
In this survey, we focus on data-driven AI, as a large part of AI is powered nowadays by (big) data and powerful Machine Learning (ML) algorithms.  ...  Therefore, it is necessary to move beyond traditional AI algorithms optimized for predictive performance and embed ethical and legal principles in their design, training and deployment to ensure social  ...  Acknowledgement This work is supported by the project "NoBias -Artificial Intelligence without Bias", which has received funding from the European Union's Horizon 2020 research and innovation programme  ... 
arXiv:2001.09762v1 · fatcat:is3pn4c2srgdfbjnpvbwodoura

Bias in data‐driven artificial intelligence systems—An introductory survey

Eirini Ntoutsi, Pavlos Fafalios, Ujwal Gadiraju, Vasileios Iosifidis, Wolfgang Nejdl, Maria‐Esther Vidal, Salvatore Ruggieri, Franco Turini, Symeon Papadopoulos, Emmanouil Krasanakis, Ioannis Kompatsiaris, Katharina Kinder‐Kurlanda (+11 others)
2020-02-03 · Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery (Wiley)
In this survey, we focus on data-driven AI, as a large part of AI is powered nowadays by (big) data and powerful machine learning algorithms.  ...  Therefore, it is necessary to move beyond traditional AI algorithms optimized for predictive performance and embed ethical and legal principles in their design, training, and deployment to ensure social  ...  ACKNOWLEDGMENT This work is supported by the project "NoBias -Artificial Intelligence without Bias," which has received funding from the European Union's Horizon 2020 research and innovation programme,  ... 
doi:10.1002/widm.1356 · fatcat:hbdgabycvndpjmgn7rjzkkk5ma

Towards gender equity in artificial intelligence and machine learning applications in dermatology

Michelle S Lee, Lisa N Guo, Vinod E Nambudiri
2021-06-21 · JAMIA: Journal of the American Medical Informatics Association (Oxford University Press)
We present recommendations for ensuring sex and gender equity in the development of ML/AI tools in dermatology to increase desirable bias and avoid undesirable bias.  ...  We believe that sex and gender differences should be taken into consideration in ML/AI algorithms in dermatology because there are important differences in the epidemiology and clinical presentation of  ...  In considering bias in AI, 2 types of biases have been described in ML literature: desirable and undesirable biases. 11 Desirable biases result from accounting for differences between groups to allow  ... 
doi:10.1093/jamia/ocab113 · pmid:34151976 · pmcid:PMC8757299 · fatcat:jcvhwdqk5fcllf3nlz7fazwcv4

Rethinking Fairness: An Interdisciplinary Survey of Critiques of Hegemonic ML Fairness Approaches

Lindsay Weinberg
2022-05-06 · The Journal of Artificial Intelligence Research (AI Access Foundation)
that entrench "bias," are non-consensual, and lack transparency; 8) the predatory inclusion of marginalized groups into AI systems; and 9) a lack of engagement with AI's long-term social and ethical outcomes  ...  use of AI fairness measures to avoid regulation and engage in ethics washing; 6) an absence of participatory design and democratic deliberation in AI fairness considerations; 7) data collection practices  ...  This article is based upon work supported by the NSF Program on Fairness in AI in Collaboration with Amazon under Award number 1939728, "FAI: Identifying, Measuring, and Mitigating Fairness Issues in AI  ... 
doi:10.1613/jair.1.13196 · fatcat:tm5jeni76fb2lh7ryvfy673jpa

Artificial intelligence: Explainability, ethical issues and bias

Marshan Alaa
2021-08-03 · Annals of Robotics and Automation (Peertechz Publications Private Limited)
with data confidentiality and bias as well as the auditability, fairness and accountability of the AI model.  ...  Ethical issues with artificial intelligence Connected to the transparency and accountability of ML algorithms, ethical AI is another major concern that attracts attention from researchers in the AI domain  ... 
doi:10.17352/ara.000011 · fatcat:dhxlagfh2vhkpeeda5qromnzxq

Discrimination, Bias, Fairness, and Trustworthy AI

Daniel Varona, Juan Luis Suárez
2022-06-08 · Applied Sciences (MDPI)
Bias, discrimination, and fairness are mainly approached with an operational interest by the Principled AI International Framework, so we included sources from outside the framework to complement (from  ...  In this study, we analyze "Discrimination", "Bias", "Fairness", and "Trustworthiness" as working variables in the context of the social impact of AI.  ...  Nowadays, with the use of AI systems, and particularly ML models and algorithms [44] , consequential decisions are being automatically generated about people.  ... 
doi:10.3390/app12125826 · fatcat:yhhdkrhoa5ha5flavcvqycbdgm

A Framework for Fairness: A Systematic Review of Existing Fair AI Solutions [article]

Brianna Richardson, Juan E. Gilbert
2021-12-10 · arXiv · pre-print
This systematic review provides an in-depth summary of the algorithmic bias issues that have been defined and the fairness solution space that has been proposed.  ...  These needs have been organized and addressed to the parties most influential to their implementation, which includes fairness researchers, organizations that produce ML algorithms, and the machine learning  ...  Acknowledgments The authors wish to thank Hans-Martin Adorf, Don Rosenthal, Richard Franier, Peter Cheeseman and Monte Zweben for their assistance and advice.  ... 
arXiv:2112.05700v1 · fatcat:cax4fds475cbzioqat2bidgkba

Machine learning in Governments: Benefits, Challenges and Future Directions

Yulu Pi
2021-08-24 · JeDEM - eJournal of eDemocracy & Open Government
Machine learning (ML) is the fastest growing and at the same time, the most debated and controversial of these technologies.  ...  There is a conspicuous trend that organizations are seeking the use of frontier technologies with the purpose of helping the delivery of services and making day-to-day operational decisions.  ...  The US Congress proposed The Algorithmic Accountability Act of 2019, requiring companies to perform impact assessments of automated decision systems and evaluate their impact on accuracy, fairness, bias  ... 
doi:10.29379/jedem.v13i1.625 · fatcat:sgxo5z2gc5arvhvpy36t5yo6hi

Audit and Assurance of AI Algorithms: A framework to ensure ethical algorithmic practices in Artificial Intelligence [article]

Ramya Akula, Ivan Garibay
2021-07-14 · arXiv · pre-print
A modern market, auditing, and assurance of algorithms developed to professionalize and industrialize AI, machine learning, and related algorithms.  ...  From autonomous vehicles and banking to medical care, housing, and legal decisions, there will soon be enormous amounts of algorithms that make decisions with limited human interference.  ...  Discrimination and Bias Multiple forms of bias exist in AI and ML, explaining how an automated decisionmaking process may become unjust.  ... 
arXiv:2107.14046v1 · fatcat:ic4bq5f73rcxdg6we4qxdw6mcm

Non-portability of Algorithmic Fairness in India [article]

Nithya Sambasivan, Erin Arnesen, Ben Hutchinson, Vinodkumar Prabhakaran
2020-12-08 · arXiv · pre-print
Conventional algorithmic fairness is Western in its sub-groups, values, and optimizations.  ...  We argue that a mere translation of technical fairness work to Indian subgroups may serve only as a window dressing, and instead, call for a collective re-imagining of Fair-ML, by re-contextualising data  ...  Towards Algorithmic Fairness in India To account for the challenges outlined above, we need to understand and design for end-to-end chains of algorithmic power, including how AI systems are conceived,  ... 
arXiv:2012.03659v2 · fatcat:kj3qrwiufvegzeqjdk5blmwng4

An Introduction to Artificial Intelligence and Solutions to the Problems of Algorithmic Discrimination [article]

Nicholas Schmidt, Bryce Stephens
2019-11-08 · arXiv · pre-print
There is substantial evidence that Artificial Intelligence (AI) and Machine Learning (ML) algorithms can generate bias against minorities, women, and other protected classes.  ...  We propose a methodology for evaluating algorithmic fairness and minimizing algorithmic bias that aligns with the provisions of federal and state anti-discrimination statutes that outlaw overt, disparate  ...  There is substantial evidence that AI and ML algorithms can cause bias against minorities, women, and other protected classes.  ... 
arXiv:1911.05755v1 · fatcat:5iit4pmwnzcypdsihoiufs3rrm

Responsible AI by Design in Practice [article]

Richard Benjamins, Alberto Barbado, Daniel Sierra
2019-12-20 · arXiv · pre-print
There is, however, less consensus on, and experience with how to practically deal with those issues in organizations that develop and use AI, both from a technical and organizational perspective.  ...  Recently, a lot of attention has been given to undesired consequences of Artificial Intelligence (AI), such as unfair bias leading to discrimination, or the lack of explanations of the results of AI systems  ...  and accountability.  ... 
arXiv:1909.12838v2 · fatcat:orrgaoxxbfdujpqhgaqeknmtom

Can Explainable AI Explain Unfairness? A Framework for Evaluating Explainable AI [article]

Kiana Alikhademi, Brianna Richardson, Emma Drobina, Juan E. Gilbert
2021-06-14 · arXiv · pre-print
In this paper, we created a framework for evaluating explainable AI tools with respect to their capabilities for detecting and addressing issues of bias and fairness as well as their capacity to communicate  ...  We found that despite their capabilities in simplifying and explaining model behavior, many prominent XAI tools lack features that could be critical in detecting bias.  ...  Issues with the Selection & Creation of ML Models Just as the biases of the data affect the resulting algorithm, the selection of ML models and the constraint functions used to optimize the algorithm can  ... 
arXiv:2106.07483v1 · fatcat:6kzy4k4znzffjkunc52n3lhlzu

"Don't let me be misunderstood"

Stefan Strauß
2021-12-20 · TATuP - Journal for Technology Assessment in Theory and Practice (Oekom Publishers GmbH)
and ethical risks with technology.  ...  It is assumed that the mismatch between system behavior and user practice in specific application contexts due to AI‑based automation is a key trigger for bias and other societal risks.  ...  Bias in ML is a wicked problem inherent to AI.  ... 
doi:10.14512/tatup.30.3.44 · fatcat:s2gn3gimtva6vbfxhv3ubbwzdm

Taxonomy of Bias in AI: Systemic Bias Explained

Paola Di Maio
2020-08-11 · Zenodo
This note provides a top level taxonomy for categorizing AI Bias Low level developers of ML AI can struggle to grasp the overall system level due to poor neuronal connectivity in their brains.  ...  This note illustrates and explains systemic bias in a simple direct way so that even intellectually challenged low level thinkers can figure it out without too much effort.  ...  particular algorithmic bias. It defines and explains the notion of Systemic Bias, and includes a set of guiding criteria for algorithmic auditing. It is shared for reference and for discussion.  ... 
doi:10.5281/zenodo.3978830 · fatcat:353nk2mkwzhebhpniggs5lg5kq
Showing results 1-15 of 4,396