Visiting Senior Researcher, Algorithmic Auditing - Ada Lovelace Institute
The Ada Lovelace Institute (Ada) is hiring a Visiting Senior Researcher to undertake a series of projects exploring AI auditing practices as a method for assessing and inspecting algorithmic systems and their impacts on people and society. This role is an excellent opportunity for a junior-to-mid career data scientist or computer scientist to gain practical, on-the-ground experience of AI auditing and to produce a series of outputs that will feed into contemporary AI legislative and policy debates.

The role

The position of Visiting Senior Researcher, Algorithmic Auditing sits within Ada’s Industry and Emerging Technology Research Directorate. Alongside five to six other team members, this directorate undertakes research exploring the societal implications of emerging technologies and the steps that developers of these technologies can take to address them. The Ethics & Accountability in Practice Programme is a research team within this directorate that develops methods for AI and data practitioners and regulators to evaluate and assess the potential risks, harms and impacts of AI and data-driven technologies. This role will report directly to the Senior Researcher for the Ethics & Accountability in Practice Programme.

This role will be responsible for delivering three primary projects:

- Developing a series of case studies exploring methods for AI auditing in online safety contexts
- Developing a report outlining legal and ethical challenges in AI auditing, including how these can be addressed
- Developing a report exploring how the UK and other countries can develop a ‘marketplace’ of AI auditors

This role will work on these projects with the support of the Senior Researcher and wider Ada functions, including our Comms, Operations, and Policy & Public Affairs teams. This role may also advise and contribute from time to time to other projects within the Industry and Emerging Technology Research Directorate.
In addition to these outputs, this role will be responsible for communication strategies for outputs, and for conceptualising, facilitating and attending meetings, workshops and events with a view to achieving strategic impact with key stakeholders. To date, Ada’s methodologies include working groups and expert convenings, public deliberation initiatives, desk-based research and synthesis, policy and legal analysis and translation, and ethnographic research. We welcome new kinds of expertise and methodologies into our team, and for this role we hope to attract candidates with a background in data science and/or computer science.

About you

You are a researcher or professional who may have a background researching for a policy department, a regulator, a technology company, a research institute, a charity or an academic organisation. You are curious and passionate about the issues that arise at the intersection of technology and society, and are committed to bringing an interdisciplinary and intersectional lens to understanding them. Importantly, you’ll be comfortable taking initiative, working independently and, at times, to short deadlines. You’ll enjoy working in a team environment, be willing to jump into projects, and be keen to explore areas of policy, technology and practice that you don’t already understand. You’ll appreciate the importance of high standards of rigour in research, but will also want to think creatively about communicating and influencing in novel ways.

For further information about the role, please download the full job description.

About the Ada Lovelace Institute

The Ada Lovelace Institute is an independent research institute funded and incubated by the Nuffield Foundation since 2018. Our mission is to ensure data and artificial intelligence work for people and society. We do this by building evidence and fostering rigorous debate on how data and AI affect people and society.
We recognise the power asymmetries that exist in ethical and legal debates around the development of data-driven technologies, and seek to level those asymmetries by convening diverse voices and creating a shared understanding of the ethical issues arising from data and AI. Finally, we seek to define and inform good practice in the design and deployment of AI technologies. The Institute has emerged as a leading independent voice on the ethical and societal impacts of data and AI. We have built relationships in the public, private and civil society sectors in the UK and internationally. Please find details of our work here. Our research takes an interconnected approach to issues such as power, social justice, distributional impact and climate change (read our strategy to find out more), and our team has a wide range of expertise that cuts across policy, technology, academia, industry, law and human rights. We value diversity in background, skills, perspectives and life experiences. As part of the Nuffield Foundation, we are a small team with the practical support of an established organisation that cares for its employees.

How to apply

The closing date for applications is 09:30am (GMT) on Thursday 1st November 2022, with interviews taking place in mid-December 2022. You will be required to complete some questions as part of this application process, and you are also required to upload an up-to-date copy of your CV. The Applied platform lets you save an application and resume it before submitting, ahead of the application deadline. Should you need to make an application in a different format or require any adjustments as part of the application process, please get in touch with us: firstname.lastname@example.org

Our benefits package includes:

- 28 days holiday per annum and all public holidays (with the option to buy or sell up to 5 days)
- Pension scheme that offers employer contributions of up to 11%
- Life assurance scheme
- Family leave policies that provide an enhanced level of pay
- Cycle to work scheme and loans towards season tickets
- Opportunities for learning and development
- Wellbeing support, including an employee assistance provider