Senior Policy Manager
Microsoft
Responsibilities
- Lead safety by design assessments for consumer products, especially AI-driven services, by setting safety requirements, developing risk mitigation initiatives, and managing case reviews.
- Collaborate with engineering teams to evaluate product releases for digital safety risks related to AI, offer mitigation guidance, and ensure adherence to Microsoft’s Safety by Design principles. Partner with other teams, including ORA and DSB, to support incorporation of Digital Safety product policies into existing workstreams to drive efficiencies.
- Develop insights and identify patterns in consumer-hosted services to provide actionable risk mitigation guidance and general policy updates. Assess product use cases and AI deployments to create scalable, generalizable policies that address risks and impacts of harmful content across diverse social contexts.
- Manage, document, and drive progress to uphold responsible Digital Safety requirements across a diverse group of Microsoft engineering, sales, and research teams.
- Analyze global safety challenges and support the development of Digital Safety governance frameworks including creating specific Product Policies. Ensure these frameworks incorporate AI-specific considerations and enable consistent application across Microsoft’s product portfolio.
- Build relationships to orchestrate and integrate Digital Safety guidance into product and feature development. Lead implementation efforts for safety partnerships and cross-functional projects.
- Partner with engineering teams to contribute to, prioritize, and deliver on product roadmaps, primarily by evaluating upcoming product releases for Digital Safety risks and advising on the development of mitigations.
- Develop and maintain program materials. Work with Legal and Public Policy to review and incorporate time-sensitive, pertinent safety information into program materials.
Qualifications
- Master's Degree AND 3+ years work experience in trust and safety, corporate affairs, legal affairs, or a related area
- OR Bachelor's Degree AND 4+ years work experience in trust and safety, corporate affairs, legal affairs, or a related area
- OR equivalent experience.
- Experience with product policy or AI safety
- Experience drafting and managing trust and safety policies or programs.
- Experience managing a Digital Safety, trust and safety, or related program.
- Experience working in or with a large, matrixed organization.
Corporate Social Responsibility IC4 - The typical base pay range for this role across the U.S. is USD $105,800 - $204,000 per year. A different range applies to specific work locations within the San Francisco Bay Area and the New York City metropolitan area; the base pay range for this role in those locations is USD $136,200 - $223,300 per year.
Certain roles may be eligible for benefits and other compensation. Find additional benefits and pay information here:
https://careers.microsoft.com/us/en/us-corporate-pay
This position will be open for a minimum of 5 days, with applications accepted on an ongoing basis until the position is filled.
Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance with religious accommodations and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.