Before heading to the main stage, grab some essential gear. This zone covers the ethical considerations of AI, practical prompting tips, and how to keep your data safe when working with AI tools.
AI can amplify biases present in training data, make decisions that affect people's lives, and generate misleading content. Responsible AI means being transparent about when AI is used, ensuring fairness across different groups, maintaining human oversight, and being accountable for AI-driven outcomes. It's not just about what AI can do; it's about what it should do.
Responsible AI is a set of practices for making sure AI systems are trustworthy and uphold societal principles, addressing issues such as fairness, reliability and safety, privacy and security, inclusiveness, transparency, and accountability. At Microsoft, this isn't a policy document that sits in a drawer. It shapes every product decision, from how Copilot handles your data to how new AI features are reviewed before release. Microsoft's six core principles guide the design, development, and deployment of all AI systems to ensure they're ethical, trustworthy, and human-centered.
💡 Think of Responsible AI like the safety standards at a music festival. The concert is exciting and powerful, but behind the scenes there are crowd barriers, fire exits, first aid teams, and clear rules about what's allowed. You don't see most of it, but it's what makes the whole experience trustworthy. Microsoft's Responsible AI framework is the safety infrastructure behind every AI product they ship.
Do: Be specific and provide context. Break complex tasks into steps. Give examples of what you want. Set the role or persona. Ask for structured output.
Don't: Write vague one-word prompts. Assume the AI knows your internal context. Trust outputs without verification. Share sensitive data in public AI tools. Skip reviewing AI-generated code or content.
Prompting is how you talk to AI, and the quality of what you get back depends almost entirely on the quality of what you put in. Microsoft's guidance identifies four key elements every good prompt should include: Goal (what you want Copilot to do), Context (why you need it or how you'll use it), Source (which file, email, or data to use), and Expectations (the format, tone, or audience for the response). You don't need all four every time, but the more you include, the better the result.
Weak prompt
“Write a blog post about sustainability.”
Strong prompt
“Write a 500-word blog post for our company intranet, aimed at non-technical employees, explaining three simple ways our office can reduce its carbon footprint. Keep the tone friendly and practical.”
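The four elements above can be expressed as a simple prompt template. The helper below is an illustrative sketch (the function name and structure are my own, not part of any Copilot API): it assembles Goal, Context, Source, and Expectations into a single prompt, skipping whichever elements you leave out.

```python
def build_prompt(goal, context=None, source=None, expectations=None):
    """Assemble a prompt from the four elements: Goal, Context, Source, Expectations.

    Only the goal is required; the other elements are appended when provided.
    """
    parts = [goal]
    if context:
        parts.append(f"Context: {context}")
    if source:
        parts.append(f"Use this source: {source}")
    if expectations:
        parts.append(f"Expectations: {expectations}")
    return " ".join(parts)

# Rebuilding the "strong prompt" example from its elements:
prompt = build_prompt(
    goal=("Write a 500-word blog post explaining three simple ways "
          "our office can reduce its carbon footprint."),
    context="It will be published on our company intranet for non-technical employees.",
    expectations="Keep the tone friendly and practical.",
)
```

Even a lightweight structure like this makes it harder to forget the context or expectations that turn a weak prompt into a strong one.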
💡 Think of prompting like briefing a brilliant new colleague on their first day. They're smart and capable, but they can only work with what you give them. The clearer and more specific your brief, the less time you spend correcting and the more time you spend on what actually matters.
When using AI tools, your prompts and data may be processed by third-party services. Enterprise solutions like Microsoft Copilot include data protection boundaries β your data stays within your tenant and isn't used to train public models. Always check your organization's AI usage policies, avoid pasting sensitive information into public AI tools, and understand the data residency of the AI services you use.
When you use Microsoft 365 Copilot at work, your data never leaves the Microsoft 365 service boundary. Your prompts, Copilot's responses, and any files or emails referenced are processed within the same secure environment that protects your existing Microsoft 365 data, not sent to OpenAI's public services. Copilot uses Azure OpenAI, which has much stricter enterprise controls than the consumer version.
Copilot can only access data that you are already authorized to see. It respects your existing SharePoint permissions, sensitivity labels, and access controls. If you can't open a file in SharePoint, Copilot can't read it either. There is no tenant-wide backdoor; access is always scoped to the signed-in user's permissions.
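The permission model can be illustrated with a small sketch. The names and data here are entirely hypothetical; the point is that the assistant has no credentials of its own, so every read delegates to the signed-in user's existing permissions.

```python
# Hypothetical permission table: which users can open which files.
FILE_PERMISSIONS = {
    "budget.xlsx": {"alice"},
    "roadmap.docx": {"alice", "bob"},
}

def user_can_open(user: str, filename: str) -> bool:
    """The user's existing access control, unchanged by the assistant."""
    return user in FILE_PERMISSIONS.get(filename, set())

def copilot_read(user: str, filename: str) -> str:
    """Scoped retrieval: no tenant-wide backdoor, just the user's own permission."""
    if not user_can_open(user, filename):
        raise PermissionError(f"{user} cannot access {filename}")
    return f"contents of {filename}"

# Bob can ask about the roadmap he already has access to...
summary = copilot_read("bob", "roadmap.docx")
```

If Bob asked about `budget.xlsx`, the same check that blocks him in SharePoint would block the assistant too.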
When Copilot searches the web to ground a response, the query sent to Bing is stripped of all user and tenant identifiers before it leaves Microsoft 365. Your name, domain, and tenant ID are never included in web search queries.
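The stripping step can be sketched as a simple filter. The field names below are hypothetical, not the actual wire format Microsoft uses; the idea is that identifying fields are removed before the query crosses the service boundary.

```python
# Hypothetical identifying fields to drop before a web grounding query leaves
# the Microsoft 365 boundary (field names are illustrative, not the real schema).
IDENTIFYING_FIELDS = {"user_name", "user_email", "tenant_id", "domain"}

def anonymize_query(request: dict) -> dict:
    """Return a copy of the search request with identifying fields removed."""
    return {k: v for k, v in request.items() if k not in IDENTIFYING_FIELDS}

search = anonymize_query({
    "query": "latest EU sustainability regulations",
    "user_name": "alice@contoso.com",
    "tenant_id": "guid-1234",
})
```

Only the search text itself survives the filter; who asked, and from which tenant, never does.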
💡 Think of it like a very thorough colleague who works entirely within your building, with your security badge. They can only access the rooms you can access, they never share what they find with anyone outside the building, and when they need to look something up online, they do so anonymously: no name tag, no company logo.