awesome-copilot/prompts/csharp-mstest.prompt.md

---
mode: 'agent'
tools: ['changes', 'codebase', 'editFiles', 'problems', 'search']
description: 'Get best practices for MSTest unit testing, including data-driven tests'
---

# MSTest Best Practices

Your goal is to help me write effective unit tests with MSTest, covering both standard and data-driven testing approaches.

## Project Setup

- Use a separate test project with the naming convention `[ProjectName].Tests`
- Reference the `Microsoft.NET.Test.Sdk`, `MSTest.TestAdapter`, and `MSTest.TestFramework` packages
- Create test classes that match the classes being tested (e.g., `CalculatorTests` for `Calculator`)
- Use .NET SDK test commands: `dotnet test` for running tests

## Test Structure

- Use the `[TestClass]` attribute for test classes
- Use the `[TestMethod]` attribute for test methods
- Follow the Arrange-Act-Assert (AAA) pattern (see the sketch after this list)
- Name tests using the pattern `MethodName_Scenario_ExpectedBehavior`
- Use `[TestInitialize]` and `[TestCleanup]` for per-test setup and teardown
- Use `[ClassInitialize]` and `[ClassCleanup]` for per-class setup and teardown
- Use `[AssemblyInitialize]` and `[AssemblyCleanup]` for assembly-level setup and teardown
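
A minimal sketch of that structure, assuming a hypothetical `Calculator` class with an `Add(int, int)` method in a `MyApp` namespace:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

namespace MyApp.Tests;

[TestClass]
public class CalculatorTests
{
    private Calculator _calculator = null!;

    [TestInitialize]
    public void TestInitialize()
    {
        // Per-test setup: a fresh instance keeps each test independent.
        _calculator = new Calculator();
    }

    [TestCleanup]
    public void TestCleanup()
    {
        // Per-test teardown: dispose or reset anything the test created.
    }

    [TestMethod]
    public void Add_TwoPositiveNumbers_ReturnsSum()
    {
        // Arrange
        int first = 2, second = 3;

        // Act
        int result = _calculator.Add(first, second);

        // Assert
        Assert.AreEqual(5, result);
    }
}
```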

## Standard Tests

- Keep tests focused on a single behavior (see the example after this list)
- Avoid testing multiple behaviors in one test method
- Use clear assertions that express intent
- Include only the assertions needed to verify the test case
- Make tests independent and idempotent (they can run in any order)
- Avoid test interdependencies
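
For illustration, two focused tests under the same assumptions as above: the hypothetical `Calculator`, here with a `Divide(int, int)` method whose zero-divisor case surfaces as a `DivideByZeroException`. Each test builds its own state and verifies exactly one behavior, so they can run in any order.

```csharp
using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;

namespace MyApp.Tests;

[TestClass]
public class CalculatorDivisionTests
{
    [TestMethod]
    public void Divide_EvenlyDivisibleNumbers_ReturnsQuotient()
    {
        // Single behavior: the happy path.
        var calculator = new Calculator();

        int result = calculator.Divide(10, 2);

        Assert.AreEqual(5, result, "10 / 2 should equal 5");
    }

    [TestMethod]
    public void Divide_ByZero_ThrowsDivideByZeroException()
    {
        // Single behavior: the error path, kept in its own test.
        var calculator = new Calculator();

        Assert.ThrowsException<DivideByZeroException>(() => calculator.Divide(10, 0));
    }
}
```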

## Data-Driven Tests

- Use `[DataTestMethod]` combined with data source attributes
- Use `[DataRow]` for inline test data (see the sketch after this list)
- Use `[DynamicData]` for programmatically generated test data
- Use `[TestProperty]` to add metadata to tests
- Consider `[DataSource]` for external data sources such as CSV files
- Use meaningful parameter names in data-driven tests
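
A sketch of both inline and programmatically generated data, again assuming the hypothetical `Calculator` with an `Add(int, int)` method:

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.UnitTesting;

namespace MyApp.Tests;

[TestClass]
public class CalculatorDataDrivenTests
{
    // Inline data: each DataRow becomes its own test case.
    [DataTestMethod]
    [DataRow(1, 2, 3)]
    [DataRow(-1, 1, 0)]
    [DataRow(0, 0, 0)]
    public void Add_VariousInputs_ReturnsExpectedSum(int first, int second, int expectedSum)
    {
        var calculator = new Calculator();

        int result = calculator.Add(first, second);

        Assert.AreEqual(expectedSum, result);
    }

    // Programmatically generated data supplied through DynamicData.
    public static IEnumerable<object[]> AdditionCases()
    {
        yield return new object[] { 10, 5, 15 };
        yield return new object[] { int.MaxValue, 0, int.MaxValue };
    }

    [DataTestMethod]
    [DynamicData(nameof(AdditionCases), DynamicDataSourceType.Method)]
    public void Add_GeneratedInputs_ReturnsExpectedSum(int first, int second, int expectedSum)
    {
        var calculator = new Calculator();

        Assert.AreEqual(expectedSum, calculator.Add(first, second));
    }
}
```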

## Assertions

- Use `Assert.AreEqual` for value equality
- Use `Assert.AreSame` for reference equality
- Use `Assert.IsTrue`/`Assert.IsFalse` for boolean conditions
- Use `CollectionAssert` for collection comparisons
- Use `StringAssert` for string-specific assertions
- Use `Assert.ThrowsException<T>` to test exceptions
- Keep assertions simple and provide a failure message so failures are easy to diagnose (see the example after this list)
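
An illustrative test that exercises each assertion family with an explanatory failure message; the values are made up purely for demonstration:

```csharp
using System;
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.UnitTesting;

namespace MyApp.Tests;

[TestClass]
public class AssertionExampleTests
{
    [TestMethod]
    public void Assertions_IllustrativeExamples()
    {
        var expectedItems = new List<int> { 1, 2, 3 };
        var actualItems = new List<int> { 1, 2, 3 };
        var aliasOfActual = actualItems;
        string greeting = "Hello, MSTest";

        // Value equality vs. reference equality.
        Assert.AreEqual(3, actualItems.Count, "The list should contain three items");
        Assert.AreSame(actualItems, aliasOfActual, "Both variables should reference the same list");

        // Boolean conditions.
        Assert.IsTrue(actualItems.Contains(2), "The list should contain the value 2");

        // Collection- and string-specific assertions.
        CollectionAssert.AreEqual(expectedItems, actualItems, "Lists should match element by element");
        StringAssert.StartsWith(greeting, "Hello", "Greeting should start with 'Hello'");

        // Exception assertions.
        var emptyList = new List<int>();
        Assert.ThrowsException<ArgumentOutOfRangeException>(
            () => emptyList.RemoveAt(0),
            "Removing from an empty list should throw");
    }
}
```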

## Mocking and Isolation

- Consider using Moq or NSubstitute alongside MSTest (a Moq sketch follows this list)
- Mock dependencies to isolate units under test
- Use interfaces to facilitate mocking
- Consider using a DI container for complex test setups
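
A sketch using Moq; `IEmailService`, `OrderProcessor`, and their members are hypothetical types defined inline only to show the isolation pattern:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Moq;

namespace MyApp.Tests;

// Hypothetical production types, included only to make the sketch self-contained.
public interface IEmailService
{
    bool Send(string to, string body);
}

public class OrderProcessor
{
    private readonly IEmailService _emailService;

    public OrderProcessor(IEmailService emailService) => _emailService = emailService;

    public void ProcessOrder(int orderId) =>
        _emailService.Send("customer@example.com", $"Order {orderId} confirmed");
}

[TestClass]
public class OrderProcessorTests
{
    [TestMethod]
    public void ProcessOrder_ValidOrder_SendsConfirmationEmail()
    {
        // Arrange: mock the dependency so only OrderProcessor is under test.
        var emailService = new Mock<IEmailService>();
        emailService
            .Setup(s => s.Send(It.IsAny<string>(), It.IsAny<string>()))
            .Returns(true);

        var processor = new OrderProcessor(emailService.Object);

        // Act
        processor.ProcessOrder(42);

        // Assert: verify the interaction with the mocked dependency.
        emailService.Verify(s => s.Send(It.IsAny<string>(), It.IsAny<string>()), Times.Once());
    }
}
```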

## Test Organization

- Group tests by feature or component
- Use test categories with `[TestCategory("Category")]` (see the example after this list)
- Use test priorities with `[Priority(1)]` for critical tests
- Use `[Owner("DeveloperName")]` to indicate ownership