---
mode: 'agent'
description: 'Get best practices for XUnit unit testing, including data-driven tests'
---
# XUnit Best Practices

Your goal is to help me write effective unit tests with XUnit, covering both standard and data-driven testing approaches.
## Project Setup

- Use a separate test project with naming convention `[ProjectName].Tests`
- Reference `Microsoft.NET.Test.Sdk`, `xunit`, and `xunit.runner.visualstudio` packages
- Create test classes that match the classes being tested (e.g., `CalculatorTests` for `Calculator`)
- Use .NET SDK test commands: `dotnet test` for running tests
## Test Structure

- No test class attributes required (unlike MSTest/NUnit)
- Use fact-based tests with the `[Fact]` attribute for simple tests
- Follow the Arrange-Act-Assert (AAA) pattern
- Name tests using the pattern `MethodName_Scenario_ExpectedBehavior`
- Use the constructor for setup and `IDisposable.Dispose()` for teardown
- Use `IClassFixture<T>` for shared context between tests in a class
- Use `ICollectionFixture<T>` for shared context between multiple test classes
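The structure conventions above can be sketched as follows; `Calculator` and `CalculatorTests` are hypothetical names used only for illustration:

```csharp
using System;
using Xunit;

// Hypothetical class under test, for illustration only
public class Calculator
{
    public int Add(int a, int b) => a + b;
}

// Test class name mirrors the class under test; no class-level attribute needed
public class CalculatorTests : IDisposable
{
    private readonly Calculator _calculator;

    // The constructor replaces [SetUp]/[TestInitialize]: it runs before each test
    public CalculatorTests()
    {
        _calculator = new Calculator();
    }

    // Dispose replaces [TearDown]/[TestCleanup]: it runs after each test
    public void Dispose()
    {
        // Release any resources acquired in the constructor
    }

    [Fact]
    public void Add_TwoPositiveNumbers_ReturnsSum()
    {
        // Arrange
        int a = 2, b = 3;

        // Act
        var result = _calculator.Add(a, b);

        // Assert
        Assert.Equal(5, result);
    }
}
```

Note that xUnit creates a fresh instance of the test class for every test method, which is why constructor/`Dispose` pairs work as per-test setup and teardown.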
## Standard Tests
- Keep tests focused on a single behavior
- Avoid testing multiple behaviors in one test method
- Use clear assertions that express intent
- Include only the assertions needed to verify the test case
- Make tests independent and idempotent (can run in any order)
- Avoid test interdependencies
## Data-Driven Tests

- Use `[Theory]` combined with data source attributes
- Use `[InlineData]` for inline test data
- Use `[MemberData]` for method-based test data
- Use `[ClassData]` for class-based test data
- Create custom data attributes by implementing `DataAttribute`
- Use meaningful parameter names in data-driven tests
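A minimal sketch of the two most common data sources, `[InlineData]` and `[MemberData]` (the `MathTests` class and its data are hypothetical examples):

```csharp
using System.Collections.Generic;
using Xunit;

public class MathTests
{
    // [InlineData] supplies compile-time constant arguments directly on the attribute
    [Theory]
    [InlineData(2, 3, 5)]
    [InlineData(-1, 1, 0)]
    [InlineData(0, 0, 0)]
    public void Add_VariousInputs_ReturnsExpectedSum(int a, int b, int expected)
    {
        Assert.Equal(expected, a + b);
    }

    // [MemberData] pulls rows from a static member, useful when data
    // cannot be expressed as attribute constants
    public static IEnumerable<object[]> SquareData =>
        new List<object[]>
        {
            new object[] { 2, 4 },
            new object[] { 3, 9 },
        };

    [Theory]
    [MemberData(nameof(SquareData))]
    public void Square_VariousInputs_ReturnsExpectedResult(int input, int expected)
    {
        Assert.Equal(expected, input * input);
    }
}
```

Each data row runs as its own test case in the test explorer, so descriptive parameter names make failing rows easy to identify.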
## Assertions

- Use `Assert.Equal` for value equality
- Use `Assert.Same` for reference equality
- Use `Assert.True`/`Assert.False` for boolean conditions
- Use `Assert.Contains`/`Assert.DoesNotContain` for collections
- Use `Assert.Matches`/`Assert.DoesNotMatch` for regex pattern matching
- Use `Assert.Throws<T>` or `await Assert.ThrowsAsync<T>` to test exceptions
- Use a fluent assertions library for more readable assertions
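The assertion methods listed above can be illustrated in a couple of compact tests (the class and values here are made up for demonstration):

```csharp
using System;
using System.Collections.Generic;
using Xunit;

public class AssertionExamples
{
    [Fact]
    public void CommonAssertions()
    {
        Assert.Equal(42, 40 + 2);                 // value equality

        var list = new List<int> { 1, 2, 3 };
        Assert.Same(list, list);                  // reference equality
        Assert.True(list.Count > 0);              // boolean condition
        Assert.Contains(2, list);                 // collection membership
        Assert.DoesNotContain(9, list);
        Assert.Matches(@"^\d{3}$", "123");        // regex pattern match
    }

    [Fact]
    public void Divide_ByZero_Throws()
    {
        // Assert.Throws<T> verifies both that an exception occurs and its exact type,
        // and returns it so further assertions can inspect the instance
        var ex = Assert.Throws<DivideByZeroException>(() =>
        {
            int zero = 0;
            _ = 1 / zero;
        });
        Assert.NotNull(ex.Message);
    }
}
```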
## Mocking and Isolation
- Consider using Moq or NSubstitute alongside XUnit
- Mock dependencies to isolate units under test
- Use interfaces to facilitate mocking
- Consider using a DI container for complex test setups
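A sketch of interface-based isolation, assuming the Moq package is referenced; `IEmailSender` and `OrderService` are hypothetical types used only to show the pattern:

```csharp
using Moq;
using Xunit;

// Hypothetical dependency and unit under test, for illustration only
public interface IEmailSender
{
    void Send(string to, string body);
}

public class OrderService
{
    private readonly IEmailSender _email;
    public OrderService(IEmailSender email) => _email = email;

    public void PlaceOrder(string customerEmail)
    {
        // ...order logic elided...
        _email.Send(customerEmail, "Order confirmed");
    }
}

public class OrderServiceTests
{
    [Fact]
    public void PlaceOrder_SendsConfirmationEmail()
    {
        // Arrange: mock the dependency so the test exercises OrderService in isolation
        var emailMock = new Mock<IEmailSender>();
        var service = new OrderService(emailMock.Object);

        // Act
        service.PlaceOrder("customer@example.com");

        // Assert: verify the interaction rather than internal state
        emailMock.Verify(
            m => m.Send("customer@example.com", It.IsAny<string>()),
            Times.Once);
    }
}
```

Because `OrderService` depends on the `IEmailSender` interface rather than a concrete mailer, the mock can be substituted without touching production code.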
## Test Organization

- Group tests by feature or component
- Use `[Trait("Category", "CategoryName")]` for categorization
- Use collection fixtures to group tests with shared dependencies
- Consider output helpers (`ITestOutputHelper`) for test diagnostics
- Skip tests conditionally with `Skip = "reason"` in fact/theory attributes