Enable AUTHORING UX approval tests creation
Background
Today there is no good story for template authors to test their templates and ensure they keep working as intended after the author makes changes or the environment around them changes (.NET Framework, Template Engine, …).
We have https://github.com/dotnet/templating/tree/main/tools/ProjectTestRunner, but it is pretty hard for a novice template author to understand and navigate, and the tooling is not shipped as a dotnet global tool that could simply be installed and used. We run tests in our repo using `Process.Start("dotnet new console")` and then checking the output (see example here). Again, this is not a good way for a template author to run and maintain tests.
Outcomes
This enables the template development inner loop. We want to support approval tests, which means a template author would:
1. Run `dotnet new console` once to create the initial (approved) content.
2. On each test run, the Template Engine provides the ability to run tests that compare the content from step 1 with what it would currently generate.
3. If a change is intentional, the author re-runs step 1 and commits the changes to git.
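The approval loop above can be sketched with plain `diff`; in practice `dotnet new console -o <dir>` would produce the real template output. The directory names and file contents here are illustrative, not part of the proposed tooling.

```shell
# Sketch of the approval-test loop: compare an approved snapshot against
# freshly generated content; directory names are illustrative.
mkdir -p approved actual
printf 'Console.WriteLine("Hello, World!");\n' > approved/Program.cs  # step 1: approved snapshot
printf 'Console.WriteLine("Hello, World!");\n' > actual/Program.cs    # stand-in for a fresh generation
if diff -r approved actual > /dev/null; then
  echo "snapshots match"
else
  echo "diff detected - re-approve if the change is intentional"
fi
# -> snapshots match
```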
Justification
- Customer impact - 1st-party customers have easy tooling to test their templates (a popular request)
- Engineering impact:
  - automated testing contributes to fewer bugs
  - automated testing reduces the amount of manual testing
  - teams don't need to invent and maintain their own testing tooling
Prerequisite
What needs to be solved: how to handle random values like PortNumber or GUIDs…
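One common way to handle such values is to scrub them to a stable token before comparison. Below is a minimal sketch of a GUID scrubber; the `{GUID}` placeholder and the helper name are assumptions for illustration, not an API of the template engine.

```shell
# Illustrative scrubber: replace GUIDs with a stable token so snapshot
# comparison stays deterministic ({GUID} is an assumed placeholder).
scrub_guids() {
  sed -E 's/[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}/{GUID}/g'
}

echo 'UserSecretsId: 3f2504e0-4f89-11d3-9a0c-0305e82c3301' | scrub_guids
# -> UserSecretsId: {GUID}
```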
Subtasks
Investigations:
- Get stats on usage of nondeterministic generators (Guid, Now, Port, Random) - @vlada-shubina
- Investigate ways of using the XUnit Verifier so that multiple verifications can be performed and reported (even if several are failing)
  - Verify.Net doesn't support verification of multiple files at the moment
  - Simon is considering implementing it in the near future
  - We stick to 1-by-1 file comparison for now
- Go through CommonTemplatesTests to assess what functionality we'll need from the test framework in order to transform and adopt these tests:
  - stdout and stderr content comparison
  - content regex matching
  - content substring matching, absence of patterns
  - newline normalization
  - custom content checking (xml parsing)
- Investigate options to programmatically change the dotnet sdk version used to run an sdk tool (as a fallback we can programmatically create and then discard global.json). We will leverage global.json for this - simplified approach:

  ```shell
  ren global.json global.json.bak
  dotnet new globaljson --sdk-version <version>
  ren global.json.bak global.json
  ```
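The content checks listed in the investigation above (substring matching, regex matching, absence of patterns, newline normalization) can be sketched with standard tools; the `normalize` helper and the sample output are illustrative only.

```shell
# Normalize CRLF to LF, then run the kinds of checks listed above.
normalize() { tr -d '\r'; }

out=$(printf 'Hello, World!\r\nWarning: none\r\n' | normalize)

echo "$out" | grep -q  'Hello, World!'       && echo "substring present"
echo "$out" | grep -qE '^Warning: (none|0)$' && echo "regex matched"
echo "$out" | grep -q  'error' || echo "forbidden pattern absent"
```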
Subtasks for MVP (not to be exposed to customers):
- Generalize and repurpose Microsoft.TemplateEngine.TemplateLocalizer as a templates authoring toolset, packaged as a NuGet package - @vlada-shubina
- Define a configuration model for a single test case ({template to be tested; dotnet sdk version; parameter values; approvals location}). Create a System.CommandLine parser transforming CLI arguments into this configuration model
- Verification logic module (the API and actual logic don't have to be polished for the first version) - @JanKrivanek
- Add a programmatic way of simple scrubbing and/or replacing, keyed by files.
- Transform and onboard CommonTemplatesTests to the new framework https://github.com/dotnet/sdk/pull/28707
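The "scrubbing keyed by files" subtask above could work along these lines; the file patterns, placeholder tokens, and helper name are illustrative assumptions, not the planned API.

```shell
# Apply a different scrubber depending on which file is being verified.
scrub_for_file() {
  case "$1" in
    *.csproj) sed -E 's|<UserSecretsId>[^<]*</UserSecretsId>|<UserSecretsId>{ID}</UserSecretsId>|' ;;
    *.json)   sed -E 's/"port": *[0-9]+/"port": {PORT}/' ;;
    *)        cat ;;   # default: leave content untouched
  esac
}

echo '<UserSecretsId>abc-123</UserSecretsId>' | scrub_for_file App.csproj
# -> <UserSecretsId>{ID}</UserSecretsId>
```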
V2 (preparation for the customer-exposed toolset):
- Define Verification module API. CLI and MSBuild interfaces should call the logic through the API
- Extract the external process wrapping logic (`Command`) from `Microsoft.DotNet.Cli.Utils`, or find another utility for wrapping CLI processes, and get rid of the copied code within `Microsoft.TemplateEngine.Authoring.TemplateVerifier` (this task might be joined with https://github.com/dotnet/templating/issues/5296)
- Support batch execution logic for multiple test cases (probably configured by files)
- Support filter/ignore lists (with default behavior that should suffice in most common cases) - e.g. to be able to ignore images, bin/* outputs, etc.
- Support for not installed templates (arbitrary location on disk)
- Support for switching sdk versions
- Document the API and CLI in docs/wiki
- Simplify and unify working with paths (include/exclude patterns, paths given to the custom scrubber and custom verifier, etc.). The tooling should be permissive and accept a mix of directory separator chars, and it should have settings for enforcing the separator char used in the paths it passes out (to the custom scrubber and verifier). Paths passed out should probably use a custom type, allowing callers to fetch a relative path (to the template root or test root), an absolute path, or paths with custom separators
- Telemetry opt-out in our integration tests (currently: https://github.com/dotnet/sdk/blob/main/src/Tests/Microsoft.NET.TestFramework/Commands/DotnetNewCommand.cs#L20; possibilities: explicitly setting the env variable in the integration test fixture, or the ability to inject env variables into the instantiator process via the API)
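The default filter/ignore behavior requested above (skip images, bin/* outputs, etc.) might be sketched as follows; the ignore patterns and directory layout are illustrative.

```shell
# Collect files for snapshot comparison, skipping common build/binary outputs.
mkdir -p out/bin out/obj
echo 'class Program {}' > out/Program.cs
echo 'binary'           > out/bin/app.dll
echo 'cache'            > out/obj/project.assets.json

find out -type f ! -path '*/bin/*' ! -path '*/obj/*' ! -name '*.png' | sort
# -> out/Program.cs
```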
Next iterations (ideally part of the first customer facing version):
- Review (and adjust if needed) signing of tooling imposed by source build - is it required for shipping?
- Rewrite more snapshot-based integration tests for template instantiation in the sdk repo to use the tooling
- Add telemetry
- Implement context detection and extraction for nondeterministic generator handling (e.g. for the Port generator, the logic should be able to detect the resulting value in the generated output and then process the output by replacing all instances of that value).
- Add Template Validator as another tool in the authoring toolset. Implement just a sample of the most important validations
- [ ] Create an MSBuild Task version of the Template Validator
- Design and use continuable errors during validation, so that as many errors as possible can be reported during a single run (while not reporting nonsense issues caused by inconsistent data detected in previous steps).
- Investigate, Design and implement deterministic mode for Macros (and hence generators): https://github.com/dotnet/templating/pull/5223
- Build in property-based testing
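The Port-generator handling described above (detect the generated value, then replace all occurrences of it) could be sketched as below; the sample content and the `{PORT}` token are illustrative assumptions.

```shell
# Detect the port the generator produced (here: from an applicationUrl-style
# line), then scrub every occurrence of that value in the output.
content='applicationUrl: http://localhost:53412
readme mentions port 53412 too'

port=$(printf '%s\n' "$content" | grep -oE 'localhost:[0-9]+' | head -n1 | cut -d: -f2)
printf '%s\n' "$content" | sed "s/$port/{PORT}/g"
# -> applicationUrl: http://localhost:{PORT}
#    readme mentions port {PORT} too
```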
Issue Analytics
- State:
- Created 2 years ago
- Comments: 23 (23 by maintainers)
Top GitHub Comments
I can make this happen
Based on a brainstorming session with @vlada-shubina, the tasks we came up with are the ones listed in the Subtasks section above.