
Convert wiki to docs

Commit migrated from 068c2c5f10
Richard Lander 2015-05-30 11:55:48 -07:00
parent e0424038bc
commit 2324c4a04a
21 changed files with 177 additions and 34 deletions

docs/libraries/README.md Normal file
@@ -0,0 +1,58 @@
Documents Index
===============
Learn about .NET Core
====================
- [Brief Intro to .NET Core](https://github.com/dotnet/coreclr/blob/master/Documentation/dotnetcore-intro.md)
- [[WIP] Official .NET Core Docs](http://dotnet.readthedocs.org)
Get .NET Core
=============
- [Get .NET Core DNX SDK on Windows](https://github.com/dotnet/coreclr/blob/master/Documentation/get-dotnetcore-dnx-windows.md)
- [Get .NET Core DNX SDK on OS X](https://github.com/dotnet/coreclr/blob/master/Documentation/get-dotnetcore-dnx-osx.md)
- [Get .NET Core DNX SDK on Linux](https://github.com/dotnet/coreclr/blob/master/Documentation/get-dotnetcore-dnx-linux.md)
- [Get .NET Core (Raw) on Windows](https://github.com/dotnet/coreclr/blob/master/Documentation/get-dotnetcore-windows.md)
Project Docs
============
- [Developer Guide](developer-guide.md)
- [Project priorities](https://github.com/dotnet/coreclr/blob/master/Documentation/project-priorities.md)
- [Contributing to CoreFX](contributing.md)
- [Contributing to .NET Core](https://github.com/dotnet/coreclr/blob/master/Documentation/contributing.md)
- [Contributing Workflow](https://github.com/dotnet/coreclr/blob/master/Documentation/contributing-workflow.md)
- [Issue Guide](issue-guide.md)
- [Branching Guide](branching-guide.md)
- [API Review Process](api-review-process.md)
- [Strong Name Signing](strong-name-signing.md)
- [Open Source Signing](oss-signing.md)
- [Repo Organization](repo-organization.md)
Coding Guidelines
=================
- [C# coding style](coding-style.md)
- [Framework Design Guidelines](framework-design-guidelines-digest.md)
- [Cross-Platform Guidelines](cross-platform-guidelines.md)
- [Performance Guidelines](performance-guidelines.md)
- [Interop Guidelines](interop-guidelines.md)
- [Breaking Changes](breaking-changes.md)
- [Breaking Change Definitions](breaking-change-definitions.md)
- [Breaking Change Rules](breaking-change-rules.md)
Building from Source
====================
- [Building on Linux](linux-instructions.md)
- [Code Coverage](code-coverage.md)
Other Information
=================
- [CoreCLR Repo documentation](https://github.com/dotnet/coreclr/tree/master/Documentation)
- [Porting to .NET Core](support-dotnet-core-instructions.md)
- [.NET Standards (Ecma)](https://github.com/dotnet/coreclr/blob/master/Documentation/dotnet-standards.md)
- [MSDN Entry for the CLR](http://msdn.microsoft.com/library/8bs2ecf4.aspx)
- [Wikipedia Entry for the CLR](http://en.wikipedia.org/wiki/Common_Language_Runtime)

@@ -0,0 +1,58 @@
API Review Process
==================
The .NET Framework has a long-standing history of taking API usability extremely seriously. Thus, we generally review every single API that is added to the product. This page discusses how we conduct API reviews for components that are open sourced.
## Process Goals
The key goals are:
* **Designed for GitHub**. In order to be sustainable and not be a hurdle for contributors the API review process must feel natural to folks familiar with GitHub.
* **Efficiency**. Performing API reviews requires looping in a set of experts. We want to conduct API reviews in an agile fashion without randomizing the reviewers or community members.
* **Transparency**. We can use the same process for both internal as well as external contributors. This allows contributors to benefit from the results of API reviews even if the implementer isn't external.
## Overall Process
GitHub is generally based around the pull-request model. The idea is that contributors perform their changes in their own fork and submit a pull request against our repository.
For trivial code changes, such as typo fixes, we want folks to directly submit a pull request rather than opening an issue. However, for bug fixes or feature work, we want contributors to first start a discussion by creating an issue.
For work that involves adding new APIs we'd like the issue to contain what we call a *speclet*. The speclet should provide a rough sketch of how the APIs are intended to be used, with sample code that shows typical scenarios. The goal isn't to be complete but rather to illustrate the direction so that readers can judge whether the proposal is sound. Here is [a good example](https://github.com/dotnet/corefx/issues/271).
![API Review Process](images/api-review-process.png)
## Steps
* **Contributor opens an issue**. The issue description should contain a speclet that represents a sketch of the new APIs, including samples on how the APIs are being used. The goal isn't to get a complete API list, but a good handle on how the new APIs would roughly look like and in what scenarios they are being used. Here is [a good example](https://github.com/dotnet/corefx/issues/271).
* **Community discusses the proposal**. If changes are necessary, the contributor is encouraged to edit the issue description. This allows folks joining later to understand the most recent proposal. To avoid confusion, the contributor should maintain a tiny change log, like a bolded "Updates:" followed by a bullet point list of the updates that were being made.
* **Issue is tagged as "Accepting PRs"**. Once the contributor and project owner agree on the overall shape and direction, the project owner tags the issue as "Accepting PRs". The contributor should indicate whether they will be providing the PR or only contributed the idea.
* **Coding**. The contributor is implementing the APIs as discussed. Minor deviations are OK, but if during the implementation the design starts to take a major shift, the contributor is encouraged to go back to the issue and raise the concerns with the current proposal.
* **Pull request is being created**. Once the contributor believes the implementation is ready for review, she creates a pull request, referencing the issue created in the first step. In order to call dibs, you can also create the PR before it's completely ready. Use checkboxes to indicate which areas are still missing so that we know it's not ready for review yet. [Here is a good example](https://github.com/dotnet/corefx/pull/316). At this time, if any new APIs are being added to a type that has shipped in the full .NET Framework, submit the pull request to the *future* branch. See [Branching Guide](branching-guide.md).
* **Pull request is being reviewed**. The community reviews the code for the pull request. The review should focus on the code changes and architecture - not the APIs themselves. Once at least two project owners give their OK, the PR is considered good to go.
* **Pull request is tagged as "Needs API Review"**. The project owner then marks the pull request as "Needs API Review".
* **API review**. Using the information in the pull request we'll create an APIX file that constitutes the API delta. The API review board meets multiple times a week to review all PRs that are tagged as needing an API review.
* **API review notes are being published**. After the review, we'll publish the notes in the [API Review repository](https://github.com/dotnet/apireviews). A good example is the [review of immutable collections](https://github.com/dotnet/apireviews/tree/master/2015-01-07-immutable).
* **Pull request is updated with the results of the API Review**. Once the API review is complete, the project owner uploads the notes and API HTML diff, including all comments. The project owner also updates the PR accordingly, with either a call to action to address some concerns or a good to go indicator.
* **Pull request is merged**. When there are no issues, or the issues have been addressed by the contributor, the PR is merged.
## API Design Guidelines
The .NET design guidelines are captured in the famous book [Framework Design Guidelines](http://amazon.com/dp/0321545613) by Krzysztof Cwalina and Brad Abrams.
A digest with the most important guidelines is available in the [Framework Design Guidelines Digest](framework-design-guidelines-digest.md). Long term, we'd like to publish the individual guidelines in a standalone repo on which we can also accept PRs and -- more importantly for API reviews -- link to.
## API Review Notes
The API review notes are published in the [API Review repository](https://github.com/dotnet/apireviews).

@@ -0,0 +1,26 @@
Branching Guide
===============
We will have the following branches in the corefx repository:
* **master**
* Where most development happens
* Submit your PRs here unless you are adding API to a type that exists in the full .NET Framework
* **future**
* Landing place for fully API and code reviewed changes that are not to be part of the next upcoming release.
* Submit your PRs here if you're adding surface area to a type that has shipped in the full .NET Framework as we can no longer accept those changes and achieve our compatibility goal for the first release of .NET Core
* Takes regular merges from master
* Once we snap for the first release, we will merge future to master and delete future
* **release/[name]**
* Release branches snapped from master.
* Do not submit pull requests to these branches
* Fixes here do not flow to follow-up releases
* Generally, fixes made after a snap that need to make it into a release will go into master and get cherry-picked to the release branch.
* **dev/[name]**
* Features (aka topics) under active development by more than one developer.
* Submit PRs here only if you've made prior arrangements to work on something in one of these branches.
* It is up to the developers creating these branches to decide what level of review is required
* These features will only ship if they are successfully pulled to master or future via the standard PR and API review process.

@@ -0,0 +1,38 @@
Breaking Change Definitions
===========================
Behavioral Change
-----------------
A behavioral change represents changes to the behavior of a member. A behavioral change may include throwing a new exception, adding or removing internal method calls, or altering the way in which a return value is calculated. Behavioral changes can be the hardest type of change to categorize as acceptable or not - they can be severe in impact, or relatively innocuous.
Binary Compatibility
--------------------
Refers to the ability of existing consumers of an API to be able to use a newer version without recompilation. By definition, if an assembly's public signatures have been removed, or altered so that consumers can no longer access the same interface exposed by the assembly, the change is said to be a _binary incompatible change_.
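As a minimal sketch (the `Parser` type and its method are made up purely to illustrate the definition), changing a public method's signature is a binary incompatible change even when most callers would recompile cleanly:

```C#
public static class Parser
{
    // Shipped in version 1.0:
    //   public static int Parse(string text)
    //
    // Changed in version 2.0 to the signature below. Consumers compiled against
    // 1.0 fail to bind at runtime (MissingMethodException), so the change is
    // binary incompatible even though most callers recompile without edits.
    public static long Parse(string text) { return long.Parse(text); }
}
```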
Source Compatibility
--------------------
Refers to the ability of existing consumers of an API to recompile against a newer version without any source changes. By definition, if a consumer needs to make changes to its code in order for it to build successfully against a newer version of an API, the change is said to be a _source incompatible change_.
Design-Time Compatibility
-------------------------
_Design-time compatibility_ refers to preserving the design-time experience across versions of Visual Studio and other design-time environments. This can involve details around the UI of the designer, but by far the most interesting design-time compatibility is project compatibility. A project (or solution) must be able to be opened and used on a newer version of a designer.
Backwards Compatibility
-----------------------
_Backwards compatibility_ refers to the ability of an existing consumer of an API to run against, and behave in the same way against a newer version. By definition, if a consumer is not able to run, or behaves differently against the newer version of the API, then the API is said to be _backwards incompatible_.
Changes that affect backwards compatibility are strongly discouraged. All alternates should be actively considered, since developers will, by default, expect backwards compatibility in newer versions of an API.
Forwards Compatibility
----------------------
_Forwards compatibility_ is the exact reverse of backwards compatibility; it refers to the ability of an existing consumer of an API to run against, and behave in the same way against, an _older_ version. By definition, if a consumer is not able to run, or behaves differently against an older version of the API, then the API is said to be _forwards incompatible_.
Changes that affect forwards compatibility are generally less pervasive, and there is not as stringent a demand to ensure that such changes are not introduced. Customers accept that a consumer which relies upon a newer API, may not function correctly against the older API.
This document does not attempt to detail forwards incompatibilities.

@@ -0,0 +1,216 @@
Breaking Change Rules
=====================
* [Behavioral Changes](#behavioral-changes)
* [Property, Field, Parameter and Return Values](#property-field-parameter-and-return-values)
* [Exceptions](#exceptions)
* [Platform Support](#platform-support)
* [Code](#code)
* [Source and Binary Compatibility Changes](#source-and-binary-compatibility-changes)
* [Assemblies](#assemblies)
* [Types](#types)
* [Members](#members)
* [Signatures](#signatures)
* [Attributes](#attributes)
## Behavioral Changes
### Property, Field, Parameter and Return Values
✓ **Allowed**
* Increasing the range of accepted values for a property or parameter if the member _is not_ `virtual`
* Returning a more derived type for a property, field, return or `out` value
✗ **Disallowed**
* Increasing the range of accepted values for a property or parameter if the member _is_ `virtual`
> This is breaking because any existing overridden members will now not function correctly for the extended range of values.
* Decreasing the range of accepted values for a property or parameter, such as a change in parsing of input and throwing new errors (even if parsing behavior is not specified in the docs)
* Increasing the range of returned values for a property, field, return or `out` value
* Changing the returned values for a property, field, return or `out` value, such as the value returned from `ToString`
> If you had an API which returned a value from 0-10, but actually intended to divide the value by two and forgot (returning only 0-5), then changing the return to now give the correct value is a breaking change.
* Changing the default value for a property, field or parameter (either via an overload or default value)
* Changing the value of an enum member
* Changing the precision of a numerical return value
### Exceptions
✓ **Allowed**
* Throwing a more derived exception than an existing exception
> For example, `CultureInfo.GetCultureInfo(String)` used to throw `ArgumentException` in .NET Framework 3.5. In .NET Framework 4.0, this was changed to throw `CultureNotFoundException` which derives from `ArgumentException`, and therefore is an acceptable change (see the sketch after this list).
* Throwing a more specific exception than `NotSupportedException`, `NotImplementedException`, `NullReferenceException` or an exception that is considered unrecoverable
> Unrecoverable exceptions should not be getting caught and will be dealt with on a broad level by a high-level catch-all handler. Therefore, users are not expected to have code that catches these explicit exceptions. The list of unrecoverable exceptions is:
* `StackOverflowException`
* `SEHException`
* `ExecutionEngineException`
* `AccessViolationException`
* Throwing a new exception that only applies to a code-path which can only be observed with new parameter values or state (that couldn't be hit by existing code targeting the previous version)
* Removing an exception that was being thrown when the API allows more robust behavior or enables new scenarios
> For example, a Divide method which only worked on positive values, but threw an exception otherwise, can be changed to support all values and the exception is no longer thrown.
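A minimal sketch of why the `CultureInfo` change called out above is non-breaking for existing callers (illustrative only):

```C#
using System;
using System.Globalization;

class Caller
{
    static void Main()
    {
        try
        {
            // Written and compiled against .NET Framework 3.5, where this threw ArgumentException.
            CultureInfo culture = CultureInfo.GetCultureInfo("not-a-culture");
        }
        catch (ArgumentException)
        {
            // On 4.0 the call throws CultureNotFoundException instead, but because it
            // derives from ArgumentException this existing handler still catches it.
        }
    }
}
```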
✗ **Disallowed**
* Throwing a new exception in any other case not listed above
* Removing an exception in any other case not listed above
### Platform Support
✓ **Allowed**
* An operation previously not supported on a specific platform is now supported
✗ **Disallowed**
* An operation previously supported on a specific platform is no longer supported, or now requires a specific service-pack
### Code
✓ **Allowed**
* A change which is directly intended to increase performance of an operation
> The ability to modify the performance of an operation is essential in order to ensure we stay competitive, and we continue to give users operational benefits. This can break anything which relies upon the current speed of an operation, sometimes visible in badly built code relying upon asynchronous operations. Note that the performance change should have no effect on other behavior of the API in question, otherwise the change will be breaking.
* A change which indirectly, and often adversely, affects performance
> Assuming the change in question is not categorized as breaking for some other reason, this is acceptable. Often, actions need to be taken which may include extra operation calls, or new functionality. This will almost always affect performance, but may be essential to make the API in question function as expected.
* Changing the text of an error message
> Not only should users not rely on these text messages, but they change anyway based on culture
* Calling a brand new event that wasn't previously defined.
✗ **Disallowed**
* Adding the `checked` keyword to a code-block
> This may cause code in a block to begin throwing exceptions, which is an unacceptable change.
* Changing the order in which events are fired
> Developers can reasonably expect events to fire in the same order.
* Removing the raising of an event on a given action
* Changing a synchronous API to asynchronous (and vice versa)
* Firing an existing event when it was never fired before
* Changing the number of times given events are called
## Source and Binary Compatibility Changes
### Assemblies
✓ **Allowed**
* Making an assembly portable when the same platforms are still supported
✗ **Disallowed**
* Changing the name of an assembly
* Changing the public key of an assembly
### Types
✓ **Allowed**
* Adding the `sealed` or `abstract` keyword to a type when there are _no accessible_ (public or protected) constructors
* Increasing the visibility of a type
* Introducing a new base class
> So long as it does not introduce any new abstract members or change the semantics or behavior of existing members, a type can be introduced into a hierarchy between two existing types. For example, between .NET Framework 1.1 and .NET Framework 2.0, we introduced `DbConnection` as a new base class for `SqlConnection` which previously derived from `Component`.
* Adding an interface implementation to a type
> This is acceptable because it will not adversely affect existing clients. Any changes which could be made to the type being changed in this situation, will have to work within the boundaries of acceptable changes defined here, in order for the new implementation to remain acceptable.
> Extreme caution is urged when adding interfaces that directly affect the ability of the designer or serializer to generate code or data, that cannot be consumed down-level. An example is the `ISerializable` interface.
* Removing an interface implementation from a type when the interface is already implemented lower in the hierarchy
* Moving a type from one assembly into another assembly
> The old assembly must be marked with `TypeForwardedToAttribute` pointing to the new location
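For example, a hypothetical move of a `Contoso.Widget` type out of OldAssembly would leave a forwarder behind (the names here are made up for illustration):

```C#
// Remaining in OldAssembly after the move; Contoso.Widget itself now lives only in NewAssembly.
using System.Runtime.CompilerServices;

[assembly: TypeForwardedTo(typeof(Contoso.Widget))]

// Existing binaries that still reference the type through OldAssembly are
// redirected to NewAssembly at runtime, so the move is not breaking.
```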
✗ **Disallowed**
* Adding the `sealed` or `abstract` keyword to a type when there _are accessible_ (public or protected) constructors
* Decreasing the visibility of a type
* Removing the implementation of an interface on a type
> It is not breaking when you added the implementation of an interface which derives from the removed interface. For example, you removed `IDisposable`, but implemented `IComponent`, which derives from `IDisposable`.
* Removing one or more base classes for a type, including changing `struct` to `class` and vice versa
* Changing the namespace or name of a type
### Members
✓ **Allowed**
* Adding an abstract member to a public type when there are _no accessible_ (`public` or `protected`) constructors, or the type is `sealed`
* Moving a member onto a class higher in the hierarchy tree of the type from which it was removed
* Increasing the visibility of a member that is not `virtual`
* Decreasing the visibility of a `protected` member when there are _no accessible_ (`public` or `protected`) constructors or the type is `sealed`
* Changing a member from `abstract` to `virtual`
* Adding `virtual` to a member
> Note that marking a member virtual might cause previous consumers to still call the member non-virtually.
* Introducing or removing an override
> Note that introducing an override might cause previous consumers to skip over the override when calling `base`.
✗ **Disallowed**
* Adding a member to an interface
* Adding an abstract member to a type when there _are accessible_ (`public` or `protected`) constructors and the type is not `sealed`
* Adding a constructor to a class which previously had no constructor, without also adding the default constructor
* Adding an overload that precludes an existing overload, and defines different behavior
> This will break existing clients that were bound to the previous overload. For example, if you have a class that has a single version of a method that accepts a `uint`, an existing consumer will successfully bind to that overload if simply passing an `int` value. However, if you add an overload that accepts an `int`, then when recompiling or using late binding the application will now bind to the new overload. If different behavior results, then this is a breaking change (see the sketch after this list).
* Removing or renaming a member, including a getter or setter from a property or enum members
* Decreasing the visibility of a `protected` member when there _are accessible_ (`public` or `protected`) constructors and the type is not `sealed`
* Adding or removing `abstract` from a member
* Removing the `virtual` keyword from a member
* Adding or removing `static` keyword from a member
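A sketch of the overload-binding rule called out above (the `Printer` type is hypothetical, shown for illustration only):

```C#
using System;

public static class Printer
{
    // Overload that shipped in version 1.0:
    public static void Print(uint value) { Console.WriteLine("uint overload"); }

    // Overload added in version 2.0:
    public static void Print(int value) { Console.WriteLine("int overload"); }
}

// A caller written as Printer.Print(5) bound to Print(uint) when compiled
// against 1.0. Recompiled against 2.0 (or bound late), the literal 5 now
// prefers Print(int); if the overloads behave differently, the caller breaks.
```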
### Signatures
✓ **Allowed**
* Adding `params` to a parameter
* Removing `readonly` from a field, unless the static type of the field is a mutable value type
✗ **Disallowed**
* Adding `readonly` to a field
* Adding the `FlagsAttribute` to an enum
* Changing the type of a property, field, parameter or return value
* Adding, removing or changing the order of parameters
* Removing `params` from a parameter
* Adding or removing `out` or `ref` keywords from a parameter
* Renaming a parameter (including case)
> This is considered breaking for two reasons:
* It breaks late-bound scenarios, such as Visual Basic's late-binding feature and C#'s `dynamic`
* It breaks source compatibility when developers use [named parameters](http://msdn.microsoft.com/en-us/library/dd264739.aspx).
* Changing a parameter modifier from `ref` to `out`, or vice versa
### Attributes
✓ **Allowed**
* Changing the value of an attribute that is _not observable_
✗ **Disallowed**
* Removing an attribute
> Although this item can be addressed on a case to case basis, removing an attribute will often be breaking. For example, `NonSerializedAttribute`
* Changing values of an attribute that _is observable_

@@ -0,0 +1,62 @@
Breaking Changes
================
We take compatibility in the .NET Framework and .NET Core extremely seriously.
Although .NET Core can be deployed app local, we are engineering it such that portable libraries can target it and still run on the full desktop framework as well. This means that the behavior of the full .NET Framework constrains the implementation of any overlapping API in .NET Core.
Below is a summary of some documentation we have internally about what kinds of things constitute breaking changes, how we categorize them, and how we decide what we're willing to take.
Note that these rules only apply to APIs that have shipped in a previous RTM release. New APIs still under development can be modified, but we are still cautious not to disrupt the ecosystem unnecessarily when prerelease APIs change.
To help triage breaking changes, we classify them into four buckets:
1. Public Contract
2. Reasonable Grey Area
3. Unlikely Grey Area
4. Clearly Non-Public
### Bucket 1: Public Contract
*Clear violation of public contract.*
Examples:
* throwing a new/different exception type in an existing common scenario
* An exception is no longer thrown
* A different behavior is observed after the change for an input
* renaming a public type, member, or parameter
* decreasing the range of accepted values within a given parameter
* A new instance field is added to a type (impacts serialization)
* changing the value of a public constant or enum member
### Bucket 2: Reasonable Grey Area
*Change of behavior that customers would have reasonably depended on.*
Examples:
* change in timing/order of events (even when not specified in docs)
* change in parsing of input and throwing new errors (even if parsing behavior is not specified in the docs)
These require judgment: how predictable, obvious, consistent was the behavior?
### Bucket 3: Unlikely Grey Area
*Change of behavior that customers could have depended on, but probably wouldn't.*
**Examples:**
* correcting behavior in a subtle corner case
As with type 2 changes, these require judgment: what is reasonable and what's not?
### Bucket 4: Clearly Non-Public
*Changes to surface area or behavior that is clearly internal or non-breaking in theory, but breaks an app.*
**Examples:**
* Changes to internal API that break private reflection
It is impossible to evolve a code base without making such changes, so we don't require up-front approval for these, but we will sometimes have to go back and revisit such change if there's too much pain inflicted on the ecosystem through a popular app or library.
This bucket is painful for the machine-wide .NET Framework, but we do have much more latitude here in .NET Core.
### What This Means for Contributors
* All bucket 1, 2, and 3 breaking changes require talking to the repo owners first.
* If you're not sure which bucket applies to a given change, contact us as well.
* It doesn't matter if the old behavior is "wrong", we still need to think through the implications.
* If a change is deemed too breaking, we can help identify alternatives such as introducing a new API and obsoleting the old one.

@@ -0,0 +1,62 @@
Code Coverage
=============
"Code coverage" is a measure that indicates how much of our library code is exercised by our test suites. We measure code coverage using the [OpenCover](https://github.com/opencover/opencover), and a report of our latest code coverage results can be seen by clicking the coverage badge on the [CoreFX home page](https://github.com/dotnet/corefx):
[![Coverage status](https://img.shields.io/badge/coverage-report-blue.svg)](http://dotnet-ci.cloudapp.net/job/dotnet_corefx_coverage_windows/lastBuild/Code_Coverage_Report/)
This report shows each library currently being tested with code coverage and provides statistics around the quality of the code coverage for the library. It also provides a line-by-line breakdown of what lines are being covered and what lines are not.
## Goals
The code coverage report provides a percentage value per library of the number of source lines exercised by the tests. There is no hard and fast percentage that must be obtained per library, as every library is unique and comes with its own set of intricacies and constraints. While in some cases it's possible and reasonable to achieve 100% code coverage, this is rare. There are many valid reasons certain pieces of code won't be exercised in tests, e.g.:
- A code file is compiled into multiple projects, and only some of the code is used in each project.
- Code exists to handle rare race conditions too costly to simulate in normal conditions.
- Code exists to handle particular machine/OS configurations that are not used during code coverage runs.
Etc. What's important is that the right set of tests exist to ensure that the code is behaving properly and that regressions in functionality can be caught quickly, and code coverage metrics are a way to help guide us to that end.
Our default, somewhat-arbitrary initial goal for a library is 90% code coverage. That doesn't mean we're done with testing once a library hits 90%, nor does it mean we must keep going with a library until it hits 90%. We use this metric and the associated coverage information to help guide us towards the ideal for a given library.
(Note that we do not want to arbitrarily inflate our code coverage numbers. Tests must provide value in and of themselves and should not simply be written in a haphazard manner meant to execute more lines of code without providing real value.)
## Issues
Issues are opened for a library when a cursory examination of its code coverage reveals that there are likely still some meaningful gaps that need to be addressed. We welcome contributions to our test suites to help address these gaps and close these issues. Many of these issues are marked as "up for grabs".
An issue need not be addressed in its entirety. We happily accept contributions that improve our tests and work towards improving code coverage numbers even if they only incrementally improve the situation.
## Automated Code Coverage Runs
Code coverage runs are performed by Jenkins approximately twice a day. The results of these runs are all available from the site linked to by the code coverage badge on the home page.
## Local Code Coverage Runs
You can perform code coverage runs locally on your own machine. Normally to build your entire CoreFX repo, from the root of your repo you'd run:

    build

To include code coverage in this run, augment it with the ```/p:Coverage=true``` argument:

    build /p:Coverage=true

This will do the build and testing as with the normal ```build```, but it will run the tests using the OpenCover tool. A resulting index.htm file providing the results of the run will be available at:

    bin\tests\coverage\index.htm

You can also build and test with code coverage for a particular test project rather than for the whole repo. Normally to build and test a particular test suite, from the same directory as that test suite's .csproj, you'd run:

    msbuild /t:BuildAndTest

To do so with code coverage, as with ```build``` append the ```/p:Coverage=true``` argument:

    msbuild /t:BuildAndTest /p:Coverage=true

The results for this one library will then also show up in the aforementioned index.htm file. For example, to build, test, and get code coverage results for the System.Diagnostics.Debug library, from the root of my repo I can do:

    cd src\System.Diagnostics.Debug\tests\
    msbuild /t:BuildAndTest /p:Coverage=true

And then once the run completes:

    ..\..\..\bin\tests\coverage\index.htm

@@ -0,0 +1,128 @@
C# Coding Style
===============
For non-.cs files (C++, XML, etc.) our current best guidance is consistency. When editing files, keep new code and changes consistent with the style in the files. For new files, it should conform to the style for that component. Last, if there's a completely new component, anything that is reasonably broadly accepted is fine.
The general rule we follow is "use Visual Studio defaults".
1. We use [Allman style](http://en.wikipedia.org/wiki/Indent_style#Allman_style) braces, where each brace begins on a new line. A single line statement block can go without braces but the block must be properly indented on its own line and it must not be nested in other statement blocks that use braces (See issue [381](https://github.com/dotnet/corefx/issues/381) for examples).
2. We use four spaces of indentation (no tabs).
3. We use `_camelCase` for internal and private members and use `readonly` where possible. Prefix instance fields with `_`, static fields with `s_` and thread static fields with `t_`.
4. We avoid `this.` unless absolutely necessary.
5. We always specify the visibility, even if it's the default (i.e.
`private string _foo` not `string _foo`).
6. Namespace imports should be specified at the top of the file, *outside* of
`namespace` declarations and should be sorted alphabetically, with `System.`
namespaces at the top and blank lines between different top level groups.
7. Avoid more than one empty line at any time. For example, do not have two
blank lines between members of a type.
8. Avoid spurious free spaces.
For example avoid `if (someVar == 0)...`, where the dots mark the spurious free spaces.
Consider enabling "View White Space (Ctrl+E, S)" if using Visual Studio, to aid detection.
9. If a file happens to differ in style from these guidelines (e.g. private members are named `m_member`
rather than `_member`), the existing style in that file takes precedence.
10. We only use `var` when it's obvious what the variable type is (i.e. `var stream = new FileStream(...)` not `var stream = OpenStandardInput()`).
11. We use language keywords instead of BCL types (i.e. `int, string, float` instead of `Int32, String, Single`, etc) for both type references as well as method calls (i.e. `int.Parse` instead of `Int32.Parse`). See issue [391](https://github.com/dotnet/corefx/issues/391) for examples.
12. We use PascalCasing to name all our constant local variables and fields. The only exception is for interop code where the constant value should exactly match the name and value of the code you are calling via interop.
We have provided a Visual Studio 2013 vssettings file (`corefx.vssettings`) at the root of the corefx repository, enabling C# auto-formatting conforming to the above guidelines. Note that rules 7 and 8 are not covered by the vssettings, since these are not rules currently supported by VS formatting.
We also use the [.NET Codeformatter Tool](https://github.com/dotnet/codeformatter) to ensure the code base maintains a consistent style over time, the tool automatically fixes the code base to conform to the guidelines outlined above.
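As a quick illustration of rules 3, 5, and 12 before the full example files below (the type and member names here are made up for the sketch):

```C#
using System;

public class ExampleCache
{
    // Rule 12: constants are PascalCased.
    private const int DefaultCapacity = 16;

    // Rule 3: instance fields are prefixed with _, static fields with s_,
    // thread static fields with t_; readonly is used where possible.
    private readonly int _capacity;
    private static int s_instanceCount;
    [ThreadStatic]
    private static int t_lookupsOnThisThread;

    // Rule 5: visibility is always specified, even when it is the default.
    private string _name;
}
```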
### Example File:
``ObservableLinkedList`1.cs:``
```C#
using System;
using System.Collections;
using System.Collections.Generic;
using System.Collections.Specialized;
using System.ComponentModel;
using System.Diagnostics;
using Microsoft.Win32;
namespace System.Collections.Generic
{
public partial class ObservableLinkedList<T> : INotifyCollectionChanged, INotifyPropertyChanged
{
private ObservableLinkedListNode<T> _head;
private int _count;
public ObservableLinkedList(IEnumerable<T> items)
{
if (items == null)
throw new ArgumentException("items");
foreach (T item in items)
{
AddLast(item);
}
}
public event NotifyCollectionChangedEventHandler CollectionChanged;
public int Count
{
get { return _count; }
}
public ObservableLinkedListNode AddLast(T value)
{
var newNode = new LinkedListNode<T>(this, value);
InsertNodeBefore(_head, newNode);
}
protected virtual void OnCollectionChanged(NotifyCollectionChangedEventArgs e)
{
NotifyCollectionChangedEventHandler handler = CollectionChanged;
if (handler != null)
{
handler(this, e);
}
}
private void InsertNodeBefore(LinkedListNode<T> node, LinkedListNode<T> newNode)
{
...
}
...
}
}
```
``ObservableLinkedList`1.ObservableLinkedListNode.cs:``
```C#
using System;
using System.Diagnostics;
namespace System.Collections.Generic
{
partial class ObservableLinkedList<T>
{
public class ObservableLinkedListNode
{
private readonly ObservableLinkedList<T> _parent;
private readonly T _value;
internal ObservableLinkedListNode(ObservableLinkedList<T> parent, T value)
{
Debug.Assert(parent != null);
_parent = parent;
_value = value;
}
public T Value
{
get { return _value; }
}
}
...
}
}
```

@@ -0,0 +1,17 @@
Contributing to CoreFX
======================
This document describes contribution guidelines that are specific to CoreFX. Please read [.NET Core Guidelines](https://github.com/dotnet/coreclr/blob/master/Documentation/contributing.md) for more general .NET Core contribution guidelines.
Coding Style Changes
--------------------
We intend to bring dotnet/corefx into full conformance with the style guidelines described in [Coding Style](coding-style.md). We plan to do that with tooling, in a holistic way. In the meantime, please:
* **DO NOT** send PRs for style changes.
* **DO** give priority to the current style of the project or file you're changing even if it diverges from the general guidelines.
API Changes
-----------
* **DON'T** submit API additions to any type that has shipped in the full .NET Framework to the *master* branch. Instead, use the *future* branch. See [Branching Guide](branching-guide.md).

@@ -0,0 +1,28 @@
Cross-Platform Guidelines
=========================
This page provides a FAQ for how we handle cross-platform code in CoreFX. (For structuring of interop code, see [interop guidelines](interop-guidelines.md).)
#### Should assemblies be binary-compatible across platforms (e.g. exact same System.IO.dll on Windows, Linux, and Mac)?
Our expectation is that the majority (estimating around 70%) of CoreFX assemblies will have no platform-specific code. These assemblies should be binary-compatible across platforms.
In some cases, the managed binary will be used across all platforms, but it'll come with its own native library that'll be compiled once per platform.
In a few dozen cases, the managed code itself will have differing implementations based on whether you're building for Windows, Linux, etc., and in such cases, the binary will not work from one platform to the next. Which binary gets used will be handled by the NuGet package delivering the libraries.
#### When should an existing platform-specific .NET API be deprecated or removed in favor of a new approach?
It's a case-by-case basis. In some cases, entire contracts that are platform-specific just won't be available on other platforms, as they don't make sense by their very nature (e.g. Microsoft.Win32.Registry.dll). In other cases, a contract will be available, but some members here and there that are platform-specific may throw PlatformNotSupportedException (e.g. Console.get_ForegroundColor on Unix). In general, though, we want to strive for having any APIs that exist on a platform (i.e. the contract is available) actually working on that platform.
#### When should partial classes be used to layer in platform-specific functionality?
Partial classes is the approach we're currently taking when the managed code needs to diverge based on underlying platform. There are a few cases where we've decided to go a different route, but even in some of those cases we may move back towards partial classes.
#### How should the platform-specific files be named (e.g. FileStream.Windows.cs? Win32FileStream.cs?)
When the whole type is for a particular platform, we've been using the prefix, e.g. PlatformFileStream.cs. When the file contains a partial class specialized for a particular platform, we've been using the *.Platform.cs suffix.
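A minimal sketch of the partial-class pattern (the file and type names here are illustrative, not taken from the repo):

```C#
// ElapsedTimer.cs -- platform-neutral surface area, compiled into every build.
public static partial class ElapsedTimer
{
    public static long GetTimestamp()
    {
        return GetTimestampCore();
    }
}

// ElapsedTimer.Windows.cs -- only included in the Windows build.
public static partial class ElapsedTimer
{
    private static long GetTimestampCore()
    {
        // A Windows-specific implementation would go here.
        return System.Environment.TickCount;
    }
}

// ElapsedTimer.Unix.cs would provide a different GetTimestampCore
// implementation and only be included in the Unix builds.
```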
#### When should define statements be used rather than including different source files in the build environment?
We're striving to avoid defines whenever possible, instead preferring to include just the source files that are relevant.

@@ -0,0 +1,18 @@
Developer Guide
===============
This guide provides instructions (mostly as links) on how to build the repo and implement improvements. It will expand over time.
Building the repository
=======================
The CoreFX repo can be built from a regular, non-admin command prompt. The build produces multiple managed binaries that make up the CoreFX libraries and the accompanying tests. The repo can be built for the following platforms, using the provided instructions.
| Chip | Windows | Linux | OS X |
| :---- | :-----: | :---: | :--: |
| x64 | &#x25CF;| &#x25D2;| &#x25D2; |
| x86 | &#x25EF;| &#x25EF;| &#x25EF;|
| ARM32 | &#x25EF; | &#x25EF;| &#x25EF; |
| | [Instructions](windows-instructions.md) | [Instructions](linux-instructions.md) | |
The CoreFX build and test suite is a work in progress, as are the [building and testing instructions](README.md). The .NET Core team and the community are improving Linux and OS X support on a daily basis and adding more tests for all platforms. See [CoreFX Issues](https://github.com/dotnet/corefx/issues) to find out about specific work items or report issues.

@@ -0,0 +1,313 @@
Framework Design Guidelines - Digest
====================================
This page is a distillation and a simplification of the most basic
guidelines described in detail in a book titled
[Framework Design Guidelines][FDG] by Krzysztof Cwalina and Brad Abrams.
Framework Design Guidelines were created in the early days of .NET Framework
development. They started as a small set of naming and design conventions but
have been enhanced, scrutinized, and refined to a point where they are generally
considered the canonical way to design frameworks at Microsoft. They carry the
experience and cumulative wisdom of thousands of developer hours over several
versions of the .NET Framework.
[FDG]: http://amazon.com/dp/0321545613
# General Design Principles
## Scenario Driven Design
Start the design process of your public API by defining the top scenarios for
each feature area. Write code you would like the end users to write when they
implement these scenarios using your API. Design your API based on the sample
code you wrote. For example, when designing an API to measure elapsed time, you may
write the following scenario code samples:
```CSharp
// scenario #1 : measure time elapsed
Stopwatch watch = Stopwatch.StartNew();
DoSomething();
Console.WriteLine(watch.Elapsed);
// scenario #2 : reuse stopwatch
Stopwatch watch = Stopwatch.StartNew();
DoSomething();
Console.WriteLine(watch.ElapsedMilliseconds);
watch.Reset();
watch.Start();
DoSomething();
Console.WriteLine(watch.Elapsed);
// scenario #3: ...
```
## Usability Studies
Test usability of your API. Choose developers who are not familiar with your API
and have them implement the main scenarios. Try to identify which parts of your
API are not intuitive.
## Self Documenting API
Developers using your API should be able to implement main scenarios without
reading the documentation. Help users to discover what types they need to use in
main scenarios and what the semantics of the main methods are by choosing
intuitive names for most used types and members. Talk about naming choices
during specification reviews.
## Understand Your Customer
Realize that the majority of your customers are not like you. You should design
the API for your customer, not for developers working in your close working
group, who unlike majority of your customers are experts in the technology you
are trying to expose.
# Naming Guidelines
Casing and naming guidelines apply only to public and protected identifiers, and
privately implemented interface members. Teams are free to choose their own
guidelines for internal and private identifiers.
&#10003; **DO** use PascalCasing (capitalize the first letter of each word) for
all identifiers except parameter names. For example, use `TextColor` rather than
`Textcolor` or `Text_Color`.
&#10003; **DO** use camelCasing (capitalize first letters of each word except
for the first word) for all member parameter names.
&#10003; **DO** prefix descriptive type parameter names with `T`.
```CSharp
public interface ISessionChannel<TSession>
where TSession : ISession
{
TSession Session { get; }
}
```
&#10003; **CONSIDER** using `T` as the type parameter name for types with one
single letter type parameter.
&#10003; **DO** use PascalCasing or camelCasing for any acronyms over two
characters long. For example, use `HtmlButton` rather than `HTMLButton`, but
`System.IO` instead of `System.Io`.
&#10007; **DO NOT** use acronyms that are not generally accepted in the field.
&#10003; **DO** use well-known acronyms only when absolutely necessary. For
example, use `UI` for User Interface and `Html` for Hyper-Text Markup Language.
&#10007; **DO NOT** use shortenings or contractions as part of identifier
names. For example, use `GetWindow` rather than `GetWin`.
&#10007; **DO NOT** use underscores, hyphens, or any other non-alphanumeric
characters.
&#10007; **DO NOT** use Hungarian notation.
&#10003; **DO** name types and properties with nouns or noun phrases.
&#10003; **DO** name methods and events with verbs or verb phrases. Always give
events names that have a concept of before and after using the present participle
and simple past tense. For example, an event that is raised before a `Form`
closes should be named `Closing`. An event raised after a `Form` is closed
should be named `Closed`.
&#10007; **DO NOT** use the `Before` or `After` prefixes to indicate pre and
post events.
&#10003; **DO** use the following prefixes:
* `I` for interfaces.
* `T` for generic type parameters (except single letter parameters).
&#10003; **DO** use the following postfixes:
* `Exception` for types inheriting from `System.Exception`.
* `Collection` for types implementing `IEnumerable`.
* `Dictionary` for types implementing `IDictionary` or `IDictionary<K,V>`.
* `EventArgs` for types inheriting from `System.EventArgs`.
* `EventHandler` for types inheriting from `System.Delegate`.
* `Attribute` for types inheriting from `System.Attribute`.
&#10007; **DO NOT** use the postfixes listed above for any other types.
&#10007; **DO NOT** postfix type names with `Flags` or `Enum`.
&#10003; **DO** use plural noun phrases for flag enums (enums with values that
support bitwise operations) and singular noun phrases for non-flag enums.
&#10003; **DO** use the following template for naming namespaces:
`<Company>.<Technology>[.<Feature>]`.
For example, `Microsoft.Office.ClipGallery`. Operating System components should
use `System` namespaces instead of the `<Company>` namespaces.
&#10007; **DO NOT** use organizational hierarchies as the basis for namespace
hierarchies. Namespaces should correspond to scenarios regardless of what teams
contribute APIs for those scenarios.
# General Design Guidelines
&#10003; **DO** use the most derived type for return values and the least
derived type for input parameters. For example take `IEnumerable` as an input
parameter but return `Collection<string>` as the return type.
&#10003; **DO** provide a clear API entry point for every scenario. Every feature
area should have preferably one, but sometimes more, types that are the starting
points for exploring a given technology. We call such types Aggregate Components.
Implementation of the large majority of scenarios in a given technology area
should start with one of the Aggregate Components.
&#10003; **DO** write sample code for your top scenarios. The first type used in
all these samples should be an Aggregate Component and the sample code should be
straightforward. If the code gets longer than several lines, you need to
redesign. Writing to an event log in Win32 API was around 100 lines of code.
Writing to .NET Framework EventLog takes one line of code.
&#10003; **DO** model higher level concepts (physical objects) rather than
system level tasks with Aggregate Components. For example `File`, `Directory`,
`Drive` are easier to understand than `Stream`, `Formatter`, `Comparer`.
&#10007; **DO NOT** require users of your APIs to instantiate multiple objects
in main scenarios. Simple tasks should be done with a single `new` statement.
&#10003; **DO** support the so-called "Create-Set-Call" programming style in all
Aggregate Components. It should be possible to instantiate every component with
the default constructor, set one or more properties, and call simple methods or
respond to events.
```CSharp
var applicationLog = new EventLog();
applicationLog.Source = "MySource";
applicationLog.WriteEntry(exception.Message);
```
&#10007; **DO NOT** require extensive initialization before Aggregate Components
can be used. If some initialization is necessary, the exception resulting from
not having the component initialized should clearly explain what needs to be
done.
&#10003; **DO** carefully choose names for your types, methods, and parameters.
Think hard about the first name people will try typing in the code editor when
they explore the feature area. Reserve and use this name for the Aggregate
Component. A common mistake is to use the "best" name for a base type. Run FxCop
on your libraries.
&#10003; **DO** ensure your library is CLS compliant. Apply `CLSCompliantAttribute`
to your assembly.
&#10003; **DO** prefer classes over interfaces.
&#10007; **DO NOT** seal types unless you have a strong reason to do it.
&#10007; **DO NOT** create mutable value types.
&#10007; **DO NOT** ship abstractions (interfaces or abstract classes) without
providing at least one concrete type implementing each abstraction. This helps
to validate the interface design.
&#10007; **DO NOT** ship interfaces without providing at least one API consuming
the interface (a method taking the interface as a parameter). This helps to
validate the interface design.
&#10007; **AVOID** public nested types.
&#10003; **DO** apply `FlagsAttribute` to flag enums.
&#10003; **DO** strongly prefer collections over arrays in public API.
&#10007; **DO NOT** use `ArrayList`, `List<T>`, `Hashtable`, or `Dictionary<K,V>`
in public APIs. Use `Collection<T>`, `ReadOnlyCollection<T>`,
`KeyedCollection<K,V>`, or `CollectionBase` subtypes instead. Note that the
generic collections are only supported in the Framework version 2.0 and above.
&#10007; **DO NOT** use error codes to report failures. Use Exceptions instead.
&#10007; **DO NOT** throw `Exception` or `SystemException`.
&#10007; **AVOID** catching the `Exception` base type.
&#10003; **DO** prefer throwing existing common general purpose exceptions like
`ArgumentNullException`, `ArgumentOutOfRangeException`,
`InvalidOperationException` instead of defining custom exceptions. Throw the
most specific exception possible.
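A minimal sketch of that guideline (the method shown is hypothetical, for illustration only):
```CSharp
public void Process(string input)
{
    // Prefer the existing, most specific exception type...
    if (input == null)
        throw new ArgumentNullException("input");
    if (input.Length == 0)
        throw new ArgumentException("input must not be empty.", "input");

    // ...rather than throwing Exception or a vague custom exception type.
}
```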
&#10003; **DO** ensure that exception messages are clear and actionable.
&#10003; **DO** use `EventHandler<T>` for events, instead of manually defining
event handler delegates.
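For example (the `Downloader` type and its event args are made up for this sketch):
```CSharp
public class DownloadCompletedEventArgs : EventArgs
{
    public DownloadCompletedEventArgs(string fileName) { FileName = fileName; }
    public string FileName { get; private set; }
}

public class Downloader
{
    // EventHandler<T> removes the need to define a custom delegate type.
    public event EventHandler<DownloadCompletedEventArgs> DownloadCompleted;

    protected virtual void OnDownloadCompleted(DownloadCompletedEventArgs e)
    {
        EventHandler<DownloadCompletedEventArgs> handler = DownloadCompleted;
        if (handler != null)
        {
            handler(this, e);
        }
    }
}
```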
&#10003; **DO** prefer event based APIs over delegate based APIs.
&#10003; **DO** prefer constructors over factory methods.
&#10007; **DO NOT** expose public fields. Use properties instead.
&#10003; **DO** prefer properties for concepts with logical backing store but
use methods in the following cases:
* The operation is a conversion (such as `Object.ToString()`)
* The operation is expensive (orders of magnitude slower than a field set would
be)
* Obtaining a property value using the Get accessor has an observable side
effect
* Calling the member twice in succession results in different results
* The member returns an array. Note: Members returning arrays should return
copies of an internal master array, not a reference to the internal array.
&#10003; **DO** allow properties to be set in any order. Properties should be
stateless with respect to other properties.
&#10007; **DO NOT** make members virtual unless you have a strong reason to do
it.
&#10007; **AVOID** finalizers.
&#10003; **DO** implement `IDisposable` on all types acquiring native resources
and those that provide finalizers.
&#10003; **DO** be consistent in the ordering and naming of method parameters.
It is common to have a set of overloaded methods with an increasing number of
parameters to allow the developer to specify a desired level of information.
&#10003; **DO** make sure all the related overloads have a consistent parameter
order (same parameter shows in the same place in the signature) and naming
pattern. The only method in such a group that should be virtual is the one that
has the most parameters and only when extensibility is needed.
```CSharp
public class Foo
{
private readonly string _defaultForA = "default value for a";
private readonly int _defaultForB = 42;
public void Bar()
{
Bar(_defaultForA, _defaultForB);
}
public void Bar(string a)
{
Bar(a, _defaultForB);
}
public void Bar(string a, int b)
{
// core implementation here
}
}
```
&#10007; **AVOID** `out` and `ref` parameters.
# Resources
## FxCop
FxCop is a code analysis tool that checks managed code assemblies for
conformance to the [Framework Design Guidelines][FDG].
<http://code.msdn.microsoft.com/codeanalysis>
## Presentations
* [Overview of the Framework Design Guidelines](http://blogs.msdn.com/kcwalina/archive/2007/03/29/1989896.aspx)
* [TechEd 2007 Presentation about framework engineering](http://blogs.msdn.com/kcwalina/archive/2008/01/08/FrameworkEngineering.aspx)

Binary file not shown (image, 75 KiB).

@@ -0,0 +1,160 @@
Interop Guidelines
==================
## Goals
We have the following goals related to interop code being used in CoreFX:
- Minimize code duplication for interop.
  - We should only define a given interop signature in a single place. This stuff is tricky, and we shouldn't be copy-and-pasting it.
- Minimize unnecessary IL in assemblies.
  - Interop signatures should only be compiled into the assemblies that actually consume them. Having extra signatures bloats assemblies and makes it more difficult to do static analysis over assemblies to understand what they actually use. It also leads to problems when such static verification is used as a gate, e.g. if a store verifies that only certain APIs are used by apps in the store.
- Keep interop code isolated and consolidated.
  - This is both for good hygiene and to help keep platform-specific code separated from platform-neutral code, which is important for maximizing reusable code above PAL layers.
## Approach
### Interop type
- All code related to interop signatures (DllImports, interop structs used in DllImports, constants that map to native values, etc.) should live in a partial, static, and internal “Interop” class in the root namespace, e.g.
```C#
internal static partial class Interop { … }
```
- Declarations shouldn't be in Interop directly, but rather within a partial, static, internal nested type named for a given library or set of libraries, e.g.
```C#
internal static partial class Interop
{
internal static partial class libc { … }
}
...
internal static partial class Interop
{
internal static partial class mincore { … }
}
```
- With few exceptions, the only methods that should be defined in these interop types are DllImports.
  - Exceptions are limited to times when most or every consumer of a particular DllImport will need to wrap its invocation in a helper, e.g. to provide additional marshaling support, to hide thread-safety issues in the underlying OS implementation, to do any required manipulation of safe handles, etc. In such cases, the DllImport should be private whenever possible rather than internal, with the helper code exposed to consumers rather than having the DllImport exposed directly.
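A sketch of that shape (the entry point and helper shown are illustrative; they are not taken from an actual CoreFX file):
```C#
using System.Runtime.InteropServices;

internal static partial class Interop
{
    internal static partial class mincore
    {
        // Keep the raw P/Invoke private...
        [DllImport("api-ms-win-core-sysinfo-l1-1-0.dll", EntryPoint = "GetTickCount64")]
        private static extern ulong GetTickCount64Native();

        // ...and expose an internal helper that hides any extra marshaling,
        // handle manipulation, or thread-safety concerns from consumers.
        internal static ulong GetTickCount64()
        {
            return GetTickCount64Native();
        }
    }
}
```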
### File organization
- The Interop partial class definitions should live in Interop.*.cs files. These Interop.*.cs files should all live under Common rather than within a given assembly's folder.
  - The only exception to this should be when an assembly P/Invokes to its own native library that isn't available to or consumed by anyone else, e.g. System.IO.Compression P/Invoking to clrcompression.dll. In such cases, System.IO.Compression should have its own Interop folder which follows a similar scheme as outlined in this proposal, but just for these private P/Invokes.
- Under Common\src\Interop, we'll have a folder for each target platform, and within each platform, for each library from which functionality is being consumed. The Interop.*.cs files will live within those library folders, e.g.
```
\Common\src\Interop
\Windows
\mincore
... interop files
\Unix
\libc
... interop files
\Linux
\libc
... interop files
```
As shown above, platforms may be additive, in that an assembly may use functionality from multiple folders, e.g. System.IO.FileSystem's Linux build will use functionality both from Unix (common across all Unix systems) and from Linux (specific to Linux and not available across non-Linux Unix systems).
- Interop.*.cs files are created in a way such that every assembly consuming the file will need every DllImport it contains.
  - If multiple related DllImports will all be needed by every consumer, they may be declared in the same file, named for the functionality grouping, e.g. Interop.IOErrors.cs.
  - Otherwise, in the limit (and the expected case for most situations) each Interop.*.cs file will contain a single DllImport and associated interop types (e.g. the structs used with that signature) and helper wrappers, e.g. Interop.strerror.cs.
```
\Common\src\Interop
\Unix
\libc
\Interop.strerror.cs
\Windows
\mincore
\Interop.OutputDebugString.cs
```
- If structs/constants will be used on their own without an associated DllImport, or if they may be used with multiple DllImports not in the same file, they should be declared in a separate file.
- In the case of multiple overloads of the same DllImport (e.g. some overloads taking a SafeHandle and others taking an IntPtr, or overloads taking different kinds of SafeHandles), if they can't all be declared in the same file (because they won't all be consumed by all consumers), the file should be qualified with the key differentiator, e.g.
```
\Common\src\Interop
\Windows
\mincore
\Interop.DuplicateHandle_SafeTokenHandle.cs
\Interop.DuplicateHandle_IntPtr.cs
```
- The library names used per platform are stored as constants in a private Libraries class nested within the Interop class, defined in a per-platform file named Interop.Libraries.cs.  These constants are then used for all DllImports to that library, rather than duplicating the string each time, e.g.
```C#
internal static partial class Interop // contents of Common\src\Interop\Windows\Interop.Libraries.cs
{
private static class Libraries
{
internal const string Kernel32 = "kernel32.dll";
internal const string Localization = "api-ms-win-core-localization-l1-2-0.dll";
internal const string Handle = "api-ms-win-core-handle-l1-1-0.dll";
internal const string ProcessThreads = "api-ms-win-core-processthreads-l1-1-0.dll";
internal const string File = "api-ms-win-core-file-l1-1-0.dll";
internal const string NamedPipe = "api-ms-win-core-namedpipe-l1-1-0.dll";
internal const string IO = "api-ms-win-core-io-l1-1-0.dll";
...
}
}
```
(Note that this will likely result in some extra constants being defined in each assembly that uses interop, which slightly violates one of the goals above, but the impact is minimal.)
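A DllImport in one of the Interop.*.cs files can then refer to those constants instead of repeating the DLL name. The following is only a sketch; OutputDebugString is used purely for illustration, and hosting it via the Kernel32 constant here is an assumption:
```C#
using System.Runtime.InteropServices;

internal static partial class Interop // e.g. paired with the Interop.Libraries.cs file above
{
    internal static partial class mincore
    {
        // The library name comes from the shared Libraries constants,
        // rather than duplicating the string at every DllImport site.
        [DllImport(Libraries.Kernel32, CharSet = CharSet.Unicode, ExactSpelling = true)]
        internal static extern void OutputDebugStringW(string message);
    }
}
```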
 
- .csproj project files then include the interop code they need, e.g.
```XML
<ItemGroup Condition=" '$(TargetsUnix)' == 'true' ">
<Compile Include="Interop\Unix\Interop.Libraries.cs" />
<Compile Include="Interop\Unix\libc\Interop.strerror.cs" />
<Compile Include="Interop\Unix\libc\Interop.getenv.cs" />
<Compile Include="Interop\Unix\libc\Interop.getenv.cs" />
<Compile Include="Interop\Unix\libc\Interop.open64.cs" />
<Compile Include="Interop\Unix\libc\Interop.close.cs" />
<Compile Include="Interop\Unix\libc\Interop.snprintf.cs" />
...
</ItemGroup>
```
### Build System
When building CoreFx, we use the "OSGroup" property to control what target platform we are building for. The valid values for this property are Windows_NT (which is the default value from MSBuild when running on Windows), Linux and OSX.
The build system sets a few MSBuild properties, depending on the OSGroup setting:
* TargetsWindows
* TargetsLinux
* TargetsOSX
* TargetsUnix
TargetsUnix is true for both OSX and Linux builds and can be used to include code shared by both platforms (e.g. code written against a POSIX API that is present on both).
You should not test the value of the OSGroup property directly; instead, use one of the properties above.
#### Project Files
Whenever possible, a single .csproj should be used per assembly, spanning all target platforms, e.g. System.Console.csproj includes conditional entries for when targeting Windows vs when targeting Linux. A property can be passed to msbuild to control which flavor is built, e.g. msbuild /p:OSGroup=OSX System.Console.csproj.
### Constants
- Wherever possible, constants should be defined as "const".  Only if the data type doesn't support this (e.g. IntPtr) should they instead be static readonly fields.
- Related constants should be grouped under a partial, static, internal type, e.g. for error codes they'd be grouped under an Errors type:
```C#
internal static partial class Interop
{
internal static partial class libc
{
internal static partial class Errors
{
internal const int ENOENT = 2;
internal const int EINTR = 4;
internal const int EWOULDBLOCK = 11;
internal const int EACCES = 13;
internal const int EEXIST = 17;
internal const int EXDEV = 18;
internal const int EISDIR = 21;
internal const int EINVAL = 22;
internal const int EFBIG = 27;
internal const int ENAMETOOLONG = 36;
internal const int ECANCELED = 125;
...
}
}
}
```
Using enums instead of partial, static classes can lead to needing lots of casts at call sites and can cause problems if such a type needs to be split across multiple files (enums can't currently be partial).  However, enums can be valuable in making it clear in a DllImport signature what values are permissible. Enums may be used in limited circumstances where these aren't concerns: the full set of values can be represented in the enum, and the interop signature can be defined to use the enum type rather than the underlying integral type.
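As a sketch of such a case (the use of libc's flock here is purely illustrative, and the grouping name is an assumption), the full set of operation values fits in the enum and the signature uses the enum type directly:
```C#
using System;
using System.Runtime.InteropServices;

internal static partial class Interop
{
    internal static partial class libc
    {
        // The complete set of flock(2) operation values can be represented, so an enum works here.
        [Flags]
        internal enum LockOperations : int
        {
            LOCK_SH = 1,  // shared lock
            LOCK_EX = 2,  // exclusive lock
            LOCK_NB = 4,  // non-blocking
            LOCK_UN = 8   // unlock
        }

        // Using the enum in the signature makes the permissible values obvious at call sites.
        [DllImport("libc", SetLastError = true)]
        internal static extern int flock(int fd, LockOperations operation);
    }
}
```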
## Naming
- Interop signatures / structs / constants should be defined using the same name / capitalization / etc. that's used in the corresponding native code.
- We should not rename any of these based on managed coding guidelines.  The only exception to this is for the constant grouping type, which should be named with the most discoverable name possible; if that name is a concept (e.g. Errors), it can be named using managed naming guidelines.

View file

@ -0,0 +1,42 @@
Issue Guide
===========
This page outlines how the CoreFx team thinks about and handles issues. For us, issues on GitHub represent actionable work that should be done at some future point. It may be as simple as a small product or test bug or as large as the work tracking the design of a new feature. However, it should be work that falls under the charter of CoreFx, which is a collection of foundational libraries that make up the .NET Core development stack. We will keep issues open even if the CoreFx team internally has no plans to address them in an upcoming release, as long as we consider the issue to fall under our purview.
### When we close issues
As noted above, we don't close issues just because we don't plan to address them in an upcoming release. So why do we close issues? There are a few major reasons:
1. Issues unrelated to CoreFx. When possible, we'll try to find a better home for the issue and open it there on your behalf.
2. Cross cutting work better suited for another team. Sometimes the line between the framework, languages and runtime blurs. For some issues, we may feel that the work is better suited for the runtime team, language team or other partner. In these cases, we'll close the issue and open it with the partner team. If they end up not deciding to take on the issue, we can reconsider it here.
3. Nebulous and large open issues. Large open issues are sometimes better suited for [User Voice](http://visualstudio.uservoice.com/forums/121579-visual-studio/category/31481--net), especially when the work will cross the boundaries of the framework, language and runtime. A good example of this is the SIMD support we recently added to CoreFx. This started as a User Voice request, and eventually turned into work for both the core libraries and runtime.
Sometimes after debate, we'll decide an issue isn't a good fit for CoreFx. In that case, we'll also close it. Because of this, we ask that you don't start working on an issue until it's tagged with "up for grabs" or "feature approved". Both you and the team will be unhappy if you spend time and effort working on a change we'll ultimately be unable to take. We try to avoid that.
### Labels
We use GitHub labels on our issues in order to classify them. We have the following categories per issue:
* **Area**: These labels call out the assembly or assemblies the issue applies to. In addition to tags per assembly, we have a few other tags: Infrastructure, for issues that relate to our build or test infrastructure, and Meta for issues that deal with the repository itself, the direction of the .NET Core Platform, our processes, etc.
* **Type**: These labels classify the type of issue. We use the following types:
* [api addition](https://github.com/dotnet/corefx/labels/api%20addition): Issues which would add APIs to an assembly.
* [bug](https://github.com/dotnet/corefx/labels/bug): Issues for bugs in an assembly.
* [documentation](https://github.com/dotnet/corefx/labels/documentation): Issues relating to documentation (e.g. incorrect documentation, enhancement requests)
* [enhancement](https://github.com/dotnet/corefx/labels/enhancement): Issues related to an assembly that improve it, but do not add new APIs (e.g performance improvements, code cleanup)
* [test bug](https://github.com/dotnet/corefx/labels/test%20bug): Issues for bugs in the tests for a specific assembly.
* **Ownership**: These labels are used to specify who owns a specific issue. Issues without an ownership tag are still considered "up for discussion" and haven't been approved yet. We have the following different types of ownership:
* [up for grabs](https://github.com/dotnet/corefx/labels/up%20for%20grabs): Small sections of work which we believe are well scoped. These sorts of issues are a good place to start if you are new. Anyone is free to work on these issues.
* [feature approved](https://github.com/dotnet/corefx/labels/feature%20approved): Larger scale issues. Like up for grabs, anyone is free to work on these issues, but they may be trickier or require more work.
* [grabbed by community](https://github.com/dotnet/corefx/labels/grabbed%20by%20community): Someone outside the CoreFx team has assumed responsibility for addressing this issue and is working on a fix. The comments for the issue will call out who is working on it. You shouldn't try to address the issue without coordinating with the owner.
* [grabbed by assignee](https://github.com/dotnet/corefx/labels/grabbed%20by%20assignee): Like grabbed by community, except the person the issue is assigned to is making a fix. This will be someone on the CoreFx team.
* **Project Management**: These labels are used to facilitate the team's [Kanban Board](https://huboard.com/dotnet/corefx). Labels indicate the current status and swim lane.
* [0 - Backlog](https://github.com/dotnet/corefx/issues?q=is%3Aopen+is%3Aissue+label%3A%220+-+Backlog%22): Tasks that are not yet ready for development or are not yet prioritized for the current development cycle.
* [1 - Up Next](https://github.com/dotnet/corefx/issues?q=is%3Aopen+is%3Aissue+label%3A%221+-+Up+Next%22): Tasks that are ready for development and prioritized above the rest of the backlog.
* [2 - In Progress](https://github.com/dotnet/corefx/issues?q=is%3Aopen+is%3Aissue+label%3A%222+-+In+Progress%22): Tasks that are under active development.
* [3 - Done](https://github.com/dotnet/corefx/issues?q=is%3Aopen+is%3Aissue+label%3A%223+-+Done%22): Tasks that are finished. There should be no open issues in the Done stage.
* [Community](https://github.com/dotnet/corefx/issues?q=is%3Aopen+is%3Aissue+label%3ACommunity): Community Engagement & Open Development Swim Lane. :swimmer:
  * [Port to GitHub](https://github.com/dotnet/corefx/issues?q=is%3Aopen+is%3Aissue+label%3A%22Port+to+GitHub%22): Swim lane :swimmer: tracking the work remaining to open source the .NET Core Framework
* [Infrastructure](https://github.com/dotnet/corefx/issues?q=is%3Aopen+is%3Aissue+label%3AInfrastructure): Swim lane :swimmer: tracking OSS Engineering and Infrastructure
* [X-Plat](https://github.com/dotnet/corefx/issues?q=is%3Aopen+is%3Aissue+label%3AX-Plat): Swim lane :swimmer: for Cross Platform Support
In addition to the above, we have a handful of other labels we use to help classify our issues. Some of these tag cross-cutting concerns (e.g. cross-platform, performance, serialization impact), whereas others are used to help us track additional work needed before closing an issue (e.g. needs API review). Finally, we have the "needs more info" label. We use this label to mark issues where we need more information in order to proceed. Usually this is because we can't reproduce a reported bug. We'll close these issues after a while if we haven't gotten actionable information, but we welcome folks who have acquired more information to reopen the issue.
### Assignee
We assign each issue to a CoreFx team member. In most cases, the assignee will not be the one who ultimately fixes the issue (that only happens in the case where the issue is tagged "grabbed by assignee"). The purpose of the assignee is to act as a point of contact between the CoreFx team and the community for the issue and make sure it's driven to resolution. If you're working on an issue and get stuck, please reach out to the assignee (just @-mention them) and they will work to help you out.

View file

@ -0,0 +1,78 @@
Building CoreFX on Linux
========================
CoreFx can be built on top of current [Mono CI builds](#installing-mono-packages) or a direct [build/install of Mono](http://www.mono-project.com/docs/compiling-mono/). It builds using MSBuild and Roslyn and requires changes that have not yet made it to official released builds.
After preparing Mono, clone if you haven't already, and run the build script.
```
git clone https://github.com/dotnet/corefx.git
cd corefx
./build.sh
```
>These instructions have been validated on:
* Ubuntu 15.04, 14.04, and 12.04
* Fedora 22
* MacOS 10.10 (Yosemite)
# Installing Mono Packages
_Mono installation instructions are taken from ["Install Mono"](http://www.mono-project.com/docs/getting-started/install/) and ["Continuous Integration Packages"](http://www.mono-project.com/docs/getting-started/install/linux/ci-packages/)._
_**Note:** CI packages are not produced for Mac. As CoreFx needs current bits you must build Mono [yourself](http://www.mono-project.com/docs/compiling-mono/)._
### Add Mono key and package sources
##### Debian/Ubuntu (and other derivatives)
```
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF
echo "deb http://download.mono-project.com/repo/debian wheezy main" | sudo tee /etc/apt/sources.list.d/mono-xamarin.list
echo "deb http://jenkins.mono-project.com/repo/debian sid main" | sudo tee /etc/apt/sources.list.d/mono-jenkins.list
sudo apt-get update
```
##### Fedora/CentOS (and other derivatives)
```
sudo rpm --import "http://keyserver.ubuntu.com/pks/lookup?op=get&search=0x3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF"
sudo yum-config-manager --add-repo http://download.mono-project.com/repo/centos/
sudo yum-config-manager --add-repo http://jenkins.mono-project.com/repo/centos/
sudo yum upgrade
```
### Install CI build and reference assemblies
Install a recent (Continuous Integration) Mono build and the PCL reference assemblies. (_This instruction installs the latest. To see available Mono builds, use `apt-cache search mono-snapshot` (Ubuntu) or `yum search mono-snapshot` (Fedora)_)
##### Debian/Ubuntu (and other derivatives)
```
sudo apt-get install mono-snapshot-latest referenceassemblies-pcl
```
##### Fedora/CentOS (and other derivatives)
```
sudo yum install mono-snapshot-latest referenceassemblies-pcl
```
### Switch to the mono snapshot build
```
. mono-snapshot mono
```
# Known Issues
If you see errors along the lines of `SendFailure (Error writing headers)` you may need to import trusted root certificates:
```
mozroots --import --sync
```
PCL reference assemblies and targets are not installed by default. They are also not available for snapshot builds, and must be copied, linked in, or referenced via the ReferenceAssemblyRoot override. The build script uses an MSBuild override; the following shows how to link the PCL folder into the right place:
```
sudo ln -s /usr/lib/mono/xbuild-frameworks/.NETPortable/ $MONO_PREFIX/lib/mono/xbuild-frameworks/
```
If you are seeing errors like the following, you likely either have not installed the package or haven't linked it properly.
```
warning MSB3252: The currently targeted framework ".NETPortable,Version=v4.5,Profile=Profile7" does not include the referenced assembly
MSB3644: The reference assemblies for framework ".NETPortable,Version=v4.5,Profile=Profile7" were not found.
```
Mono may intermittently fail when compiling. Retry if you see segfaults or other unexpected issues.
PDBs aren't generated by Roslyn on Unix. https://github.com/dotnet/roslyn/issues/2449
Test runs are currently disabled when building on Unix. https://github.com/dotnet/corefx/issues/1776
System.Diagnostics.FileVersionInfo.Tests.csproj does not build on Unix. https://github.com/dotnet/corefx/issues/1610
System.Diagnostics.Debug.Tests does not build on Unix. https://github.com/dotnet/corefx/issues/1609
Mono fails when trying to get custom attributes on CoreFx assemblies. https://bugzilla.xamarin.com/show_bug.cgi?id=29679

View file

@ -0,0 +1,18 @@
Open Source Signing
===================
For reasons listed over on [Strong Naming](strong-name-signing.md), all .NET Core assemblies are strong-named.
To enable you to build assemblies that have a matching identity to what Microsoft would build, we leverage a new signing mechanism called _Open Source Signing (OSS)_. This lets you clone the dotnet/corefx repository, build and then drop the resulting assembly in your application with zero changes to consuming libraries. By default, all .NET Core projects build using OSS.
OSS is very similar to [delay signing](http://msdn.microsoft.com/en-us/library/t07a3dye(v=vs.110).aspx) but without the need to add skip verification entries to your machine. This allows you to load the assembly in most contexts, or more precisely in any context that doesn't require validating the strong-name signature.
When running on the full .NET Framework, we only support using OSS assemblies for debugging and testing purposes. Microsoft does not guarantee that you can successfully load OSS assemblies in all scenarios that are required for production use. For a list of known scenarios where OSS does not work when running on the .NET Framework, see below.
However, in the context of ASP.NET 5 on .NET Core, or .NET Native, Microsoft supports using OSS assemblies for production use. Note, however, that while the ability to load OSS binaries is supported on these platforms, the API and contents of the assembly itself are unsupported (since the assembly is privately built).
Known issues when debugging and testing OSS assemblies on .NET Framework:
- You will not be able to install the assembly to the [Global Assembly Cache (GAC)](https://msdn.microsoft.com/en-us/library/yf1d93sz.aspx)
- You will not be able to load the assembly in an AppDomain where shadow copying is turned on.
- You will not be able to load the assembly in a partially trusted AppDomain

View file

@ -0,0 +1,16 @@
Managed Code Performance Guidelines
===================================
Different applications have different needs when it comes to performance. For libraries that may be used in any of them and potentially on critical paths, however, it is of the utmost importance that code is as efficient as possible. The code in CoreFX should strive to be high-performing, including minimizing the number and size of allocations, minimizing the number of branches involved in code, and overall minimizing the amount of work that must be done to perform any given operation.
Much has been written about writing high-performance code in C#. This page provides links to some of that material and will expand over time as additional resources are found and identified as being relevant and useful.
You can read [CoreCLR Performance Requirements](https://github.com/dotnet/coreclr/blob/master/Documentation/performance-requirements.md) to learn more.
# Memory Management
* **Avoiding delegate and closure allocations for lambdas**. The code generated by the C# compiler for anonymous methods and lambdas may involve one or more allocations. Certain patterns in APIs can help to avoid these allocations. See [Know Thine Implicit Allocations](http://blogs.msdn.com/b/pfxteam/archive/2012/02/03/10263921.aspx) for more information.
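As a small sketch of the kind of pattern this refers to (the cache and method names here are hypothetical), a lambda that captures a local forces a closure and delegate allocation on every call, while a non-capturing lambda lets the compiler cache a single delegate instance:
```C#
using System.Collections.Concurrent;

internal static class NameLengthCache
{
    private static readonly ConcurrentDictionary<string, int> s_lengths =
        new ConcurrentDictionary<string, int>();

    // Captures 'defaultLength', so each call allocates a closure plus a delegate.
    internal static int GetOrAddCapturing(string name, int defaultLength)
    {
        return s_lengths.GetOrAdd(name, key => defaultLength);
    }

    // No captured locals: the compiler emits a single cached delegate, so there is no per-call allocation.
    internal static int GetOrAddNonCapturing(string name)
    {
        return s_lengths.GetOrAdd(name, key => key.Length);
    }
}
```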
# Asynchrony
* **Best practices for async/await performance**. The C# async/await feature makes it easy to write asynchronous code in the same manner that you'd write synchronous code, but it comes with its own set of costs, and understanding these costs can help you to avoid them. This [presentation from Build](http://channel9.msdn.com/Events/BUILD/BUILD2011/TOOL-829T) and this [MSDN Magazine article](http://msdn.microsoft.com/en-us/magazine/hh456402.aspx) outline some best practices.
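One technique those resources highlight is caching completed tasks for commonly returned values, so that asynchronous methods that finish synchronously don't allocate a fresh Task on every call. A minimal sketch (the type and member names are hypothetical):
```C#
using System.Threading.Tasks;

internal static class CompletedTasks
{
    // Allocate each possible result once and return the same instances thereafter.
    private static readonly Task<bool> s_true = Task.FromResult(true);
    private static readonly Task<bool> s_false = Task.FromResult(false);

    internal static Task<bool> FromResult(bool value)
    {
        return value ? s_true : s_false;
    }
}
```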

View file

@ -0,0 +1,32 @@
Repo Organization
=================
Tests for a project are kept under the `tests` folder, which is a peer of the `src` folder. If you need to have multiple test projects for a component, structure them in sub folders.
For example, lay things out like this:
```
tests\
test_project_1\
test_1.cs
test_2.cs
test_project_1.csproj
test_project_2\
test_1.cs
test_2.cs
test_project_2.csproj
```
Not like this:
```
tests\
test_project_1.csproj
test_project_2.csproj
test_folder_1\
test_1.cs
test_2.cs
test_folder_2\
test_1.cs
test_2.cs
```

View file

@ -0,0 +1,24 @@
Strong Name Signing
===================
All .NET Core assemblies are [strong-named](http://msdn.microsoft.com/en-us/library/wd40t7ad.aspx). We do this for two reasons:
1. _Compatibility_. We want to maintain type identity with previous versions of our assemblies that have shipped across various versions of our platforms. Removing a strong-name from an assembly is a breaking change, and would break the ability to consume and run libraries built against the previous identities.
2. _Serviceability_. When running on the .NET Framework, some .NET Core assemblies ship locally ("app-local") with the application; this is in contrast to other framework assemblies, which are placed in the [GAC](http://msdn.microsoft.com/en-us/library/yf1d93sz.aspx). To be able to service these libraries for critical security updates, we make use of the [app-local servicing](http://blogs.msdn.com/b/dotnet/archive/2014/01/22/net-4-5-1-supports-microsoft-security-updates-for-net-nuget-libraries.aspx) feature, which requires that assemblies have strong-names.
## FAQ
### 1. Microsoft strong-names their assemblies, should I?
For the most part, applications and libraries do not need strong-names. Strong-names are left over from previous eras of .NET where [sandboxing](http://en.wikipedia.org/wiki/Sandbox_(computer_security)) needed to differentiate between trusted and untrusted code. However, in recent years, sandboxing via AppDomains, especially to [isolate ASP.NET web applications](http://support.microsoft.com/kb/2698981), is no longer guaranteed and is not recommended.
However, strong-names are still required in some rare situations, most of which are called out on this page: [Strong-Named Assemblies](http://msdn.microsoft.com/en-us/library/wd40t7ad.aspx).
### 2. I really, _really_ need to strong-name, what kinds of issues will I run into?
There are three major problems that developers run into after strong naming their assemblies:
1. _Binding Policy_. When developers talk about strong-names, they are usually conflating them with the strict binding policy of the .NET Framework that kicks in _when_ you strong-name. This binding policy is problematic because it forces, by default, an exact match between reference and version, and requires developers to author complex [binding redirects](http://msdn.microsoft.com/en-us/library/eftw1fys.aspx) when they don't. In recent versions of Visual Studio, however, we've added [Automatic Binding Redirection](http://msdn.microsoft.com/en-us/library/2fc472t2.aspx) as an attempt to reduce the pain of this policy on developers. On top of this, on all newer platforms, including _Silverlight_, _WinRT-based platforms_ (Phone and Store), _.NET Native_, and _ASP.NET 5_, this policy has been loosened, allowing later versions of an assembly to satisfy earlier references, thereby completely removing the need to ever write binding redirects on those platforms.
2. _Virality_. Once you've strong-named an assembly, you can only statically reference other strong-named assemblies.
3. _No drop-in replacement_. This is a problem for open source libraries where the strong-name private key is not checked into the repository. This means that developers are unable to build their own version of the library and then use it as a drop-in replacement without recompiling _all_ consuming libraries up the stack to pick up the new identity. This is extremely problematic for libraries, such as Json.NET, which have large incoming dependencies. Firstly, we would recommend that these open source projects check in their private key (remember, [strong-names are used for identity, and not for security](http://msdn.microsoft.com/en-us/library/wd40t7ad.aspx)). Failing that, however, we've introduced a new concept called [Open Source Signing](oss-signing.md) that enables developers to build drop-in replacements without needing access to the strong-name private key. This is the mechanism that .NET Core libraries use by default.

View file

@ -0,0 +1,141 @@
Building CoreFX on Windows
==========================
You can build .NET Core either via the command line or by using Visual Studio.
We currently only support building and running on Windows. Other platforms will
come later.
## Required Software
Visual Studio 2013 (Update 3 or later) or Visual Studio 2015 (Preview or later) is required.
The following free downloads are compatible:
* [Visual Studio Community 2013 (with Update 3)](http://www.visualstudio.com/en-us/visual-studio-community-vs.aspx)
* [Visual Studio Enterprise 2015 RC](http://www.visualstudio.com/en-us/downloads/visual-studio-2015-downloads-vs)
## Building From the Command Line
Open a [Visual Studio Command Prompt](http://msdn.microsoft.com/en-us/library/ms229859(v=vs.110).aspx).
From the root of the repository, type `build`. This will build everything and run
the core tests for the project. Visual Studio Solution (.sln) files exist for
related groups of libraries. These can be loaded to build, debug and test inside
the Visual Studio IDE.
[Building On Linux](linux-instructions.md)
## Tests
We use the OSS testing framework [xunit](http://xunit.github.io/)
### Running tests on the command line
By default, the core tests are run as part of the build. Running the tests from
the command line is as simple as invoking `build.cmd` on Windows, and `run-test.sh` on Linux and OSX.
You can also run the test for an individual project by building just one test
project, e.g.:
```
cd src\System.Collections.Immutable\tests
msbuild /t:BuildAndTest (or /t:Test to just run the tests if the binaries are already built)
```
It is possible to pass parameters to the underlying xunit runner via the `XunitOptions` parameter, e.g.:
````
msbuild /t:Test "/p:XunitOptions=-class Test.ClassUnderTests -notrait category=outerloop"
````
In some test directories there may be multiple test projects or directories, so you may need to specify the specific test project to get it to build and run the tests.
Tests participate in the incremental build. This means that if tests have already been run, and the inputs to the incremental build have not changed, rerunning the tests target will not execute the test runner again. To force re-executing tests in this situation, use `msbuild /t:clean;build;test`.
The tests can also be filtered based on xunit trait attributes defined in the XunitTraitDiscoverers project. These attributes are specified on the test method. The available xunit attributes are:
_**OuterLoop:**_
This attribute returns the 'outerloop' category, so to run outerloop tests use one of the following command lines,
```
xunit.console.netcore.exe *.dll -trait category=outerloop
build.cmd *.csproj /p:RunTestsWithCategories=OuterLoop
```
_**PlatformSpecific(Xunit.PlatformID platforms):**_
Use this attribute on test methods to specify that the test may only be run on the specified platforms. This attribute returns the following categories based on platform
- nonwindowstests: for tests that don't run on Windows
- nonlinuxtests: for tests that don't run on Linux
- nonosxtests: for tests that don't run on OSX
To run Linux-specific tests on a Linux machine, use the following command line,
```
xunit.console.netcore.exe *.dll -notrait category=nonlinuxtests
```
_**ActiveIssue(int issue, Xunit.PlatformID platforms):**_
Use this attribute on test methods to skip failing tests, but only on the specified platforms; if no platform is specified, the test is skipped on all platforms. This attribute returns the 'failing' category, so to run all acceptable tests on Linux that are not failing, use the following command line,
```
xunit.console.netcore.exe *.dll -notrait category=failing -notrait category=nonlinuxtests
```
And to run all acceptable tests on Linux that are failing,
```
xunit.console.netcore.exe *.dll -trait category=failing -notrait category=nonlinuxtests
```
_**A few common examples with the above attributes:**_
- Run all tests acceptable on Windows
```
xunit.console.netcore.exe *.dll -notrait category=nonwindowstests
```
- Run all inner loop tests acceptable on Linux
```
xunit.console.netcore.exe *.dll -notrait category=nonlinuxtests -notrait category=OuterLoop
```
- Run all outer loop tests acceptable on OSX that are not currently associated with active issues
```
xunit.console.netcore.exe *.dll -notrait category=nonosxtests -trait category=OuterLoop -notrait category=failing
```
- Run all tests acceptable on Linux that are currently associated with active issues
```
xunit.console.netcore.exe *.dll -notrait category=nonlinuxtests -trait category=failing
```
All the required DLLs to run a test project can be found in the bin\\tests\\{Flavor}\\{Project}.Tests\\aspnetcore50\\ folder, which is created when the test project is built.
To prevent an entire test project from being run on a specific platform (for example, to skip running registry tests on Linux and Mac), use the `<UnsupportedPlatforms>` MSBuild property in the .csproj. Valid platform values are
```
<UnsupportedPlatforms>Windows_NT;Linux;OSX</UnsupportedPlatforms>
```
### Running tests from Visual Studio
1. Open solution of interest
2. Right click test project and select 'Set as startup project'
3. Ctrl+F5 (Run)
### Debugging tests in Visual Studio
1. Install VS 2015 Preview or later including Web Developer Tools
2. Open solution of interest in VS 2015
3. Right click test project and select 'Set as startup project'
4. Set breakpoint appropriately
5. F5 (Debug)
### Code Coverage
Code coverage is built into the corefx build system. It utilizes OpenCover for generating coverage data and ReportGenerator for generating reports about that data. To run:
```
// Run full coverage
build.cmd /p:Coverage=true
// To run a single project with code coverage enabled pass the /p:Coverage=true property
cd src\System.Collections.Immutable\tests
msbuild /t:BuildAndTest /p:Coverage=true
```
If coverage succeeds, the code coverage report will be generated automatically and placed in the bin\tests\coverage directory. You can view the full report by opening index.htm
Code coverage reports from the continuous integration system are available from the links on the front page of the corefx repo.
### Notes
* Running tests using the VS Test Explorer does not currently work after we switched to running on CoreCLR. [We will be working on enabling full VS test integration](https://github.com/dotnet/corefx/issues/1318) but we don't have an ETA yet. In the meantime, use the steps above to launch/debug the tests using the console runner.
* VS 2015 is required to debug tests running on CoreCLR as the CoreCLR
debug engine is a VS 2015 component.
* If the Xamarin PCL profiles are installed, the build will fail due to [issue #449](https://github.com/dotnet/corefx/issues/449). A possible workaround is listed [in the issue](https://github.com/dotnet/corefx/issues/449#issuecomment-95117040) itself.