Commit 2f234e4

jignparm, pranavsharma
authored and committed
Minor fixes to nuget README.md file (microsoft#49)
1 parent 564907f commit 2f234e4

1 file changed: +6 -6 lines changed

docs/CSharp_API.md

@@ -2,14 +2,14 @@
 The ONNX runtime provides a C# .Net binding for running inference on ONNX models in any of the .Net standard platforms. The API is .Net standard 1.1 compliant for maximum portability. This document describes the API.
 
 ## NuGet Package
-There is a NuGet package Microsoft.ML.OnnxRuntime available for .Net consumers, which includes the prebuilt binaries for ONNX runtime. The API is portable across all platforms and architectures supported by the .Net standard, although currently the NuGet package contains the prebuilt binaries for Windows 10 platform on x64 CPUs only.
+The Microsoft.ML.OnnxRuntime NuGet package includes the precompiled binaries for ONNX runtime, with libraries for the Windows 10 platform and x64 CPUs. The APIs conform to .Net Standard 1.1.
 
 ## Getting Started
-Here is simple tutorial for getting started with running inference on an existing ONNX model for a given input data (a.k.a query). Say the model is trained using any of the well-known training frameworks and exported as an ONNX model into a file named `model.onnx`. The runtime incarnation of a model is an `InferenceSession` object. You simply construct an `InferenceSession` object using the model file as parameter --
+Here is a simple tutorial for getting started with running inference on an existing ONNX model for given input data. The model is typically trained using any of the well-known training frameworks and exported into the ONNX format. To start scoring with the model, open a session using the `InferenceSession` class, passing in the file path to the model as a parameter.
 
     var session = new InferenceSession("model.onnx");
 
-Once a session is created, you can run queries on the session using your input data, using the `Run` method of the `InferenceSession`. Both input and output of `Run` method are represented as collections of .Net `Tensor` objects (as defined in [System.Numerics.Tensor](https://www.nuget.org/packages/System.Numerics.Tensors)) -
+Once a session is created, you can execute queries using the `Run` method of the `InferenceSession` object. Currently, only `Tensor` inputs and outputs are supported. The results of the `Run` method are represented as a collection of .Net `Tensor` objects (as defined in [System.Numerics.Tensor](https://www.nuget.org/packages/System.Numerics.Tensors)).
 
     Tensor<float> t1, t2; // let's say data is fed into the Tensor objects
     var inputs = new List<NamedOnnxValue>()
@@ -19,7 +19,8 @@ Once a session is created, you can run queries on the session using your input d
     };
     IReadOnlyCollection<NamedOnnxValue> results = session.Run(inputs);
 
-You can load your input data into Tensor<T> objects in several ways. A simple example is to create the Tensor from arrays -
+You can load your input data into Tensor<T> objects in several ways. A simple example is to create the Tensor from arrays.
+
     float[] sourceData; // assume your data is loaded into a flat float array
     int[] dimensions; // and the dimensions of the input is stored here
     Tensor<float> t1 = new DenseTensor<float>(sourceData, dimensions);
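
To make the walkthrough in this file concrete, here is a minimal end-to-end sketch assembled from the snippets in the hunks above. The input name "data", the {1, 3} shape, and the use of `NamedOnnxValue.CreateFromTensor` are illustrative assumptions (the diff does not show how the inputs list is populated or what the model expects) rather than part of the commit itself.

    // Hedged sketch: combines the documented calls into one small program.
    using System;
    using System.Collections.Generic;
    using System.Numerics.Tensors;      // Tensor<T>, DenseTensor<T>
    using Microsoft.ML.OnnxRuntime;     // InferenceSession, NamedOnnxValue

    class Program
    {
        static void Main()
        {
            // Assumed input: a flat float array plus its dimensions.
            float[] sourceData = new float[3] { 0.1f, 0.2f, 0.3f };
            int[] dimensions = new int[] { 1, 3 };
            Tensor<float> t1 = new DenseTensor<float>(sourceData, dimensions);

            // "data" is a placeholder input name; use the name your model declares.
            var inputs = new List<NamedOnnxValue>()
            {
                NamedOnnxValue.CreateFromTensor<float>("data", t1)
            };

            using (var session = new InferenceSession("model.onnx"))
            {
                IReadOnlyCollection<NamedOnnxValue> results = session.Run(inputs);
                foreach (var result in results)
                {
                    // Each result carries an output tensor keyed by its output name.
                    Tensor<float> output = result.AsTensor<float>();
                    Console.WriteLine($"{result.Name}: {output.Length} values");
                }
            }
        }
    }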
@@ -84,7 +85,7 @@ Accessor to the default static option object
 
 #### Methods
     AppendExecutionProvider(ExecutionProvider provider);
-Appends execution provider to the session. For any operator in the graph the first execution provider that implements the operator will be user. ExecutionProvider is defined as the following enum -
+Appends an execution provider to the session. For any operator in the graph, the first execution provider that implements the operator will be used. ExecutionProvider is defined as the following enum.
 
     enum ExecutionProvider
     {
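
For orientation, here is a hedged sketch of how the method in this hunk might be called. The diff does not show which class exposes `AppendExecutionProvider` or which members `ExecutionProvider` defines, so the `SessionOptions` type, the `Cpu` member, and the options-taking `InferenceSession` constructor below are assumptions for illustration only.

    // Hypothetical usage; the names marked below are not taken from the diff.
    var options = new SessionOptions();                         // assumed enclosing type
    options.AppendExecutionProvider(ExecutionProvider.Cpu);     // Cpu is an assumed enum member
    var session = new InferenceSession("model.onnx", options);  // assumed constructor overload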
@@ -112,4 +113,3 @@ The type of Exception that is thrown in most of the error conditions related to
 
 
 
-