Tensor 0.4.11

Targets .NET Standard 2.0.

.NET CLI:
    dotnet add package Tensor --version 0.4.11

Package Manager (run in the Visual Studio Package Manager Console, which provides the NuGet module's Install-Package):
    NuGet\Install-Package Tensor -Version 0.4.11

PackageReference (for projects that support it, copy this XML node into the project file):
    <PackageReference Include="Tensor" Version="0.4.11" />

Paket CLI:
    paket add Tensor --version 0.4.11

Script & Interactive (the #r directive works in F# Interactive and Polyglot Notebooks):
    #r "nuget: Tensor, 0.4.11"

Cake:
    // Install Tensor as a Cake Addin
    #addin nuget:?package=Tensor&version=0.4.11

    // Install Tensor as a Cake Tool
    #tool nuget:?package=Tensor&version=0.4.11

Tensor (n-dimensional array) library for F#

     Core features:
       - n-dimensional arrays (tensors) in host memory or on CUDA GPUs
       - element-wise operations (addition, multiplication, absolute value, etc.)
       - basic linear algebra operations (dot product, SVD decomposition, matrix inverse, etc.)
       - reduction operations (sum, product, average, maximum, arg max, etc.)
       - logic operations (comparison, and, or, etc.)
       - views, slicing, reshaping, broadcasting (similar to NumPy)
       - scatter and gather by indices
       - standard functional operations (map, fold, etc.)
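As a quick illustration of these features, here is a minimal F# sketch. The names used (HostTensor.init, Tensor.sum, Tensor.sumAxis, int64 shapes, slicing with *) follow the library's documented API, but treat the exact signatures as assumptions for this version:

```fsharp
open Tensor

// create a 3x3 tensor in host memory from an index function
let a = HostTensor.init [3L; 3L] (fun [| i; j |] -> float (i * 3L + j))

// element-wise operations and broadcasting, similar to NumPy
let b = a + 10.0 * a

// views and slicing (indices are int64)
let secondColumn = a.[*, 1L]

// reductions: over all elements or along a given axis
let total = Tensor.sum a
let rowSums = Tensor.sumAxis 1 a
```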

     Data exchange:
       - read/write support for HDF5 (.h5)
       - interop with standard F# types (Seq, List, Array, Array2D, Array3D, etc.)
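A sketch of both exchange paths, assuming the library's documented HDF5 and conversion helpers (HDF5.OpenWrite, HostTensor.write/read, HostTensor.ofList2D, HostTensor.toArray2D); the names come from the project documentation and should be checked against this version:

```fsharp
open Tensor

// build a tensor from a standard F# list of lists
let a = HostTensor.ofList2D [[1.0; 2.0]; [3.0; 4.0]]

// write to and read back from an HDF5 file
using (HDF5.OpenWrite "data.h5") (fun hdf ->
    HostTensor.write hdf "a" a)
let b = using (HDF5.OpenRead "data.h5") (fun hdf ->
    HostTensor.read<float> hdf "a")

// convert to a standard .NET 2D array and back
let arr : float[,] = HostTensor.toArray2D a
let c = HostTensor.ofArray2D arr
```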

     Performance:
       - host: SIMD and BLAS accelerated operations
         - Intel MKL is used by default (shipped with the NuGet package)
         - other BLAS implementations (OpenBLAS, vendor-specific) can be selected via a configuration option
       - CUDA GPU: all operations are performed directly on the GPU; cuBLAS is used for matrix operations
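For instance, a tensor can be created on the host, transferred to the GPU, multiplied there, and transferred back; CudaTensor.transfer, HostTensor.transfer and the .* (dot product) operator are taken from the library's documentation, so treat this as a sketch rather than verified code:

```fsharp
open Tensor

// create on the host, then copy to the CUDA device
let h = HostTensor.init [1000L; 1000L] (fun [| i; j |] -> single (i + j))
let d = CudaTensor.transfer h

// matrix product runs on the GPU via cuBLAS
let p = d .* d

// copy the result back into host memory
let r = HostTensor.transfer p
```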

     Requirements:
       - Linux, macOS, or Windows on x64
       - on Linux, libgomp.so.1 must be installed

     Additional algorithms are provided in the Tensor.Algorithm package.

Compatible and additional computed target framework versions:
  .NET Standard: netstandard2.0 is compatible; netstandard2.1 was computed.
  .NET (computed): net5.0 through net8.0, including the -windows, -android, -ios, -maccatalyst, -macos, and -tvos platform variants.
  .NET Core (computed): netcoreapp2.0 through netcoreapp3.1.
  .NET Framework (computed): net461 through net481.
  Other (computed): MonoAndroid, MonoMac, MonoTouch, Tizen (tizen40, tizen60), Xamarin.iOS, Xamarin.Mac, Xamarin.TVOS, Xamarin.WatchOS.

NuGet packages (3)

Showing the top 3 NuGet packages that depend on Tensor:

Package Downloads
DeepNet

Deep learning library for F#. Provides symbolic model differentiation, automatic differentiation and compilation to CUDA GPUs. Includes optimizers and model blocks used in deep learning. Make sure to set the platform of your project to x64.

RPlotTools

Tools for plotting using R from F#.

Tensor.Algorithm

Data types:
  - arbitrary precision rational numbers
Matrix algebra (integer, rational):
  - row echelon form
  - Smith normal form
  - kernel, cokernel, and (pseudo-)inverse
Matrix decomposition (floating point):
  - principal component analysis (PCA)
  - ZCA whitening
Misc:
  - Bezout's identity
  - loading of NumPy's .npy and .npz files

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last updated
0.4.11 6,466 5/8/2018
0.4.11-v0.4.11-215 620 5/8/2018
0.4.11-symtensor-core-242 1,113 11/15/2018
0.4.11-symtensor-core-241 1,088 11/15/2018
0.4.11-symtensor-core-240 1,085 11/15/2018
0.4.11-symtensor-core-239 1,030 11/15/2018
0.4.11-symtensor-core-238 1,075 11/15/2018
0.4.11-symtensor-core-237 1,125 11/15/2018
0.4.11-symtensor-core-236 1,021 11/14/2018
0.4.11-symtensor-core-235 1,028 11/14/2018
0.4.11-symtensor-core-234 999 11/14/2018
0.4.11-symtensor-core-231 1,123 11/9/2018
0.4.11-symtensor-core-230 1,026 11/9/2018
0.4.11-symtensor-core-229 1,028 11/8/2018
0.4.11-symtensor-core-228 1,073 11/8/2018
0.4.11-symtensor-core-227 1,083 10/30/2018
0.4.11-symtensor-core-226 1,144 10/30/2018
0.4.11-symtensor-core-225 1,042 10/30/2018
0.4.11-develop-216 1,289 5/8/2018
0.4.10-develop-213 1,296 5/8/2018
0.4.10-develop-212 1,226 5/7/2018
0.4.10-develop-211 1,334 5/7/2018
0.3.0.712-master 983 9/1/2017
0.3.0.711-master 997 9/1/2017
0.3.0.710-master 948 9/1/2017
0.3.0.709-master 967 8/31/2017
0.3.0.708-master 969 8/30/2017
0.3.0.707-master 944 8/30/2017
0.3.0.706-master 991 8/30/2017
0.3.0.701-master 1,025 6/26/2017
0.3.0.700-master 1,006 6/22/2017
0.3.0.699-master 971 6/22/2017
0.3.0.698-master 977 6/21/2017
0.3.0.697-master 979 6/21/2017
0.3.0.696-master 1,053 6/21/2017
0.3.0.695-master 987 6/21/2017
0.3.0.694-master 973 6/21/2017
0.3.0.693-master 996 6/20/2017
0.3.0.692-master 975 6/19/2017
0.3.0.691-master 1,005 6/19/2017
0.3.0.690-master 1,011 6/19/2017
0.3.0.689-master 988 5/14/2017
0.3.0.688 7,148 5/14/2017
0.3.0.686-master 990 5/14/2017
0.2.0.591-master 966 4/19/2017
0.2.0.565-master 946 4/11/2017
0.2.0.556-master 961 3/21/2017
0.2.0.551-master 1,011 3/17/2017
0.2.0.540-master 935 3/15/2017
0.2.0.536-master 945 3/14/2017
0.2.0.519-master 967 3/2/2017
0.2.0.516-master 940 3/2/2017
0.2.0.499-master 982 2/13/2017
0.2.0.494-master 961 2/7/2017
0.2.0.479-master 967 2/1/2017
0.2.0.463-master 978 1/17/2017
0.2.0.431-master 1,033 12/2/2016
0.2.0.422-master 1,325 11/9/2016
0.2.0.421-master 1,273 11/9/2016
0.2.0.411-master 1,026 10/26/2016
0.2.0.400-master 972 10/26/2016
0.2.0.394-master 966 10/25/2016
0.2.0.382-master 978 10/21/2016
0.2.0.377-master 983 10/20/2016
0.2.0.323-master 956 10/11/2016
0.2.0.262-master 991 9/29/2016
0.2.0.248-master 992 9/27/2016
0.2.0.174-master 983 9/16/2016
0.2.0.128-master 1,012 9/8/2016
0.2.0.122-master 990 9/8/2016
0.2.0.121-master 977 9/7/2016
0.2.0.111-master 959 9/7/2016
0.2.0.105-ci 1,026 9/5/2016
0.2.0.97-ci 1,039 8/30/2016
0.2.0.96-ci 968 8/29/2016
0.2.0.90-ci 1,003 8/25/2016
0.2.0.89-ci 960 8/24/2016
0.2.0.88-ci 992 8/24/2016
0.2.0.87-ci 988 8/24/2016
0.2.0.86-ci 979 8/23/2016
0.2.0.85-ci 967 8/22/2016
0.2.0.84-ci 1,004 8/22/2016
0.2.0.83-ci 1,016 8/22/2016
0.2.0.82 2,203 8/22/2016
0.2.0.81-ci 991 8/19/2016
0.2.0.80-ci 1,009 6/27/2016
0.2.0.79-ci 1,002 6/27/2016
0.2.0.77-ci 992 6/22/2016
0.2.0.76-ci 1,017 6/22/2016
0.2.0.75 1,665 6/15/2016
0.2.0.74-ci 1,342 6/15/2016
0.2.0.73 1,916 6/15/2016
0.2.0.72 1,906 6/15/2016
0.2.0.71 1,900 6/14/2016
0.2.0.70 1,779 6/9/2016
0.2.0.69 1,707 6/9/2016
0.2.0.68 1,583 6/9/2016
0.2.0.67 2,059 6/8/2016
0.2.0.66-ci 1,020 6/8/2016
0.2.0.65-ci 1,001 6/8/2016
0.2.0.64-ci 1,054 6/8/2016
0.2.0.63-ci 994 6/7/2016
0.2.0.62 1,563 6/7/2016
0.2.0.61 1,531 6/6/2016
0.2.0.60 1,540 6/6/2016
0.2.0.59 1,490 6/6/2016
0.2.0.57 1,568 6/3/2016
0.2.0.56 1,540 6/3/2016
0.2.0.55 1,622 6/3/2016
0.2.0.54 1,573 6/3/2016
0.2.0.53 1,894 6/3/2016
0.2.0.52-ci 978 6/2/2016
0.2.0.51-ci 1,019 6/2/2016
0.2.0.50-ci 1,008 6/2/2016
0.2.0.49 1,922 5/31/2016
0.2.0.48-ci 1,073 5/31/2016
0.2.0.46-ci 1,025 5/31/2016
0.2.0.45 1,750 5/31/2016
0.2.0.44 1,778 5/31/2016
0.2.0.43 1,707 5/31/2016
0.2.0.42 1,740 5/30/2016
0.2.0.41 1,750 5/30/2016
0.2.0.40 1,745 5/30/2016
0.2.0.39 1,805 5/30/2016
0.2.0.38 1,793 5/30/2016
0.2.0.37 1,722 5/30/2016
0.2.0.36 1,773 5/25/2016
0.2.0.35 1,760 5/24/2016
0.2.0.34 1,756 5/24/2016
0.2.0.33 2,566 5/24/2016
0.2.0.32-ci 994 5/24/2016
0.1.26-ci 1,028 5/24/2016
0.1.24-ci 1,004 5/24/2016
0.1.19-ci 997 5/24/2016