Tensor 0.4.11

Tensor (n-dimensional array) library for F#

     Core features:
       - n-dimensional arrays (tensors) in host memory or on CUDA GPUs
       - element-wise operations (addition, multiplication, absolute value, etc.)
       - basic linear algebra operations (dot product, SVD decomposition, matrix inverse, etc.)
       - reduction operations (sum, product, average, maximum, arg max, etc.)
       - logic operations (comparison, and, or, etc.)
       - views, slicing, reshaping, broadcasting (similar to NumPy)
       - scatter and gather by indices
       - standard functional operations (map, fold, etc.)
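
     To make the features above concrete, here is a minimal F# sketch of tensor creation,
     element-wise operations, slicing and reduction. The module and function names
     (HostTensor.init, Tensor.sum, Tensor.maxAxis, the range indexer) follow the project's
     documentation, but the exact signatures should be treated as assumptions rather than a
     verbatim API reference.

         open Tensor

         // create a 3x2 matrix in host memory, filled via an index function
         // (shapes are given as int64 lists)
         let a = HostTensor.init [3L; 2L] (fun [|i; j|] -> 10.0 * float i + float j)

         // element-wise operations
         let b = a + a
         let c = abs (sqrt b)

         // views and slicing, similar to NumPy: first row, all columns
         let firstRow = c.[0L, *]

         // reductions: sum over all elements, maximum along axis 0
         let total  = Tensor.sum c
         let colMax = Tensor.maxAxis 0 c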

     Data exchange:
       - read/write support for HDF5 (.h5)
       - interop with standard F# types (Seq, List, Array, Array2D, Array3D, etc.)
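
     A hedged sketch of the data-exchange features: converting between standard F# collections
     and tensors, and round-tripping a tensor through an HDF5 file. HostTensor.ofList,
     HostTensor.toArray, HDF5.OpenWrite/OpenRead and HostTensor.write/read follow the project's
     documentation; treat the exact names as assumptions.

         open Tensor

         // build a tensor from an ordinary F# list and convert it back to an array
         let v       = HostTensor.ofList [1.0; 2.0; 3.0; 4.0]
         let asArray = HostTensor.toArray v

         // write the tensor to an HDF5 file ...
         do
             use f = HDF5.OpenWrite "data.h5"
             HostTensor.write f "v" v

         // ... and read it back, specifying the element type
         let v2 =
             use f = HDF5.OpenRead "data.h5"
             HostTensor.read<float> f "v"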

     Performance:
        - host: SIMD- and BLAS-accelerated operations
          - Intel MKL is used by default (shipped with the NuGet package)
          - other BLAS implementations (OpenBLAS, vendor-specific ones) can be selected via a configuration option
        - CUDA GPU: all operations are performed directly on the GPU; cuBLAS is used for matrix operations
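
     The sketch below shows how a computation might be moved onto a CUDA GPU: the tensor is
     transferred to the device and subsequent operations run there, with matrix products
     dispatched to cuBLAS. CudaTensor.transfer, HostTensor.transfer and the .* matrix-product
     operator are taken from the project's documentation and are assumptions about the exact
     API; a CUDA-capable GPU is required.

         open Tensor

         // a 1000x1000 matrix in host memory
         let h = HostTensor.init [1000L; 1000L] (fun [|i; j|] -> float (i + j))

         // copy it to the GPU; operations on the result stay on the device
         let d = CudaTensor.transfer h

         // matrix product on the GPU (handled by cuBLAS), then copy back to the host
         let p = d .* d
         let pHost = HostTensor.transfer p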

     Requirements:
        - Linux, macOS, or Windows on x64
        - Linux additionally requires libgomp.so.1 to be installed.

     Additional algorithms are provided in the Tensor.Algorithm package.

Package Manager:  Install-Package Tensor -Version 0.4.11
.NET CLI:         dotnet add package Tensor --version 0.4.11
Paket CLI:        paket add Tensor --version 0.4.11
PackageReference (copy this XML node into the project file):
  <PackageReference Include="Tensor" Version="0.4.11" />


Version History

Version Downloads Last updated
0.4.11 1,320 5/8/2018
0.4.11-v0.4.11-215 219 5/8/2018
0.4.11-symtensor-core-242 234 11/15/2018
0.4.11-symtensor-core-241 219 11/15/2018
0.4.11-symtensor-core-240 216 11/15/2018
0.4.11-symtensor-core-239 195 11/15/2018
0.4.11-symtensor-core-238 219 11/15/2018
0.4.11-symtensor-core-237 240 11/15/2018
0.4.11-symtensor-core-236 197 11/14/2018
0.4.11-symtensor-core-235 198 11/14/2018
0.4.11-symtensor-core-234 195 11/14/2018
0.4.11-symtensor-core-231 233 11/9/2018
0.4.11-symtensor-core-230 230 11/9/2018
0.4.11-symtensor-core-229 200 11/8/2018
0.4.11-symtensor-core-228 198 11/8/2018
0.4.11-symtensor-core-227 234 10/30/2018
0.4.11-symtensor-core-226 230 10/30/2018
0.4.11-symtensor-core-225 203 10/30/2018
0.4.11-develop-216 328 5/8/2018
0.4.10-develop-213 329 5/8/2018
0.4.10-develop-212 322 5/7/2018
0.4.10-develop-211 323 5/7/2018
0.3.0.712-master 324 9/1/2017
0.3.0.711-master 324 9/1/2017
0.3.0.710-master 312 9/1/2017
0.3.0.709-master 298 8/31/2017
0.3.0.708-master 319 8/30/2017
0.3.0.707-master 329 8/30/2017
0.3.0.706-master 309 8/30/2017
0.3.0.701-master 348 6/26/2017
0.3.0.700-master 370 6/22/2017
0.3.0.699-master 341 6/22/2017
0.3.0.698-master 340 6/21/2017
0.3.0.697-master 337 6/21/2017
0.3.0.696-master 366 6/21/2017
0.3.0.695-master 337 6/21/2017
0.3.0.694-master 332 6/21/2017
0.3.0.693-master 343 6/20/2017
0.3.0.692-master 335 6/19/2017
0.3.0.691-master 360 6/19/2017
0.3.0.690-master 341 6/19/2017
0.3.0.689-master 341 5/14/2017
0.3.0.688 1,478 5/14/2017
0.3.0.686-master 345 5/14/2017
0.2.0.591-master 352 4/19/2017
0.2.0.565-master 362 4/11/2017
0.2.0.556-master 352 3/21/2017
0.2.0.551-master 404 3/17/2017
0.2.0.540-master 337 3/15/2017
0.2.0.536-master 333 3/14/2017
0.2.0.519-master 349 3/2/2017
0.2.0.516-master 338 3/2/2017
0.2.0.499-master 362 2/13/2017
0.2.0.494-master 346 2/7/2017
0.2.0.479-master 363 2/1/2017
0.2.0.463-master 360 1/17/2017
0.2.0.431-master 438 12/2/2016
0.2.0.422-master 374 11/9/2016
0.2.0.421-master 364 11/9/2016
0.2.0.411-master 415 10/26/2016
0.2.0.400-master 364 10/26/2016
0.2.0.394-master 385 10/25/2016
0.2.0.382-master 369 10/21/2016
0.2.0.377-master 361 10/20/2016
0.2.0.323-master 363 10/11/2016
0.2.0.262-master 377 9/29/2016
0.2.0.248-master 381 9/27/2016
0.2.0.174-master 378 9/16/2016
0.2.0.128-master 376 9/8/2016
0.2.0.122-master 383 9/8/2016
0.2.0.121-master 370 9/7/2016
0.2.0.111-master 365 9/7/2016
0.2.0.105-ci 410 9/5/2016
0.2.0.97-ci 399 8/30/2016
0.2.0.96-ci 373 8/29/2016
0.2.0.90-ci 376 8/25/2016
0.2.0.89-ci 363 8/24/2016
0.2.0.88-ci 370 8/24/2016
0.2.0.87-ci 385 8/24/2016
0.2.0.86-ci 372 8/23/2016
0.2.0.85-ci 369 8/22/2016
0.2.0.84-ci 381 8/22/2016
0.2.0.83-ci 384 8/22/2016
0.2.0.82 603 8/22/2016
0.2.0.81-ci 383 8/19/2016
0.2.0.80-ci 395 6/27/2016
0.2.0.79-ci 390 6/27/2016
0.2.0.77-ci 399 6/22/2016
0.2.0.76-ci 399 6/22/2016
0.2.0.75 459 6/15/2016
0.2.0.74-ci 389 6/15/2016
0.2.0.73 427 6/15/2016
0.2.0.72 439 6/15/2016
0.2.0.71 472 6/14/2016
0.2.0.70 428 6/9/2016
0.2.0.69 397 6/9/2016
0.2.0.68 425 6/9/2016
0.2.0.67 505 6/8/2016
0.2.0.66-ci 386 6/8/2016
0.2.0.65-ci 380 6/8/2016
0.2.0.64-ci 423 6/8/2016
0.2.0.63-ci 374 6/7/2016
0.2.0.62 426 6/7/2016
0.2.0.61 411 6/6/2016
0.2.0.60 407 6/6/2016
0.2.0.59 406 6/6/2016
0.2.0.57 427 6/3/2016
0.2.0.56 418 6/3/2016
0.2.0.55 454 6/3/2016
0.2.0.54 431 6/3/2016
0.2.0.53 467 6/3/2016
0.2.0.52-ci 381 6/2/2016
0.2.0.51-ci 385 6/2/2016
0.2.0.50-ci 392 6/2/2016
0.2.0.49 490 5/31/2016
0.2.0.48-ci 398 5/31/2016
0.2.0.46-ci 383 5/31/2016
0.2.0.45 425 5/31/2016
0.2.0.44 428 5/31/2016
0.2.0.43 442 5/31/2016
0.2.0.42 443 5/30/2016
0.2.0.41 436 5/30/2016
0.2.0.40 433 5/30/2016
0.2.0.39 443 5/30/2016
0.2.0.38 427 5/30/2016
0.2.0.37 428 5/30/2016
0.2.0.36 428 5/25/2016
0.2.0.35 449 5/24/2016
0.2.0.34 441 5/24/2016
0.2.0.33 583 5/24/2016
0.2.0.32-ci 377 5/24/2016
0.1.26-ci 401 5/24/2016
0.1.24-ci 390 5/24/2016
0.1.19-ci 379 5/24/2016