Roadmap to DiffSharp 1.0
This is a catalog issue to track what needs doing for DiffSharp 1.0, based on 1:1 discussions with @gbaydin. There will be a long list of other things; we'll extend this as necessary.
- Typed Backend.None CPU tensors (draft)
- Add `keep_dims` on `Mean` (done)
- Fix `CompareTo` in RawTensorFloat32CPU.fs (done)
- Broadcasting: full PyTorch-style broadcasting for `Add` (see TODO here). The design principle expected here is "we should do the same thing as PyTorch". Similarly, full PyTorch-style broadcasting for `Mul` and other operations; a sketch of the shape rule follows this list. (done)
- Convolutions (@gbaydin)
- Transposed convolutions
- Batchnorm
- Dropout
- Switch to Python-style casing on all operations to align with SciSharp
- Remove excess overloads and use optional arguments instead
- Finalize the `dsharp.abc` API
- libtorch and CUDA backends
- Add `Reshape` (similar code to `View`)
- Add `OneHot`
- Differentiation API
- Optimizers
- Fix possible memory leak on Linux
- Tensor save/load
- General transpose
- Probability distributions
- Generalization and batching of `Transpose`
- Zero-size tensors #150
- Docs tooling #134
- Docs #167
- PyTorch Half support
- Batching conv1d/2d/3d/… https://github.com/DiffSharp/DiffSharp/issues/98
- Batching for `MatMul`. Currently no batching is supported, only 2D x 2D. (done)
- Einstein summation #92
- norm #93
- Matrix inverse
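As referenced in the broadcasting item above, here is a minimal sketch of the PyTorch shape rule being targeted. This is an illustrative standalone helper, not DiffSharp's implementation; the function name and representation of shapes as plain `int list`s are assumptions for the example.

```fsharp
// PyTorch-style broadcasting: right-align the two shapes, pad the shorter
// with 1s, and combine dimension-wise (equal sizes match; a size of 1
// stretches to match the other).
let broadcastShape (a: int list) (b: int list) : int list =
    let n = max a.Length b.Length
    let pad (s: int list) = List.replicate (n - s.Length) 1 @ s
    List.map2
        (fun da db ->
            if da = db || db = 1 then da
            elif da = 1 then db
            else failwithf "shapes %A and %A are not broadcastable" a b)
        (pad a) (pad b)

// Example: broadcastShape [3; 1] [2; 1; 4] = [2; 3; 4],
// matching what PyTorch does for a (3,1) tensor added to a (2,1,4) tensor.
```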
Things out of scope for 1.0
- Strided views
- Quantized
- Complex
- Sparse
I'm not in favor of the new naming. While I agree with following PyTorch naming conventions, we should also respect our own F# casing conventions.
So couldn't we just explain to Pythonistas in this one sentence:
> Everything is the same as in PyTorch, except that in F# we write object methods in PascalCase, like `t.ZerosLike`, and static functions in camelCase, like `Tensor.zerosLike t`.

… and avoid the damage to the F# side?
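For concreteness, a hypothetical sketch of that split; the `Tensor` type and members below are illustrative only, not DiffSharp's actual surface:

```fsharp
// PascalCase instance methods (standard F# design guidelines),
// camelCase static functions (mirroring PyTorch's zeros_like etc.).
type Tensor(data: float[]) =
    member this.Data = data
    member this.ZerosLike() = Tensor(Array.zeroCreate data.Length)
    static member zerosLike (t: Tensor) = t.ZerosLike()

let t = Tensor [| 1.0; 2.0; 3.0 |]
let a = t.ZerosLike()          // method call, F#-style casing
let b = Tensor.zerosLike t     // static function, Python-style casing
```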
I think sooner or later even the potential Pythonista converts will have to interact with the rest of the F# ecosystem and will find such casing inconsistencies weird. Let's not damage the F# API just for the 'profit' meme.
Note that the direction of travel here is that the DiffSharp 1.0 API looks and acts a lot like an F# version of the PyTorch API.
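A hedged sketch of that look-and-feel; entry-point names follow the roadmap's `dsharp.abc` style, and the exact overloads in the released API may differ:

```fsharp
open DiffSharp

// cf. torch.tensor([1.0, 2.0, 3.0]) and torch.zeros([2, 3]) in PyTorch
let x = dsharp.tensor [ 1.0; 2.0; 3.0 ]
let y = dsharp.zeros [ 2; 3 ]

// Scalar broadcasting on arithmetic operators, as in PyTorch
let z = x + 1.0
```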