Roadmap to DiffSharp 1.0

This is a catalog issue to track what needs doing for DiffSharp 1.0, based on 1:1 discussions with @gbaydin. There will be a long list of other things; we’ll extend this as necessary.

  • Typed Backend.None CPU tensors (draft)

  • Add keep_dims on Mean (done)

  • Fix CompareTo in RawTensorFloat32CPU.fs (done)

  • Broadcasting. Full PyTorch-style broadcasting for Add (see TODO here). The design principle expected here is “we should do the same thing as PyTorch”. Similarly, full PyTorch-style broadcasting for Mul and other operations; see the sketch after this list. (done)

  • Convolutions (@gbaydin)

  • Transposed convolutions

  • Batchnorm

  • Dropout

  • Switch to Python-style casing on all operations to align with SciSharp

  • Remove excess overloads and use optional arguments instead

  • Finalize API dsharp.abc

  • libtorch and CUDA backends

  • Add Reshape (similar code to View)

  • Add OneHot

  • Differentiation API

  • Optimizers

  • Fix possible memory leak on Linux

  • Tensor save/load

  • General transpose

  • Probability distributions

  • Generalization and batching of Transpose.

  • Zero-size tensors #150

  • Docs tooling #134

  • Docs #167

  • PyTorch Half support

  • Batching conv1d/2d/3d/… https://github.com/DiffSharp/DiffSharp/issues/98

  • Batching for MatMul. Currently no batching is supported; only 2D × 2D is done

  • Einstein summation #92

  • norm #93

  • matrix inverse

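Since the design principle for broadcasting above is “we should do the same thing as PyTorch”, here is a minimal sketch of what that behaviour looks like from F#. The dsharp.tensor constructor and the mean overload with keepDim are assumptions borrowed from the shape the API later took, not names fixed by this issue.

```fsharp
// A minimal sketch of PyTorch-style broadcasting from F#; dsharp.tensor
// and mean(dim, keepDim=...) are assumed names, not the finalized API.
open DiffSharp

let a = dsharp.tensor [[1.0; 2.0; 3.0]; [4.0; 5.0; 6.0]]  // shape [2; 3]
let b = dsharp.tensor [10.0; 20.0; 30.0]                  // shape [3]

// Shapes align from the trailing dimension: b is treated as [1; 3] and
// stretched to [2; 3] before the elementwise add.
let c = a + b  // [[11; 22; 33]; [14; 25; 36]]

// keepDim keeps the reduced axis as size 1, so the result broadcasts back
// against the original tensor (cf. the keep_dims item above).
let centered = a - a.mean(0, keepDim=true)
```
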
Things out of scope for 1.0

  • Strided views

  • Quantized

  • Complex

  • Sparse

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Reactions: 3
  • Comments: 10 (4 by maintainers)

Top GitHub Comments

1 reaction
pkese commented, May 2, 2020

I’m not in favor of the new naming. While I agree with following PyTorch naming conventions, we should also respect our own F# casing conventions.

So couldn’t we just explain to Pythonistas in this one sentence:

Everything is the same as in PyTorch, except in F# we write object methods in PascalCase, like t.ZerosLike, and static functions in camelCase, like Tensor.zerosLike t

… and avoid the damage to the F# side?

I think sooner or later even the potential Pythonista converts will have to interact with the rest of the F# ecosystem, and they will find such casing inconsistencies weird. Let’s not damage the F# API just for the ‘profit’ meme.
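
For concreteness, here is the casing split being proposed; ZerosLike / zerosLike and dsharp.randn are hypothetical names used purely to illustrate the convention, not API commitments.

```fsharp
// Hypothetical names illustrating the proposed F# casing convention.
open DiffSharp

let t = dsharp.randn [2; 3]

// Instance members in PascalCase, as is idiomatic for F# object APIs:
let z1 = t.ZerosLike()

// Module-level / static functions in camelCase, as is idiomatic in F#:
let z2 = Tensor.zerosLike t
```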

1 reaction
dsyme commented, Feb 14, 2020

Note that the direction of travel here is that the DiffSharp 1.0 API looks and acts a lot like an F# version of the PyTorch API.
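
As a rough sketch of that direction, this is the kind of PyTorch-flavoured code such an API invites; dsharp.tensor, dsharp.grad and the sum method are assumed names, not ones confirmed in this thread.

```fsharp
// Sketch of a PyTorch-like differentiation workflow in F#; the dsharp
// names here are assumptions, not the API finalized in this issue.
open DiffSharp

// A scalar-valued function of a tensor, written as one would in PyTorch:
let f (x: Tensor) = (x * x).sum()

let x = dsharp.tensor [1.0; 2.0; 3.0]
let y = f x               // 14.0
let g = dsharp.grad f x   // gradient of f at x: [2; 4; 6]
```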
