[FEATURE REQUEST]: Spark 3.0 Readiness
## APIs
### SparkSession

- `public static void SetActiveSession(SparkSession session)` (#641)
- `public static void ClearActiveSession()` (#641)
- `public static SparkSession GetActiveSession()` (#641)
- `public DataFrame ExecuteCommand(string runner, string command, Dictionary<string, string> options)` (#647)
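A minimal usage sketch (not part of the issue) of how the new active-session APIs fit together; the app name is made up, and `ExecuteCommand` is omitted because it depends on an external command runner:

```csharp
using Microsoft.Spark.Sql;

// Illustrative only: create a session, register it as the active session,
// read it back, and then clear the registration.
SparkSession spark = SparkSession
    .Builder()
    .AppName("active-session-demo")   // hypothetical app name
    .GetOrCreate();

SparkSession.SetActiveSession(spark);
SparkSession active = SparkSession.GetActiveSession();  // returns the session registered above
SparkSession.ClearActiveSession();                      // no active session is registered afterwards
```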
### DataFrame

- `public DataFrame Transform(Func<DataFrame, DataFrame> func)` (#688)
- `public IEnumerable<Row> Tail(int n)` (#647)
- `public void PrintSchema(int level)` (#647)
- `public void Explain(string mode)` (#647)
- `public DataFrame Observe(string name, Column expr, params Column[] exprs)` (#647)
- `WriteTo(string table)` (#677)
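A rough sketch of a few of these DataFrame additions; the `Range`-based input and the column names are invented:

```csharp
using System.Collections.Generic;
using Microsoft.Spark.Sql;
using static Microsoft.Spark.Sql.Functions;

SparkSession spark = SparkSession.Builder().GetOrCreate();
DataFrame df = spark.Range(0, 100);

// Transform applies a caller-supplied DataFrame -> DataFrame function fluently.
DataFrame doubled = df.Transform(d => d.WithColumn("doubled", Col("id").Multiply(Lit(2))));

doubled.PrintSchema(1);        // print only the first level of nesting in the schema
doubled.Explain("formatted");  // Spark 3.0 explain modes: simple, extended, codegen, cost, formatted

IEnumerable<Row> lastRows = doubled.Tail(5);  // the last five rows, collected to the driver
```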
### DataFrameStatFunctions

- `public DataFrame SampleBy<T>(Column column, IDictionary<T, double> fractions, long seed)` (#647)
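For illustration only, the generic `SampleBy<T>` overload supports stratified sampling; the key column and fractions below are invented:

```csharp
using System.Collections.Generic;
using Microsoft.Spark.Sql;
using static Microsoft.Spark.Sql.Functions;

SparkSession spark = SparkSession.Builder().GetOrCreate();
DataFrame df = spark.Range(0, 100).WithColumn("key", Expr("id % 3"));

// Keep roughly 50% of rows with key 0 and 20% of rows with key 1; other keys are dropped.
var fractions = new Dictionary<int, double> { { 0, 0.5 }, { 1, 0.2 } };
DataFrame sampled = df.Stat().SampleBy(Col("key"), fractions, seed: 7);
sampled.GroupBy("key").Count().Show();
```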
### DataFrameWriterV2

- `public DataFrameWriterV2 Using(string provider)` (#677)
- `public DataFrameWriterV2 Option(string key, string value)` (#677)
- `public DataFrameWriterV2 Option(string key, bool value)` (#677)
- `public DataFrameWriterV2 Option(string key, long value)` (#677)
- `public DataFrameWriterV2 Option(string key, double value)` (#677)
- `public DataFrameWriterV2 Options(Dictionary<string, string> options)` (#677)
- `public DataFrameWriterV2 TableProperty(string property, string value)` (#677)
- `public DataFrameWriterV2 PartitionedBy(Column column, params Column[] columns)` (#677)
- `public void Create()` (#677)
- `public void Replace()` (#677)
- `public void CreateOrReplace()` (#677)
- `public void Append()` (#677)
- `public void Overwrite(Column condition)` (#677)
- `public void OverwritePartitions()` (#677)
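A hedged sketch of the fluent V2 writer chain; the table name `demo.events` and the options are purely illustrative, and the target catalog has to support the DataSource V2 write path:

```csharp
using Microsoft.Spark.Sql;
using static Microsoft.Spark.Sql.Functions;

SparkSession spark = SparkSession.Builder().GetOrCreate();
DataFrame df = spark.Range(0, 1000).WithColumn("bucket", Expr("id % 10"));

// Create (or replace) a table through the V2 writer, then write to it again later.
df.WriteTo("demo.events")                  // hypothetical catalog table name
    .Using("parquet")
    .Option("compression", "snappy")
    .TableProperty("owner", "spark-dotnet")
    .PartitionedBy(Col("bucket"))
    .CreateOrReplace();

df.WriteTo("demo.events").Append();              // add new rows to the existing table
df.WriteTo("demo.events").OverwritePartitions(); // dynamically overwrite only the partitions touched by df
```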
### RelationalGroupedDataset

- `def as[K: Encoder, T: Encoder]: KeyValueGroupedDataset[K, T]` (Scala)
### Functions

- `public static Column XXHash64(params Column[] columns)` (#649)
- `public static Column Split(Column column, string pattern, int limit)` (#649)
- `public static Column Overlay(Column src, Column replace, Column pos, Column len)` (#649)
- `public static Column Overlay(Column src, Column replace, Column pos)` (#649)
- `public static Column AddMonths(Column startDate, Column numMonths)` (#649)
- `public static Column DateAdd(Column start, Column days)` (#649)
- `public static Column DateSub(Column start, Column days)` (#649)
- `def transform(column: Column, f: Column => Column): Column` (Scala, unsupported: passing a function as a parameter)
- `def transform(column: Column, f: (Column, Column) => Column): Column` (Scala, unsupported: passing a function as a parameter)
- `def exists(column: Column, f: Column => Column): Column` (Scala, unsupported: passing a function as a parameter)
- `def forall(column: Column, f: Column => Column): Column` (Scala, unsupported: passing a function as a parameter)
- `def filter(column: Column, f: Column => Column): Column` (Scala, unsupported: passing a function as a parameter)
- `def filter(column: Column, f: (Column, Column) => Column): Column` (Scala, unsupported: passing a function as a parameter)
- `def aggregate(expr: Column, initialValue: Column, merge: (Column, Column) => Column, finish: Column => Column): Column` (Scala, unsupported: passing a function as a parameter)
- `def aggregate(expr: Column, initialValue: Column, merge: (Column, Column) => Column): Column` (Scala, unsupported: passing a function as a parameter)
- `def zip_with(left: Column, right: Column, f: (Column, Column) => Column): Column` (Scala, unsupported: passing a function as a parameter)
- `def transform_keys(expr: Column, f: (Column, Column) => Column): Column` (Scala, unsupported: passing a function as a parameter)
- `def transform_values(expr: Column, f: (Column, Column) => Column): Column` (Scala, unsupported: passing a function as a parameter)
- `def map_filter(expr: Column, f: (Column, Column) => Column): Column` (Scala, unsupported: passing a function as a parameter)
- `def map_zip_with(left: Column, right: Column, f: (Column, Column, Column) => Column): Column` (Scala, unsupported: passing a function as a parameter)
- `public static Column SchemaOfJson(Column json, Dictionary<string, string> options)` (#649)
- `public static Column MapEntries(Column column)` (#649)
- `public static Column FromCsv(Column column, StructType schema, Dictionary<string, string> options)` (#649)
- `public static Column FromCsv(Column column, Column schema, Dictionary<string, string> options)` (#649)
- `public static Column SchemaOfCsv(string csv)` (#649)
- `public static Column SchemaOfCsv(Column csv)` (#649)
- `public static Column SchemaOfCsv(Column csv, Dictionary<string, string> options)` (#649)
- `public static Column ToCsv(Column column, Dictionary<string, string> options)` (#649)
- `public static Column ToCsv(Column column)` (#649)
- `public static Column Years(Column column)` (#649)
- `public static Column Months(Column column)` (#649)
- `public static Column Days(Column column)` (#649)
- `public static Column Hours(Column column)` (#649)
- `public static Column Bucket(Column numBuckets, Column column)` (#649)
- `public static Column Bucket(int numBuckets, Column column)` (#649)
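As an illustration only (literals and column names invented), a handful of the new functions can be combined in an ordinary projection:

```csharp
using Microsoft.Spark.Sql;
using static Microsoft.Spark.Sql.Functions;

SparkSession spark = SparkSession.Builder().GetOrCreate();
DataFrame df = spark.Sql("SELECT 'a,b,c,d' AS csv, DATE'2020-01-31' AS d");

DataFrame result = df
    .WithColumn("parts", Split(Col("csv"), ",", 2))         // split into at most two parts
    .WithColumn("next_month", AddMonths(Col("d"), Lit(1)))  // Column-typed month offset
    .WithColumn("next_week", DateAdd(Col("d"), Lit(7)))     // Column-typed day offset
    .WithColumn("hash", XXHash64(Col("csv"), Col("d")));    // 64-bit xxHash over both columns

result.Show();
```

Note that `Years`, `Months`, `Days`, `Hours`, and `Bucket` are partition transforms intended for `PartitionedBy` on the V2 writer rather than for ordinary projections.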
## Issue Analytics

- State:
- Created: 3 years ago
- Reactions: 3
- Comments: 9 (5 by maintainers)
> @GoEddie, I am new to Spark and am just trying out this library for one of my use cases. However, I can definitely give it a try starting in the second week of September.

> All of the features in this issue have been merged, so this can be closed.