
Commit f9fd5c7

Adjusting logo (#10)
* Nice 'G'
* Using new logo.
1 parent b04b998 commit f9fd5c7

22 files changed: +260 -191 lines changed

README.md

Lines changed: 94 additions & 0 deletions
@@ -0,0 +1,94 @@
[![image](https://raw.githubusercontent.com/kbc-opensource/gelanis/master/logo/banner_black-w1500.png)](https://github.com/kbc-opensource/gelanis)

[![pypi-badge](https://badge.fury.io/py/gelanis.svg)](https://pypi.python.org/pypi/gelanis/), [![test-badge](https://github.com/kbc-opensource/gelanis/workflows/Tests/badge.svg)](https://github.com/kbc-opensource/gelanis/actions?query=workflow%3ATests), [![Documentation Status](https://readthedocs.org/projects/gelanis/badge/?version=latest)](https://pysparkling.readthedocs.io/en/latest/?badge=latest)

**Gelanis** is an enhanced version of [pysparkling](https://github.com/svenkreiss/pysparkling).

List of improvements:

- Data types of the resulting dataframes match pyspark's.

List of todos:

- Implement `since`/`until` + the ability to target a specific pyspark version.
- Get drop-in API compatibility with pyspark (the auto-injector is written, but more tests are needed here).
- Run the tests against both pyspark & pysparkling and compare the outputs, so we're 100% certain both libraries are API-equal.
- Achieve API equality between pyspark & pysparkling (meaning that any public symbol should exist in both libraries).
- Expand the tests to ensure 100% compatibility with pyspark.

Some performance metrics I observed (5 simple union tests):

| | gelanis | pyspark | slowdown (pyspark / gelanis) |
| --- | --- | --- | --- |
| startup | 0.542 | 47.112 | 87.0 |
| test 1 | 0.009 | 2.610 | 274.7 |
| test 2 | 0.008 | 2.721 | 340.1 |
| test 3 | 0.008 | 2.761 | 345.1 |
| test 4 | 0.009 | 2.471 | 274.6 |
| test 5 | 0.013 | 2.486 | 191.2 |

```python
import gelanis

# Target the pyspark version whose behaviour gelanis should match.
gelanis.setup(spark_version='2.3.2')
```

Original pysparkling documentation
==================================

**Pysparkling** provides a faster, more responsive way to develop programs for PySpark. It enables code intended for Spark applications to execute entirely in Python, without incurring the overhead of initializing and passing data through the JVM and Hadoop. The focus is on having a lightweight and fast implementation for small datasets at the expense of some data resilience features and some parallel processing features.

**How does it work?** To switch execution of a script from PySpark to pysparkling, have the code initialize a pysparkling Context instead of a SparkContext, and use the pysparkling Context to set up your RDDs. The beauty is you don't have to change a single line of code after the Context initialization, because pysparkling's API is (almost) exactly the same as PySpark's. Since it's so easy to switch between PySpark and pysparkling, you can choose the right tool for your use case.

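For example, a script that previously created a `SparkContext` can keep all of its RDD code untouched once the context is swapped; a minimal, hypothetical sketch (not part of the original docs):

```python
# Previously: from pyspark import SparkContext; sc = SparkContext('local[1]', 'demo')
from pysparkling import Context

sc = Context()  # drop-in stand-in for the SparkContext above
rdd = sc.parallelize(range(10))
# Everything below is unchanged PySpark-style RDD code.
print(rdd.map(lambda x: x * x).sum())  # 285
```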
**When would I use it?** Say you are writing a Spark application because you need robust computation on huge datasets, but you also want the same application to provide fast answers on a small dataset. You're finding Spark is not responsive enough for your needs, but you don't want to rewrite an entire separate application for the *small-answers-fast* problem. You'd rather reuse your Spark code but somehow get it to run fast. Pysparkling bypasses the stuff that causes Spark's long startup times and less responsive feel.

Here are a few areas where pysparkling excels:

- Small to medium-scale exploratory data analysis
- Application prototyping
- Low-latency web deployments
- Unit tests

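Because a `Context` starts without any JVM overhead, pysparkling also fits nicely into unit tests; a minimal, hypothetical `pytest`-style sketch (file and function names are made up for illustration):

```python
# test_word_lengths.py -- hypothetical example test built on pysparkling.
from pysparkling import Context


def average_word_length(words):
    return Context().parallelize(words).map(len).mean()


def test_average_word_length():
    assert average_word_length(['spark', 'python']) == 5.5
```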
Install
=======
``` {.sourceCode .bash}
pip install pysparkling[s3,hdfs,streaming]
```
[Documentation](https://pysparkling.trivial.io):

[![image](https://raw.githubusercontent.com/svenkreiss/pysparkling/master/docs/readthedocs.png)](https://pysparkling.trivial.io)

Other links: [Github](https://github.com/svenkreiss/pysparkling), [![pypi-badge](https://badge.fury.io/py/pysparkling.svg)](https://pypi.python.org/pypi/pysparkling/), [![test-badge](https://github.com/svenkreiss/pysparkling/workflows/Tests/badge.svg)](https://github.com/svenkreiss/pysparkling/actions?query=workflow%3ATests), [![Documentation Status](https://readthedocs.org/projects/pysparkling/badge/?version=latest)](https://pysparkling.readthedocs.io/en/latest/?badge=latest)

Features
========
- Supports URI schemes `s3://`, `hdfs://`, `gs://`, `http://` and `file://` for Amazon S3, HDFS, Google Storage, web and local file access. Specify multiple files separated by commas. Resolves `*` and `?` wildcards.
- Handles `.gz`, `.zip`, `.lzma`, `.xz`, `.bz2`, `.tar`, `.tar.gz` and `.tar.bz2` compressed files. Supports reading of `.7z` files.
- Parallelization via `multiprocessing.Pool`, `concurrent.futures.ThreadPoolExecutor` or any other pool-like object that has a `map(func, iterable)` method (see the sketch after this list).
- Plain pysparkling does not have any dependencies (use `pip install pysparkling`). Some file access methods have optional dependencies: `boto` for AWS S3, `requests` for http, `hdfs` for hdfs.

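For the parallelization hook, any such pool can be plugged in; a minimal sketch (hypothetical, and assuming the pool is handed to `Context` through its `pool` parameter):

```python
from concurrent.futures import ThreadPoolExecutor

from pysparkling import Context

# Assumption: Context accepts a pool-like object via its `pool` parameter
# and uses the pool's map(func, iterable) to process partitions.
with ThreadPoolExecutor(max_workers=4) as pool:
    sc = Context(pool=pool)
    squares = sc.parallelize(range(100), 4).map(lambda x: x * x).collect()

print(sum(squares))  # 328350
```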
Examples
========
Some demos are in the notebooks [docs/demo.ipynb](https://github.com/svenkreiss/pysparkling/blob/master/docs/demo.ipynb) and [docs/iris.ipynb](https://github.com/svenkreiss/pysparkling/blob/master/docs/iris.ipynb).

**Word Count**

``` {.sourceCode .python}
from pysparkling import Context

counts = (
    Context()
    .textFile('README.rst')
    # replace every non-alphanumeric character with a space, then split into words
    .map(lambda line: ''.join(ch if ch.isalnum() else ' ' for ch in line))
    .flatMap(lambda line: line.split(' '))
    # count the occurrences of each word
    .map(lambda word: (word, 1))
    .reduceByKey(lambda a, b: a + b)
)
print(counts.collect())
```

which prints a long list of pairs of words and their counts.

README.rst

Lines changed: 0 additions & 127 deletions
This file was deleted.

logo/banner-w1500.png (138 KB)

logo/banner-w500.png (13.8 KB)

0 commit comments
