
Commit b805cc2: Merge branch 'development' of github.com:v3io/tutorials into 3.x
2 parents: 777c611 + 91c8f87

13 files changed: +145 −95 lines

README.md

Lines changed: 2 additions & 2 deletions

@@ -278,8 +278,8 @@ You might also find the following resources useful:
 
 - [Introduction video](https://www.youtube.com/watch?v=8OmAN4wd7To)
 - [In-depth platform overview](platform-overview.ipynb) with a breakdown of the steps for developing a full data science workflow from development to production
-- [Platform components, services, and development ecosystem introduction](https://www.iguazio.com/docs/v3.0/intro/ecosystem/)
-- [Platform references](https://iguazio.com/docs/latest-release/reference/)
+- [Platform Services](https://www.iguazio.com/docs/v3.0/services/)
+- [Platform data layer](https://www.iguazio.com/docs/v3.0/data-layer/), including [references](https://www.iguazio.com/docs/v3.0/data-layer/reference/)
 - [nuclio-jupyter SDK](https://github.com/nuclio/nuclio-jupyter/blob/master/README.md) for creating and deploying Nuclio functions with Python and Jupyter Notebook
 
 <a id="misc"></a>

data-ingestion-and-preparation/README.ipynb

Lines changed: 5 additions & 5 deletions

@@ -137,7 +137,7 @@
 "- In Hadoop FS or Spark DataFrame commands you use a fully qualified path of the format `v3io://<container name>/<data path>`.\n",
 " You can also use environment variables with these interfaces.\n",
 "\n",
-"For detailed information and examples on how to set the data path for each interface, see [Setting Data Paths](https://www.iguazio.com/docs/v3.0/tutorials/getting-started/fundamentals/#data-paths) and the examples in the platform's tutorial Jupyter notebooks."
+"For detailed information and examples on how to set the data path for each interface, see [API Data Paths](https://www.iguazio.com/docs/v3.0/data-layer/apis/data-paths/) and the examples in the platform's tutorial Jupyter notebooks."
 ]
 },
 {
@@ -293,7 +293,7 @@
 "\n",
 "The [**spark-sql-analytics**](spark-sql-analytics.ipynb) tutorial demonstrates how to use Spark SQL and DataFrames to access objects, tables, and unstructured data that persists in the platform's data store.\n",
 "\n",
-"For more information and examples of data ingestion with Spark DataFrames, see [Getting Started with Data Ingestion Using Spark](https://www.iguazio.com/docs/v3.0/tutorials/getting-started/data-ingestn-w-spark-qs/).<br>\n",
+"For more information and examples of data ingestion with Spark DataFrames, see [Getting Started with Data Ingestion Using Spark](https://www.iguazio.com/docs/v3.0/data-layer/spark-data-ingestion-qs/).<br>\n",
 "For more about running SQL queries with Spark, see [Running Spark SQL Queries](#data-ingest-sql-spark) under \"Running SQL Queries on Platform Data\"."
 ]
 },
@@ -339,7 +339,7 @@
 "\n",
 "Nuclio serverless functions can sustain high workloads with very low latencies, thus making them very useful for building event-driven applications with strict latency requirements.\n",
 "\n",
-"For more information about Nuclio, see the platform's [serverless introduction](https://www.iguazio.com/docs/intro/latest-release/serverless/)."
+"For more information about Nuclio, see [the platform's Nuclio service overview](https://www.iguazio.com/docs/v3.0/services/app-services/nuclio/)."
 ]
 },
 {
@@ -358,7 +358,7 @@
 "The platform features a custom streaming engine and a related stream format &mdash; a platform stream (a.k.a. V3IO stream).\n",
 "You can use the platform's streaming engine to write data into a queue in a real-time data pipeline, or as a standard streaming engine (similar to Kafka and Kinesis), so you don't need to use an external engine.\n",
 "\n",
-"The platform's streaming engine is currently available via the platform's [Streaming Web API](https://www.iguazio.com/docs/v3.0/reference/api-reference/web-apis/streaming-web-api/).<br>\n",
+"The platform's streaming engine is currently available via the platform's [Streaming Web API](https://www.iguazio.com/docs/v3.0/data-layer/reference/web-apis/streaming-web-api/).<br>\n",
 "In addition, the platform's Spark-Streaming Integration API enables using the Spark Streaming API to work with platform streams, as explained in the next section ([Using Spark Streaming](#data-ingest-streams-spark)).\n",
 "\n",
 "The [**v3io-streams**](v3io-streams.ipynb) tutorial demonstrates basic usage of the streaming API.\n",
@@ -382,7 +382,7 @@
 "### Using Spark Streaming\n",
 "\n",
 "You can use the [Spark Streaming](http://spark.apache.org/streaming/) API to ingest, consume, and analyze data using data streams.\n",
-"The platform features a custom [Spark-Streaming Integration API](https://www.iguazio.com/docs/v3.0/reference/api-reference/spark-apis/spark-streaming-integration-api/) to allow using the Spark Streaming API with [platform streams](#data-ingest-streams-platform).\n",
+"The platform features a custom [Spark-Streaming Integration API](https://www.iguazio.com/docs/v3.0/data-layer/reference/spark-apis/spark-streaming-integration-api/) to allow using the Spark Streaming API with [platform streams](#data-ingest-streams-platform).\n",
 "\n",
 "<!-- TODO: Add more information / add a tutorial and refer to it. -->"
 ]
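The fully qualified path convention noted in the first hunk above (`v3io://<container name>/<data path>`) can be sketched as a small helper. This is a minimal illustration, not a platform API; the container and data path below are hypothetical example values.

```python
# Minimal sketch of the fully qualified v3io path format used by
# Hadoop FS and Spark DataFrame commands: v3io://<container name>/<data path>.
# The "users" container and the example data path are hypothetical.
def v3io_path(container: str, data_path: str) -> str:
    """Build a fully qualified v3io path from a container name and a data path."""
    return f"v3io://{container}/{data_path.lstrip('/')}"

path = v3io_path("users", "iguazio/examples/stocks")
print(path)  # v3io://users/iguazio/examples/stocks
```

A Spark command would then use this string directly, for example as the argument to a DataFrame `load`/`save` call.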

data-ingestion-and-preparation/README.md

Lines changed: 5 additions & 5 deletions

@@ -86,7 +86,7 @@ For example:
 - In Hadoop FS or Spark DataFrame commands you use a fully qualified path of the format `v3io://<container name>/<data path>`.
   You can also use environment variables with these interfaces.
 
-For detailed information and examples on how to set the data path for each interface, see [Setting Data Paths](https://www.iguazio.com/docs/v3.0/tutorials/getting-started/fundamentals/#data-paths) and the examples in the platform's tutorial Jupyter notebooks.
+For detailed information and examples on how to set the data path for each interface, see [API Data Paths](https://www.iguazio.com/docs/v3.0/data-layer/apis/data-paths/) and the examples in the platform's tutorial Jupyter notebooks.
 
 <a id="data-ingest-platform-simple-object-api"></a>
 
@@ -152,7 +152,7 @@ This allows accelerated and high-speed access from Spark to data stored in the p
 
 The [**spark-sql-analytics**](spark-sql-analytics.ipynb) tutorial demonstrates how to use Spark SQL and DataFrames to access objects, tables, and unstructured data that persists in the platform's data store.
 
-For more information and examples of data ingestion with Spark DataFrames, see [Getting Started with Data Ingestion Using Spark](https://www.iguazio.com/docs/v3.0/tutorials/getting-started/data-ingestn-w-spark-qs/).<br>
+For more information and examples of data ingestion with Spark DataFrames, see [Getting Started with Data Ingestion Using Spark](https://www.iguazio.com/docs/v3.0/data-layer/spark-data-ingestion-qs/).<br>
 For more about running SQL queries with Spark, see [Running Spark SQL Queries](#data-ingest-sql-spark) under "Running SQL Queries on Platform Data".
 
 <a id="data-ingest-streams"></a>
@@ -178,7 +178,7 @@ You can also implement your own logic within the Nuclio function to manipulate o
 
 Nuclio serverless functions can sustain high workloads with very low latencies, thus making them very useful for building event-driven applications with strict latency requirements.
 
-For more information about Nuclio, see the platform's [serverless introduction](https://www.iguazio.com/docs/intro/latest-release/serverless/).
+For more information about Nuclio, see [the platform's Nuclio service overview](https://www.iguazio.com/docs/v3.0/services/app-services/nuclio/).
 
 <a id="data-ingest-streams-platform"></a>
 
@@ -187,7 +187,7 @@ For more information about Nuclio, see the platform's [serverless introduction](
 The platform features a custom streaming engine and a related stream format &mdash; a platform stream (a.k.a. V3IO stream).
 You can use the platform's streaming engine to write data into a queue in a real-time data pipeline, or as a standard streaming engine (similar to Kafka and Kinesis), so you don't need to use an external engine.
 
-The platform's streaming engine is currently available via the platform's [Streaming Web API](https://www.iguazio.com/docs/v3.0/reference/api-reference/web-apis/streaming-web-api/).<br>
+The platform's streaming engine is currently available via the platform's [Streaming Web API](https://www.iguazio.com/docs/v3.0/data-layer/reference/web-apis/streaming-web-api/).<br>
 In addition, the platform's Spark-Streaming Integration API enables using the Spark Streaming API to work with platform streams, as explained in the next section ([Using Spark Streaming](#data-ingest-streams-spark)).
 
 The [**v3io-streams**](v3io-streams.ipynb) tutorial demonstrates basic usage of the streaming API.
@@ -201,7 +201,7 @@ The [**model deployment with streaming**](https://github.com/mlrun/demo-model-de
 ### Using Spark Streaming
 
 You can use the [Spark Streaming](http://spark.apache.org/streaming/) API to ingest, consume, and analyze data using data streams.
-The platform features a custom [Spark-Streaming Integration API](https://www.iguazio.com/docs/v3.0/reference/api-reference/spark-apis/spark-streaming-integration-api/) to allow using the Spark Streaming API with [platform streams](#data-ingest-streams-platform).
+The platform features a custom [Spark-Streaming Integration API](https://www.iguazio.com/docs/v3.0/data-layer/reference/spark-apis/spark-streaming-integration-api/) to allow using the Spark Streaming API with [platform streams](#data-ingest-streams-platform).
 
 <!-- TODO: Add more information / add a tutorial and refer to it. -->
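The Streaming Web API referenced in the hunks above is a plain HTTP interface, so a producer mostly assembles a JSON request body. The sketch below shows one plausible way to build a `PutRecords`-style body; the exact field names and request shape are assumptions here, so verify them against the Streaming Web API reference before relying on this.

```python
# Hedged sketch: assembling a PutRecords-style JSON body for the platform's
# Streaming Web API. The {"Records": [{"Data": <base64>}]} shape is an
# assumption for illustration; check the Streaming Web API reference.
import base64
import json

def put_records_body(records):
    """Base64-encode string records into an assumed PutRecords JSON body."""
    return json.dumps(
        {
            "Records": [
                {"Data": base64.b64encode(r.encode("utf-8")).decode("ascii")}
                for r in records
            ]
        }
    )

body = put_records_body(["first event", "second event"])
```

On the platform this body would be POSTed to the stream's URL with the appropriate function header and authentication, per the reference documentation.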

data-ingestion-and-preparation/basic-data-ingestion-and-preparation.ipynb

Lines changed: 1 addition & 1 deletion

@@ -1002,7 +1002,7 @@
 "### Delete Data\n",
 "\n",
 "Optionally delete any of the directories or files that you created.\n",
-"See the instructions in the [Creating and Deleting Container Directories](https://www.iguazio.com/docs/v3.0/tutorials/getting-started/containers/#create-delete-container-dirs) tutorial.\n",
+"See the instructions in the [Creating and Deleting Container Directories](https://www.iguazio.com/docs/v3.0/data-layer/containers/working-with-containers/#create-delete-container-dirs) tutorial.\n",
 "The following example uses a local file-system command to delete the entire contents of the **users/&lt;running user&gt;/examples/stocks** directory that was created in this example, but not the directory itself."
 ]
 },
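The cleanup step this hunk documents, deleting a directory's contents but keeping the directory itself, can be exercised safely in a scratch location. The `/tmp` path below is a stand-in; on the platform the notebook targets the **users/&lt;running user&gt;/examples/stocks** directory instead (via whatever local mount the Jupyter service provides).

```shell
# Safe demo of the cleanup pattern from the notebook: remove a directory's
# contents while keeping the directory. /tmp is a scratch stand-in; on the
# platform, substitute the users/<running user>/examples/stocks path.
mkdir -p /tmp/examples/stocks
touch /tmp/examples/stocks/prices.csv /tmp/examples/stocks/quotes.csv
rm -rf /tmp/examples/stocks/*   # contents deleted, directory kept
ls -A /tmp/examples/stocks      # prints nothing: the directory is now empty
```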

data-ingestion-and-preparation/file-access.ipynb

Lines changed: 1 addition & 1 deletion

@@ -591,7 +591,7 @@
 "### Delete Data\n",
 "\n",
 "You can optionally delete any of the directories or files that you created.\n",
-"See the instructions in the [Creating and Deleting Container Directories](https://www.iguazio.com/docs/latest-release/tutorials/getting-started/containers/#create-delete-container-dirs) tutorial.\n",
+"See the instructions in the [Creating and Deleting Container Directories](https://www.iguazio.com/docs/v3.0/data-layer/containers/working-with-containers/#create-delete-container-dirs) tutorial.\n",
 "For example, the following code uses a local file-system command to delete a **&lt;running user&gt;/examples/stocks** directory in the \"users\" container.\n",
 "Edit the path, as needed, then remove the comment mark (`#`) and run the code."
 ]

data-ingestion-and-preparation/frames.ipynb

Lines changed: 6 additions & 6 deletions

@@ -34,7 +34,7 @@
 "- `execute` &mdash; executes a command on a table.\n",
 " Each backend may support multiple commands.\n",
 "\n",
-"For a detailed description of the Frames API, see the [Frames API reference](https://www.iguazio.com/docs/v3.0/reference/api-reference/frames/).<br>\n",
+"For a detailed description of the Frames API, see the [Frames API reference](https://www.iguazio.com/docs/v3.0/data-layer/reference/frames/).<br>\n",
 "For more help and usage details, use the internal API help &mdash; `<client object>.<command>?` in Jupyter Notebook or `print(<client object>.<command>.__doc__)`.<br>\n",
 "For example, the following command returns information about the read operation for a client object named `client`:\n",
 "```\n",
@@ -162,7 +162,7 @@
 "You can run SQL queries on your NoSQL table (using Presto) to offload data filtering, grouping, joins, etc. to a scale-out high-speed database engine.\n",
 "\n",
 "> **Note:** To query a table in a platform data container, the table path in the `from` section of the SQL query should be of the format `v3io.<container name>.\"/path/to/table\"`.\n",
-"> See [Presto Data Paths](https://www.iguazio.com/docs/v3.0/tutorials/getting-started/fundamentals/#data-paths-presto) in the platform documentation.\n",
+"> See [Presto Data Paths](https://www.iguazio.com/docs/v3.0/data-layer/apis/data-paths/#data-paths-presto) in the platform documentation.\n",
 "> In the following example, the path is set by using the `sql_table_path` variable that was defined in the [kv backend initialization](#frames-kv-init) step.\n",
 "> Unless you changed the code, this variable translates to `v3io.users.\"<running user>/examples/bank\"`; for example, `v3io.users.\"iguazio/examples/bank\"` for user \"iguazio\"."
 ]
@@ -1095,14 +1095,14 @@
 "\n",
 "- `columns` defines the query metrics (default = all).\n",
 "- `aggregators` defines aggregation functions (\"aggregators\") to execute for all the configured metrics.\n",
-"- `filter` restricts the query by using a platform [filter expression](https://www.iguazio.com/docs/v3.0/reference/expressions/condition-expression/#filter-expression).\n",
+"- `filter` restricts the query by using a platform [filter expression](https://www.iguazio.com/docs/v3.0/data-layer/reference/expressions/condition-expression/#filter-expression).\n",
 "- `start` and `end` define the query's time range &mdash; the metric-sample timestamps to which to apply the query.\n",
 " The default `end` time is `\"now\"` and the default `start` time is 1 hour before the end time (`<end> - 1h`).\n",
 "- `step` defines the interval for aggregation or raw-data downsampling (default = the query's time range).\n",
 "- `multi_index` can be set to `True` to return labels as index columns, as demonstrated in the following examples.\n",
 " By default, only the metric sample-time primary-key attribute is returned as an index column.\n",
 "\n",
-"See the [Frames API reference](https://www.iguazio.com/docs/v3.0/reference/api-reference/frames/tsdb/read/) for more information about the `read` parameters that are supported for the `tsdb` backend."
+"See the [Frames API reference](https://www.iguazio.com/docs/v3.0/data-layer/reference/frames/tsdb/read/) for more information about the `read` parameters that are supported for the `tsdb` backend."
 ]
 },
 {
@@ -1248,7 +1248,7 @@
 "#### Conditional Read\n",
 "\n",
 "The following example demonstrates how to use a query filter to conditionally read only a subset of the data from a TSDB table.\n",
-"This is done by setting the value of the `filter` parameter to a [platform filter expression](https://www.iguazio.com/docs/v3.0/reference/expressions/condition-expression/#filter-expression)."
+"This is done by setting the value of the `filter` parameter to a [platform filter expression](https://www.iguazio.com/docs/v3.0/data-layer/reference/expressions/condition-expression/#filter-expression)."
 ]
 },
 {
@@ -1455,7 +1455,7 @@
 "## Cleanup\n",
 "\n",
 "You can optionally delete any of the directories or files that you created.\n",
-"See the instructions in the [Creating and Deleting Container Directories](https://www.iguazio.com/docs/v3.0/tutorials/getting-started/containers/#create-delete-container-dirs) tutorial.\n",
+"See the instructions in the [Creating and Deleting Container Directories](https://www.iguazio.com/docs/v3.0/data-layer/containers/working-with-containers/#create-delete-container-dirs) tutorial.\n",
 "For example, the following code uses a local file-system command to delete the entire **&lt;running user&gt;/examples/** directory in the \"users\" container.\n",
 "Edit the path, as needed, then remove the comment mark (`#`) and run the code."
 ]
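The Presto table-path format quoted in the frames.ipynb hunk above, `v3io.<container name>."/path/to/table"`, is easy to get wrong, so here is a small sketch that builds it. The helper is purely illustrative (not a platform API); the container and table values are the notebook's own example.

```python
# Illustrative helper for the Presto table-path format described in the
# notebook: v3io.<container name>."/path/to/table". The values below mirror
# the notebook's own example (user "iguazio", table iguazio/examples/bank).
def presto_table_path(container: str, table_path: str) -> str:
    """Build a fully qualified Presto path for a table in a platform container."""
    return f'v3io.{container}."{table_path}"'

sql_table_path = presto_table_path("users", "iguazio/examples/bank")
query = f"SELECT * FROM {sql_table_path} LIMIT 10"
print(sql_table_path)  # v3io.users."iguazio/examples/bank"
```

The resulting string goes into the `from` clause of the SQL query, exactly as the notebook's note describes.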

data-ingestion-and-preparation/spark-jdbc.ipynb

Lines changed: 1 addition & 1 deletion

@@ -586,7 +586,7 @@
 "### Delete Data\n",
 "\n",
 "You can optionally delete any of the directories or files that you created.\n",
-"See the instructions in the [Creating and Deleting Container Directories](https://www.iguazio.com/docs/v3.0/tutorials/getting-started/containers/#create-delete-container-dirs) tutorial.\n",
+"See the instructions in the [Creating and Deleting Container Directories](https://www.iguazio.com/docs/v3.0/data-layer/containers/working-with-containers/#create-delete-container-dirs) tutorial.\n",
 "For example, the following code uses a local file-system command to delete a **&lt;running user&gt;/examples/spark-jdbc** directory in the \"users\" container.\n",
 "Edit the path, as needed, then remove the comment mark (`#`) and run the code."
 ]

data-ingestion-and-preparation/spark-sql-analytics.ipynb

Lines changed: 2 additions & 2 deletions

@@ -43,7 +43,7 @@
 "The platform's Spark drivers implement the data-source API and support predicate push down: the queries are passed to the platform's data store, which returns only the relevant data.\n",
 "This allows accelerated and high-speed access from Spark to data stored in the platform.\n",
 "\n",
-"For more details, read the [Spark SQL and DataFrames documentation](https://spark.apache.org/docs/2.3.1/sql-programming-guide.html) and the overview in the platform's [Spark APIs Reference](https://www.iguazio.com/docs/v3.0/reference/api-reference/spark-apis/overview/)."
+"For more details, read the [Spark SQL and DataFrames documentation](https://spark.apache.org/docs/2.3.1/sql-programming-guide.html) and the overview in the platform's [Spark APIs Reference](https://www.iguazio.com/docs/v3.0/data-layer/reference/spark-apis/overview/)."
 ]
 },
 {
@@ -1559,7 +1559,7 @@
 "- The path to the NoSQL table that is associated with the DataFrame should be defined as a fully qualified path of the format `v3io://<data container>/<table path>` &mdash; where `<data container>` is the name of the table's parent data container and `<table path>` is the relative path to the data within the specified container.\n",
 "- You must use the `key` option to define the table's primary key attribute (column). Note that the values of the primary-key attribute must be unique.<br>\n",
 " You can also optionally use the platform's custom `sorting-key` option to define a sorting-key attribute for the table (which enables performing range scans).<br>\n",
-" For more information, see the [platform documentation](https://www.iguazio.com/docs/v3.0/concepts/containers-collections-objects/#sharding-n-sorting-keys)."
+" For more information, see the [platform documentation](https://www.iguazio.com/docs/v3.0/data-layer/objects/object-names-and-keys/#sharding-n-sorting-keys)."
 ]
 },
 {
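The `key` and optional `sorting-key` write options described in the spark-sql-analytics hunk above can be sketched as plain data, with the actual Spark write shown as a comment. The data-source name and option spellings are assumptions based on Iguazio's Spark documentation, and the table path is hypothetical; verify both against the Spark APIs Reference before use.

```python
# Hedged sketch of a Spark NoSQL write using the "key" and optional
# "sorting-key" options described in the notebook. The option names and the
# data-source name in the comment are assumptions; the target path is
# hypothetical.
write_options = {
    "key": "id",            # primary-key attribute; its values must be unique
    "sorting-key": "date",  # optional: enables range scans on the table
}
target = "v3io://users/iguazio/examples/stocks_kv/"  # hypothetical table path

# On the platform (untested here; requires a SparkSession and a DataFrame df):
# df.write.format("io.iguaz.v3io.spark.sql.kv").options(**write_options).save(target)
```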
