**CHANGELOG.md** (+14 −13)
@@ -3,8 +3,9 @@
## Enhancements

- References librdkafka.redist 2.4.0. Refer to the [librdkafka v2.4.0 release notes](https://github.com/confluentinc/librdkafka/releases/tag/v2.4.0) for more information.
- [KIP-848 EA](https://cwiki.apache.org/confluence/display/KAFKA/KIP-848%3A+The+Next+Generation+of+the+Consumer+Rebalance+Protocol): Added the new KIP-848 based consumer group rebalance protocol.
  Integration tests run with the new consumer group protocol. The feature is an **Early Access**: not production ready. Please refer to the
  [detailed doc](https://github.com/confluentinc/librdkafka/blob/master/INTRODUCTION.md#next-generation-of-the-consumer-group-protocol-kip-848) for more information (#2212). See the configuration sketch after this list.
- Return authorized operations in describe responses (#2021, @jainruchir).
- [KIP-396](https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=97551484): Added support for the ListOffsets Admin API (#2086).
- Add `Rack` to the `Node` type, so Admin API calls can expose racks for brokers (currently, all Describe Responses) (#2021, @jainruchir). See the Admin API sketch after this list.
- Added support for external JSON schemas in `JsonSerializer` and `JsonDeserializer` (#2042).
- Added compatibility methods to `CachedSchemaRegistryClient` ([ISBronny](https://github.com/ISBronny), #2097).
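A minimal sketch of opting a consumer into the KIP-848 Early Access protocol. The `group.protocol` key is a librdkafka 2.4.0 configuration passed through via the generic `Set` method; the broker address, group id and topic below are placeholders, and the feature is not production ready.

```csharp
using System;
using Confluent.Kafka;

class Kip848Example
{
    static void Main()
    {
        // Placeholder broker address and group id.
        var config = new ConsumerConfig
        {
            BootstrapServers = "localhost:9092",
            GroupId = "kip848-demo-group",
            AutoOffsetReset = AutoOffsetReset.Earliest,
        };

        // Early Access opt-in: this key is passed through to librdkafka 2.4.0,
        // which implements the KIP-848 rebalance protocol. Not production ready.
        config.Set("group.protocol", "consumer");

        using var consumer = new ConsumerBuilder<Ignore, string>(config).Build();
        consumer.Subscribe("demo-topic");

        var result = consumer.Consume(TimeSpan.FromSeconds(5));
        Console.WriteLine(result?.Message?.Value ?? "<no message>");
        consumer.Close();
    }
}
```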
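Similarly, a hedged sketch of reading broker racks from a Describe response. It assumes `DescribeClusterAsync` and the `Nodes`/`Rack` property names as they appear in the current Admin API; the bootstrap address is a placeholder and `Rack` may be null when brokers have no rack configured.

```csharp
using System;
using System.Threading.Tasks;
using Confluent.Kafka;
using Confluent.Kafka.Admin;

class RackExample
{
    static async Task Main()
    {
        // Placeholder broker address.
        var config = new AdminClientConfig { BootstrapServers = "localhost:9092" };
        using var admin = new AdminClientBuilder(config).Build();

        // DescribeCluster returns the broker nodes; Rack is null if the broker
        // has no broker.rack configured.
        var cluster = await admin.DescribeClusterAsync(new DescribeClusterOptions());
        foreach (var node in cluster.Nodes)
            Console.WriteLine($"Broker {node.Id} at {node.Host}:{node.Port}, rack: {node.Rack ?? "<none>"}");
    }
}
```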
@@ -98,15 +99,15 @@ OpenSSL 3.0.x upgrade in librdkafka requires a major version bump, as some legac
**Note: There were no 2.0.0 and 2.0.1 releases.**

# 1.9.3

## Enhancements

- Added `NormalizeSchemas` configuration property to the Avro, Json and Protobuf serdes (see the sketch following this section).

## Fixes

- Schema Registry authentication now works with passwords that contain the ':' character ([luismedel](https://github.com/luismedel)).
- Added missing librdkafka internal and broker error codes to the `ErrorCode` enum.
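A hedged sketch of enabling schema normalization on the Avro serde via the `NormalizeSchemas` property named in the entry above; the Schema Registry URL is a placeholder, and the same flag exists on the Json and Protobuf serializer configs.

```csharp
using Avro.Generic;
using Confluent.SchemaRegistry;
using Confluent.SchemaRegistry.Serdes;

class NormalizeSchemasExample
{
    static void Configure()
    {
        // Placeholder Schema Registry endpoint.
        var srConfig = new SchemaRegistryConfig { Url = "http://localhost:8081" };
        using var srClient = new CachedSchemaRegistryClient(srConfig);

        // NormalizeSchemas asks Schema Registry to normalize the schema on
        // registration/lookup, so semantically identical schemas map to one id.
        var avroConfig = new AvroSerializerConfig { NormalizeSchemas = true };
        var serializer = new AvroSerializer<GenericRecord>(srClient, avroConfig);
    }
}
```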
@@ -169,7 +170,7 @@ for a complete list of changes, enhancements, fixes and upgrade considerations.
# 1.8.1

## Enhancements

- Updated `NJsonSchema` to v10.5.2.
@@ -318,7 +319,7 @@ Version 1.6.0 and 1.6.1 were not released.
## Changes

- Some internal improvements to the `Consumer` (thanks to [@andypook](https://github.com/AndyPook)).
- BREAKING CHANGE: `net452` is no longer a target framework of `Confluent.SchemaRegistry` or `Confluent.SchemaRegistry.Serdes` due to the switch to the official Apache Avro package, which only targets `netstandard2.0`.
- Marked properties on `ConsumeResult` that simply delegate to the corresponding properties on `ConsumeResult.Message` as obsolete (see the sketch below).
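A short illustrative sketch of the preferred access pattern after this deprecation, reading key, value and timestamp through `ConsumeResult.Message` rather than the obsolete delegating properties:

```csharp
using System;
using Confluent.Kafka;

static class ConsumeResultUsage
{
    // Prefer the nested Message over the obsolete delegating properties,
    // e.g. result.Message.Value instead of result.Value.
    public static void Print(ConsumeResult<string, string> result)
    {
        Console.WriteLine(
            $"key={result.Message.Key} value={result.Message.Value} ts={result.Message.Timestamp.UtcDateTime:O}");
    }
}
```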
## Fixes
@@ -360,7 +361,7 @@ Version 1.6.0 and 1.6.1 were not released.
## Bugs

**WARNING: There is an issue with SASL GSSAPI authentication on Windows with this release. This is resolved in v1.2.1.**

## Enhancements

- References librdkafka v1.2.0. Refer to the [release notes](https://github.com/edenhill/librdkafka/releases/tag/v1.2.0) for more information. The headline feature is consumer-side support for transactions.
@@ -424,7 +425,7 @@ Feature highlights:
- Non-blocking support for async serializers.
- Very flexible:
  - e.g. can be easily extended to support header serialization.
- Capability to specify custom timestamps when producing messages.
- Message persistence status support.
- Renamed the `ProduceAsync` variants that take a callback to `Produce`.
- Consumer improvements:
@@ -541,7 +542,7 @@ Feature highlights:
- Revamped producer and consumer serialization functionality.
  - There are now two types of serializer and deserializer: `ISerializer<T>` / `IAsyncSerializer<T>` and `IDeserializer<T>` / `IAsyncDeserializer<T>`.
  - `ISerializer<T>` / `IDeserializer<T>` are appropriate for most use cases.
  - `IAsyncSerializer<T>` / `IAsyncDeserializer<T>` are async friendly, but less performant (they return `Task`s). See the serializer sketch after this list.
- Changed the name of `Confluent.Kafka.Avro` to `Confluent.SchemaRegistry.Serdes` (Schema Registry may support other serialization formats in the future).
- Added an example demonstrating working with protobuf serialized data.
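As an illustration of the synchronous interface, a minimal custom `ISerializer<T>` might look like the following sketch; the UTF-8 codec and the builder wiring are illustrative, not part of the release notes.

```csharp
using System.Text;
using Confluent.Kafka;

// Minimal synchronous serializer: encodes a string value as UTF-8 bytes.
public class Utf8ValueSerializer : ISerializer<string>
{
    public byte[] Serialize(string data, SerializationContext context)
        => data == null ? null : Encoding.UTF8.GetBytes(data);
}

public static class SerializerWiring
{
    // Attach the custom serializer when building the producer.
    public static IProducer<Null, string> Build(ProducerConfig config)
        => new ProducerBuilder<Null, string>(config)
            .SetValueSerializer(new Utf8ValueSerializer())
            .Build();
}
```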
- Note: End of partition notification is now disabled by default (enable it using the `EnablePartitionEof` config property); see the consume-loop sketch after this list.
- Removed the `Consumer.OnPartitionEOF` event in favor of notifying of partition EOF via `ConsumeResult.IsPartitionEOF`.
- Removed the `ErrorEvent` class and added `IsFatal` to the `Error` class.
- The `IsFatal` flag is now set appropriately for all errors (previously it was always set to `false`).
- Added a `PersistenceStatus` property to `DeliveryResult`, which provides information on the persistence status of the message.
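A hedged sketch of a consume loop that re-enables partition EOF notifications and checks `ConsumeResult.IsPartitionEOF`; the broker address, group id, topic and loop bound are placeholders.

```csharp
using System;
using Confluent.Kafka;

class PartitionEofExample
{
    static void Main()
    {
        var config = new ConsumerConfig
        {
            BootstrapServers = "localhost:9092",   // placeholder
            GroupId = "eof-demo-group",            // placeholder
            EnablePartitionEof = true,             // off by default since 1.0
        };

        using var consumer = new ConsumerBuilder<Ignore, string>(config).Build();
        consumer.Subscribe("demo-topic");

        for (var i = 0; i < 100; i++)
        {
            var result = consumer.Consume(TimeSpan.FromSeconds(1));
            if (result == null) continue;

            if (result.IsPartitionEOF)
            {
                // Reached the end of a partition; no event is raised anymore.
                Console.WriteLine($"EOF on {result.TopicPartitionOffset}");
                continue;
            }
            Console.WriteLine(result.Message.Value);
        }
        consumer.Close();
    }
}
```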
@@ -595,7 +596,7 @@ Feature highlights:
- Producers can utilize the underlying librdkafka handle from other Producers (replaces the 0.11.x `GetSerializingProducer` method on the `Producer` class).
- `AdminClient` can utilize the underlying librdkafka handle from other `AdminClient`s, `Producer`s or `Consumer`s.
- `IDeserializer` now exposes message data via `ReadOnlySpan<byte>`, directly referencing librdkafka-allocated memory. This results in a considerable (up to 2x) performance increase and reduced memory use. See the deserializer sketch after this list.
- Most blocking operations now accept a `CancellationToken` parameter.
  - TODO: in some cases there is no backing implementation yet.
- .NET specific configuration parameters are all specified/documented in the `ConfigPropertyNames` class.
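A minimal sketch of the span-based deserializer interface, decoding a UTF-8 string value directly from the span without an intermediate `byte[]` copy; the `ConsumerBuilder` wiring is illustrative.

```csharp
using System;
using System.Text;
using Confluent.Kafka;

// Span-based deserializer: reads directly from librdkafka-owned memory.
public class Utf8ValueDeserializer : IDeserializer<string>
{
    public string Deserialize(ReadOnlySpan<byte> data, bool isNull, SerializationContext context)
        => isNull ? null : Encoding.UTF8.GetString(data);
}

public static class DeserializerWiring
{
    // Attach the custom deserializer when building the consumer.
    public static IConsumer<Ignore, string> Build(ConsumerConfig config)
        => new ConsumerBuilder<Ignore, string>(config)
            .SetValueDeserializer(new Utf8ValueDeserializer())
            .Build();
}
```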
@@ -621,7 +622,7 @@ Feature highlights:
- `Commit` errors are reported via an exception and method return values have correspondingly changed.
- `ListGroups`, `ListGroup`, `GetWatermarkOffsets`, `QueryWatermarkOffsets`, and `GetMetadata` have been removed from `Producer` and `Consumer` and are exposed only via `AdminClient`.
- Added `Consumer.Close`.
- Various methods that formerly returned `TopicPartitionOffsetError` / `TopicPartitionError` now return `TopicPartitionOffset` / `TopicPartition` and throw an exception in case of error (with a `Result` property of type `TopicPartitionOffsetError` / `TopicPartitionError`). See the error-handling sketch below.
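A hedged sketch of the exception-based error handling this implies. `KafkaException` and its `Error` property (including `IsFatal`) are part of the public API; the commit flow around them is illustrative.

```csharp
using System;
using Confluent.Kafka;

static class CommitErrorHandling
{
    public static void CommitSafely(IConsumer<Ignore, string> consumer)
    {
        try
        {
            // Commit now throws on failure instead of returning an error value.
            consumer.Commit();
        }
        catch (KafkaException e)
        {
            // Error carries the reason, the error code and whether it is fatal.
            Console.WriteLine($"Commit failed: {e.Error.Reason} (code={e.Error.Code}, fatal={e.Error.IsFatal})");
        }
    }
}
```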
**README.md** (+2 −2)
@@ -43,13 +43,13 @@ confluent-kafka-dotnet is distributed via NuGet. We provide five packages:
To install Confluent.Kafka from within Visual Studio, search for Confluent.Kafka in the NuGet Package Manager UI, or run the following command in the Package Manager Console:

```
Install-Package Confluent.Kafka -Version 2.4.0
```
To add a reference to a dotnet core project, execute the following at the command line:
```
dotnet add package -v 2.4.0 Confluent.Kafka
```
Note: `Confluent.Kafka` depends on the `librdkafka.redist` package which provides a number of different builds of `librdkafka` that are compatible with [common platforms](https://github.com/edenhill/librdkafka/wiki/librdkafka.redist-NuGet-package-runtime-libraries). If you are on one of these platforms this will all work seamlessly (and you don't need to explicitly reference `librdkafka.redist`). If you are on a different platform, you may need to [build librdkafka](https://github.com/edenhill/librdkafka#building) manually (or acquire it via other means) and load it using the [Library.Load](https://docs.confluent.io/current/clients/confluent-kafka-dotnet/api/Confluent.Kafka.Library.html#Confluent_Kafka_Library_Load_System_String_) method.
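For the non-redist case, a hedged sketch of loading a manually built librdkafka via `Library.Load` before any client is created; the library path and broker address are placeholders and must point at your own build and cluster.

```csharp
using Confluent.Kafka;

class ManualLibraryLoad
{
    static void Main()
    {
        // Placeholder path: point this at the librdkafka you built or installed yourself.
        // Must be called before any producer, consumer or admin client is constructed.
        Library.Load("/usr/local/lib/librdkafka.so");

        var config = new ProducerConfig { BootstrapServers = "localhost:9092" }; // placeholder
        using var producer = new ProducerBuilder<Null, string>(config).Build();
    }
}
```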