diff --git a/docs/content/about.md b/docs/content/about.md
index 58c19ab9d..20affe1a6 100644
--- a/docs/content/about.md
+++ b/docs/content/about.md
@@ -31,7 +31,7 @@ The following table shows the version mapping between Flink® CDC Con
| 2.0.* | 1.13.* |
| 2.1.* | 1.13.* |
| 2.2.* | 1.13.\*, 1.14.\* |
-| 2.3.* | 1.13.\*, 1.14.\*, 1.15.\* |
+| 2.3.* | 1.13.\*, 1.14.\*, 1.15.\*, 1.16.0 |
## Features
diff --git a/docs/content/connectors/mongodb-cdc.md b/docs/content/connectors/mongodb-cdc.md
index bb1e1e5cc..ac7a742e0 100644
--- a/docs/content/connectors/mongodb-cdc.md
+++ b/docs/content/connectors/mongodb-cdc.md
@@ -405,7 +405,7 @@ Data Type Mapping
----------------
[BSON](https://docs.mongodb.com/manual/reference/bson-types/) short for **Binary JSON** is a binary-encoded serialization of JSON-like format used to store documents and make remote procedure calls in MongoDB.
-[Flink SQL Data Type](https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/dev/table/types/) is similar to the SQL standard’s data type terminology which describes the logical type of a value in the table ecosystem. It can be used to declare input and/or output types of operations.
+[Flink SQL Data Type](https://nightlies.apache.org/flink/flink-docs-release-1.16/docs/dev/table/types/) is similar to the SQL standard’s data type terminology which describes the logical type of a value in the table ecosystem. It can be used to declare input and/or output types of operations.
In order to enable Flink SQL to process data from heterogeneous data sources, the data types of heterogeneous data sources need to be uniformly converted to Flink SQL data types.
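For example, a MongoDB collection can be exposed to Flink SQL as a table whose column types are the Flink SQL targets of the underlying BSON fields. The following is a minimal sketch only — the host, credentials, database, collection and field names are placeholders, and each comment names the BSON type being mapped:

```sql
-- Minimal sketch of a MongoDB CDC source table (placeholder names and credentials).
CREATE TABLE mongodb_orders (
  _id        STRING,               -- BSON ObjectId, mapped to STRING
  order_id   INT,                  -- BSON 32-bit integer
  price      DECIMAL(10, 2),       -- BSON Decimal128
  paid       BOOLEAN,              -- BSON Boolean
  order_time TIMESTAMP_LTZ(3),     -- BSON Date
  PRIMARY KEY (_id) NOT ENFORCED
) WITH (
  'connector' = 'mongodb-cdc',
  'hosts' = 'localhost:27017',
  'username' = 'flinkuser',
  'password' = 'flinkpw',
  'database' = 'mydb',
  'collection' = 'orders'
);
```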
@@ -525,7 +525,7 @@ Reference
- [Replica set protocol](https://docs.mongodb.com/manual/reference/replica-configuration/#mongodb-rsconf-rsconf.protocolVersion)
- [Connection String Options](https://docs.mongodb.com/manual/reference/connection-string/#std-label-connections-connection-options)
- [BSON Types](https://docs.mongodb.com/manual/reference/bson-types/)
-- [Flink DataTypes](https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/dev/table/types/)
+- [Flink DataTypes](https://nightlies.apache.org/flink/flink-docs-release-1.16/docs/dev/table/types/)
FAQ
--------
diff --git a/docs/content/connectors/mysql-cdc(ZH).md b/docs/content/connectors/mysql-cdc(ZH).md
index c4d9e5b1e..3c0ad47dc 100644
--- a/docs/content/connectors/mysql-cdc(ZH).md
+++ b/docs/content/connectors/mysql-cdc(ZH).md
@@ -686,7 +686,7 @@ $ ./bin/flink run \
--fromSavepoint /tmp/flink-savepoints/savepoint-cca7bc-bb1e257f0dab \
./FlinkCDCExample.jar
```
-**注意:** 请参考文档 [Restore the job from previous savepoint](https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/deployment/cli/#command-line-interface) 了解更多详细信息。
+**注意:** 请参考文档 [Restore the job from previous savepoint](https://nightlies.apache.org/flink/flink-docs-release-1.16/docs/deployment/cli/#command-line-interface) 了解更多详细信息。
数据类型映射
----------------
diff --git a/docs/content/connectors/mysql-cdc.md b/docs/content/connectors/mysql-cdc.md
index 75520cdba..46034cb17 100644
--- a/docs/content/connectors/mysql-cdc.md
+++ b/docs/content/connectors/mysql-cdc.md
@@ -693,7 +693,7 @@ $ ./bin/flink run \
--fromSavepoint /tmp/flink-savepoints/savepoint-cca7bc-bb1e257f0dab \
./FlinkCDCExample.jar
```
-**Note:** Please refer the doc [Restore the job from previous savepoint](https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/deployment/cli/#command-line-interface) for more details.
+**Note:** Please refer to the doc [Restore the job from previous savepoint](https://nightlies.apache.org/flink/flink-docs-release-1.16/docs/deployment/cli/#command-line-interface) for more details.
Data Type Mapping
----------------
diff --git a/docs/content/formats/changelog-json.md b/docs/content/formats/changelog-json.md
index 33bcb81f6..a199134e0 100644
--- a/docs/content/formats/changelog-json.md
+++ b/docs/content/formats/changelog-json.md
@@ -1,7 +1,7 @@
# Changelog JSON Format
**WARNING:** The CDC format `changelog-json` is deprecated since Flink CDC version 2.2.
-The CDC format `changelog-json` was introduced at the point that Flink didn't offer any CDC format. Currently, Flink offers several well-maintained CDC formats i.e.[Debezium CDC, MAXWELL CDC, CANAL CDC](https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/connectors/table/formats/overview/), we recommend user to use above CDC formats.
+The CDC format `changelog-json` was introduced at a time when Flink did not offer any CDC format. Flink now provides several well-maintained CDC formats, e.g. [Debezium CDC, MAXWELL CDC, CANAL CDC](https://nightlies.apache.org/flink/flink-docs-release-1.16/docs/connectors/table/formats/overview/), and we recommend using those formats instead.
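For illustration, consuming Debezium-formatted changelog data only requires pointing the table's `format` option at one of those maintained formats. This is a minimal sketch with a hypothetical Kafka topic and schema, not a drop-in replacement for an existing `changelog-json` job:

```sql
-- Minimal sketch: read a Debezium CDC changelog from Kafka (hypothetical topic and schema).
CREATE TABLE orders_cdc (
  order_id INT,
  product  STRING,
  amount   DECIMAL(10, 2)
) WITH (
  'connector' = 'kafka',
  'topic' = 'orders',
  'properties.bootstrap.servers' = 'localhost:9092',
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'debezium-json'
);
```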
### Compatibility Note
diff --git a/docs/content/quickstart/db2-tutorial.md b/docs/content/quickstart/db2-tutorial.md
index a2d8f13d4..e02c829b2 100644
--- a/docs/content/quickstart/db2-tutorial.md
+++ b/docs/content/quickstart/db2-tutorial.md
@@ -61,7 +61,7 @@ docker-compose down
*Download links are available only for stable releases, SNAPSHOT dependency need build by yourself. *
-- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
+- [flink-sql-connector-elasticsearch7-1.16.0.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.16.0/flink-sql-connector-elasticsearch7-1.16.0.jar)
- [flink-sql-connector-db2-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-db2-cdc/2.3-SNAPSHOT/flink-sql-connector-db2-cdc-2.3-SNAPSHOT.jar)
**3. Launch a Flink cluster and start a Flink SQL CLI**
diff --git a/docs/content/quickstart/mongodb-tutorial.md b/docs/content/quickstart/mongodb-tutorial.md
index a86d6f972..a50d5f9a5 100644
--- a/docs/content/quickstart/mongodb-tutorial.md
+++ b/docs/content/quickstart/mongodb-tutorial.md
@@ -109,7 +109,7 @@ db.customers.insertMany([
```Download links are available only for stable releases, SNAPSHOT dependency need build by yourself. ```
-- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
+- [flink-sql-connector-elasticsearch7-1.16.0.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.16.0/flink-sql-connector-elasticsearch7-1.16.0.jar)
- [flink-sql-connector-mongodb-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-mongodb-cdc/2.3-SNAPSHOT/flink-sql-connector-mongodb-cdc-2.3-SNAPSHOT.jar)
4. Launch a Flink cluster, then start a Flink SQL CLI and execute following SQL statements inside:
diff --git a/docs/content/quickstart/mysql-postgres-tutorial.md b/docs/content/quickstart/mysql-postgres-tutorial.md
index d432c1758..0a8b3ffb5 100644
--- a/docs/content/quickstart/mysql-postgres-tutorial.md
+++ b/docs/content/quickstart/mysql-postgres-tutorial.md
@@ -73,11 +73,11 @@ This command automatically starts all the containers defined in the Docker Compo
We can also visit [http://localhost:5601/](http://localhost:5601/) to see if Kibana is running normally.
### Preparing Flink and JAR package required
-1. Download [Flink 1.15.2](https://archive.apache.org/dist/flink/flink-1.15.2/flink-1.15.2-bin-scala_2.12.tgz) and unzip it to the directory `flink-1.15.2`
-2. Download following JAR package required and put them under `flink-1.15.2/lib/`:
+1. Download [Flink 1.16.0](https://archive.apache.org/dist/flink/flink-1.16.0/flink-1.16.0-bin-scala_2.12.tgz) and unzip it to the directory `flink-1.16.0`
+2. Download the JAR packages listed below and put them under `flink-1.16.0/lib/` (see the SQL sketch after this list for how they are used):
**Download links are available only for stable releases, SNAPSHOT dependency need build by yourself. **
- - [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
+ - [flink-sql-connector-elasticsearch7-1.16.0.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.16.0/flink-sql-connector-elasticsearch7-1.16.0.jar)
- [flink-sql-connector-mysql-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-mysql-cdc/2.3-SNAPSHOT/flink-sql-connector-mysql-cdc-2.3-SNAPSHOT.jar)
- [flink-sql-connector-postgres-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-postgres-cdc/2.3-SNAPSHOT/flink-sql-connector-postgres-cdc-2.3-SNAPSHOT.jar)
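The JARs above back the connectors used later in this tutorial's Flink SQL. As a rough orientation, they are used along these lines — a minimal sketch with placeholder hostnames, credentials and table names, not the tutorial's exact DDL; the Postgres CDC connector is used analogously for the tutorial's second source:

```sql
-- Minimal sketch: MySQL CDC source and Elasticsearch sink (placeholder connection details).
CREATE TABLE orders (
  order_id      INT,
  customer_name STRING,
  price         DECIMAL(10, 5),
  PRIMARY KEY (order_id) NOT ENFORCED
) WITH (
  'connector' = 'mysql-cdc',          -- provided by flink-sql-connector-mysql-cdc
  'hostname' = 'localhost',
  'port' = '3306',
  'username' = 'root',
  'password' = '123456',
  'database-name' = 'mydb',
  'table-name' = 'orders'
);

CREATE TABLE orders_es (
  order_id      INT,
  customer_name STRING,
  price         DECIMAL(10, 5),
  PRIMARY KEY (order_id) NOT ENFORCED
) WITH (
  'connector' = 'elasticsearch-7',    -- provided by flink-sql-connector-elasticsearch7
  'hosts' = 'http://localhost:9200',
  'index' = 'orders_es'
);

INSERT INTO orders_es SELECT * FROM orders;
```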
@@ -151,7 +151,7 @@ We can also visit [http://localhost:5601/](http://localhost:5601/) to see if Kib
1. Use the following command to change to the Flink directory:
```
- cd flink-1.15.2
+ cd flink-1.16.0
```
2. Use the following command to start a Flink cluster:
@@ -311,7 +311,7 @@ After finishing the tutorial, run the following command to stop all containers i
```shell
docker-compose down
```
-Run the following command to stop the Flink cluster in the directory of Flink `flink-1.15.2`:
+Run the following command in the Flink directory `flink-1.16.0` to stop the Flink cluster:
```shell
./bin/stop-cluster.sh
```
diff --git a/docs/content/quickstart/oceanbase-tutorial.md b/docs/content/quickstart/oceanbase-tutorial.md
index 6e4cc9fd9..c4d7e0055 100644
--- a/docs/content/quickstart/oceanbase-tutorial.md
+++ b/docs/content/quickstart/oceanbase-tutorial.md
@@ -111,7 +111,7 @@ VALUES (default, '2020-07-30 10:08:22', 'Jark', 50.50, 102, false),
```Download links are only available for stable releases.```
-- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
+- [flink-sql-connector-elasticsearch7-1.16.0.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.16.0/flink-sql-connector-elasticsearch7-1.16.0.jar)
- [flink-sql-connector-oceanbase-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-oceanbase-cdc/2.3-SNAPSHOT/flink-sql-connector-oceanbase-cdc-2.3-SNAPSHOT.jar)
### Use Flink DDL to create dynamic table in Flink SQL CLI
diff --git a/docs/content/quickstart/oracle-tutorial.md b/docs/content/quickstart/oracle-tutorial.md
index 0c47ef032..09adccec7 100644
--- a/docs/content/quickstart/oracle-tutorial.md
+++ b/docs/content/quickstart/oracle-tutorial.md
@@ -54,7 +54,7 @@ docker-compose down
*Download links are available only for stable releases, SNAPSHOT dependency need build by yourself. *
-- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
+- [flink-sql-connector-elasticsearch7-1.16.0.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.16.0/flink-sql-connector-elasticsearch7-1.16.0.jar)
- [flink-sql-connector-oracle-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-oracle-cdc/2.3-SNAPSHOT/flink-sql-connector-oracle-cdc-2.3-SNAPSHOT.jar)
**3. Launch a Flink cluster and start a Flink SQL CLI**
diff --git a/docs/content/quickstart/polardbx-tutorial.md b/docs/content/quickstart/polardbx-tutorial.md
index 4a35eb0f8..d393dde45 100644
--- a/docs/content/quickstart/polardbx-tutorial.md
+++ b/docs/content/quickstart/polardbx-tutorial.md
@@ -63,12 +63,12 @@ This command automatically starts all the containers defined in the Docker Compo
We can also visit [http://localhost:5601/](http://localhost:5601/) to see if Kibana is running normally.
### Preparing Flink and JAR package required
-1. Download [Flink 1.15.2](https://archive.apache.org/dist/flink/flink-1.15.2/flink-1.15.2-bin-scala_2.12.tgz) and unzip it to the directory `flink-1.15.2`
-2. Download following JAR package required and put them under `flink-1.15.2/lib/`:
+1. Download [Flink 1.16.0](https://archive.apache.org/dist/flink/flink-1.16.0/flink-1.16.0-bin-scala_2.12.tgz) and unzip it to the directory `flink-1.16.0`
+2. Download the JAR packages listed below and put them under `flink-1.16.0/lib/`:
**Download links are available only for stable releases, SNAPSHOT dependency need build by yourself. **
- [flink-sql-connector-mysql-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-mysql-cdc/2.3-SNAPSHOT/flink-sql-connector-mysql-cdc-2.3-SNAPSHOT.jar)
- - [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
+ - [flink-sql-connector-elasticsearch7-1.16.0.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.16.0/flink-sql-connector-elasticsearch7-1.16.0.jar)
### Preparing data in databases
#### Preparing data in PolarDB-X
@@ -116,7 +116,7 @@ We can also visit [http://localhost:5601/](http://localhost:5601/) to see if Kib
1. Use the following command to change to the Flink directory:
```
- cd flink-1.15.2
+ cd flink-1.16.0
```
2. Use the following command to start a Flink cluster:
@@ -255,7 +255,7 @@ After finishing the tutorial, run the following command to stop all containers i
```shell
docker-compose down
```
-Run the following command to stop the Flink cluster in the directory of Flink `flink-1.15.2`:
+Run the following command in the Flink directory `flink-1.16.0` to stop the Flink cluster:
```shell
./bin/stop-cluster.sh
```
diff --git a/docs/content/quickstart/sqlserver-tutorial.md b/docs/content/quickstart/sqlserver-tutorial.md
index 8c7fcaa43..13b4d7a97 100644
--- a/docs/content/quickstart/sqlserver-tutorial.md
+++ b/docs/content/quickstart/sqlserver-tutorial.md
@@ -63,7 +63,7 @@ docker-compose down
*Download links are available only for stable releases, SNAPSHOT dependency need build by yourself. *
-- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
+- [flink-sql-connector-elasticsearch7-1.16.0.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.16.0/flink-sql-connector-elasticsearch7-1.16.0.jar)
- [flink-sql-connector-sqlserver-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-sqlserver-cdc/2.3-SNAPSHOT/flink-sql-connector-sqlserver-cdc-2.3-SNAPSHOT.jar)
diff --git a/docs/content/quickstart/tidb-tutorial.md b/docs/content/quickstart/tidb-tutorial.md
index 13eefa8e0..1752f678d 100644
--- a/docs/content/quickstart/tidb-tutorial.md
+++ b/docs/content/quickstart/tidb-tutorial.md
@@ -116,7 +116,7 @@ docker-compose down
*Download links are available only for stable releases, SNAPSHOT dependency need build by yourself. *
-- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
+- [flink-sql-connector-elasticsearch7-1.16.0.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.16.0/flink-sql-connector-elasticsearch7-1.16.0.jar)
- [flink-sql-connector-tidb-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-tidb-cdc/2.3-SNAPSHOT/flink-sql-connector-tidb-cdc-2.3-SNAPSHOT.jar)
diff --git a/docs/content/快速上手/mongodb-tutorial-zh.md b/docs/content/快速上手/mongodb-tutorial-zh.md
index 4b6af35e1..58836d7f1 100644
--- a/docs/content/快速上手/mongodb-tutorial-zh.md
+++ b/docs/content/快速上手/mongodb-tutorial-zh.md
@@ -109,7 +109,7 @@ db.customers.insertMany([
```下载链接只对已发布的版本有效, SNAPSHOT 版本需要本地编译```
- - [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
+ - [flink-sql-connector-elasticsearch7-1.16.0.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.16.0/flink-sql-connector-elasticsearch7-1.16.0.jar)
- [flink-sql-connector-mongodb-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-mongodb-cdc/2.3-SNAPSHOT/flink-sql-connector-mongodb-cdc-2.3-SNAPSHOT.jar)
4. 然后启动 Flink 集群,再启动 SQL CLI.
diff --git a/docs/content/快速上手/mysql-postgres-tutorial-zh.md b/docs/content/快速上手/mysql-postgres-tutorial-zh.md
index d0600df7a..e87b1b109 100644
--- a/docs/content/快速上手/mysql-postgres-tutorial-zh.md
+++ b/docs/content/快速上手/mysql-postgres-tutorial-zh.md
@@ -69,11 +69,11 @@ docker-compose up -d
该命令将以 detached 模式自动启动 Docker Compose 配置中定义的所有容器。你可以通过 docker ps 来观察上述的容器是否正常启动了,也可以通过访问 [http://localhost:5601/](http://localhost:5601/) 来查看 Kibana 是否运行正常。
### 下载 Flink 和所需要的依赖包
-1. 下载 [Flink 1.15.2](https://archive.apache.org/dist/flink/flink-1.15.2/flink-1.15.2-bin-scala_2.12.tgz) 并将其解压至目录 `flink-1.15.2`
-2. 下载下面列出的依赖包,并将它们放到目录 `flink-1.15.2/lib/` 下:
+1. 下载 [Flink 1.16.0](https://archive.apache.org/dist/flink/flink-1.16.0/flink-1.16.0-bin-scala_2.12.tgz) 并将其解压至目录 `flink-1.16.0`
+2. 下载下面列出的依赖包,并将它们放到目录 `flink-1.16.0/lib/` 下:
**下载链接只对已发布的版本有效, SNAPSHOT 版本需要本地编译**
- - [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
+ - [flink-sql-connector-elasticsearch7-1.16.0.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.16.0/flink-sql-connector-elasticsearch7-1.16.0.jar)
- [flink-sql-connector-mysql-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-mysql-cdc/2.3-SNAPSHOT/flink-sql-connector-mysql-cdc-2.3-SNAPSHOT.jar)
- [flink-sql-connector-postgres-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-postgres-cdc/2.3-SNAPSHOT/flink-sql-connector-postgres-cdc-2.3-SNAPSHOT.jar)
@@ -308,7 +308,7 @@ Flink SQL> INSERT INTO enriched_orders
```shell
docker-compose down
```
-在 Flink 所在目录 `flink-1.15.2` 下执行如下命令停止 Flink 集群:
+在 Flink 所在目录 `flink-1.16.0` 下执行如下命令停止 Flink 集群:
```shell
./bin/stop-cluster.sh
```
diff --git a/docs/content/快速上手/oceanbase-tutorial-zh.md b/docs/content/快速上手/oceanbase-tutorial-zh.md
index cf0f987c3..b3ad98814 100644
--- a/docs/content/快速上手/oceanbase-tutorial-zh.md
+++ b/docs/content/快速上手/oceanbase-tutorial-zh.md
@@ -110,7 +110,7 @@ VALUES (default, '2020-07-30 10:08:22', 'Jark', 50.50, 102, false),
```下载链接只对已发布的版本有效, SNAPSHOT 版本需要本地编译```
-- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
+- [flink-sql-connector-elasticsearch7-1.16.0.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.16.0/flink-sql-connector-elasticsearch7-1.16.0.jar)
- [flink-sql-connector-oceanbase-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-oceanbase-cdc/2.3-SNAPSHOT/flink-sql-connector-oceanbase-cdc-2.3-SNAPSHOT.jar)
### 在 Flink SQL CLI 中使用 Flink DDL 创建表
diff --git a/docs/content/快速上手/oracle-tutorial-zh.md b/docs/content/快速上手/oracle-tutorial-zh.md
index 382de9f62..0a0903d4a 100644
--- a/docs/content/快速上手/oracle-tutorial-zh.md
+++ b/docs/content/快速上手/oracle-tutorial-zh.md
@@ -54,7 +54,7 @@ docker-compose down
*下载链接只对已发布的版本有效, SNAPSHOT 版本需要本地编译*
-- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
+- [flink-sql-connector-elasticsearch7-1.16.0.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.16.0/flink-sql-connector-elasticsearch7-1.16.0.jar)
- [flink-sql-connector-oracle-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-oracle-cdc/2.3-SNAPSHOT/flink-sql-connector-oracle-cdc-2.3-SNAPSHOT.jar)
**3. 然后启动 Flink 集群,再启动 SQL CLI:**
diff --git a/docs/content/快速上手/polardbx-tutorial-zh.md b/docs/content/快速上手/polardbx-tutorial-zh.md
index 64bcb9040..c6b780bb1 100644
--- a/docs/content/快速上手/polardbx-tutorial-zh.md
+++ b/docs/content/快速上手/polardbx-tutorial-zh.md
@@ -105,12 +105,12 @@ VALUES (default, '2020-07-30 10:08:22', 'Jark', 50.50, 102, false),
```
### 下载 Flink 和所需要的依赖包
-1. 下载 [Flink 1.15.2](https://archive.apache.org/dist/flink/flink-1.15.2/flink-1.15.2-bin-scala_2.12.tgz) 并将其解压至目录 `flink-1.15.2`
-2. 下载下面列出的依赖包,并将它们放到目录 `flink-1.15.2/lib/` 下
+1. 下载 [Flink 1.16.0](https://archive.apache.org/dist/flink/flink-1.16.0/flink-1.16.0-bin-scala_2.12.tgz) 并将其解压至目录 `flink-1.16.0`
+2. 下载下面列出的依赖包,并将它们放到目录 `flink-1.16.0/lib/` 下
```下载链接只对已发布的版本有效, SNAPSHOT 版本需要本地编译```
- 用于订阅PolarDB-X Binlog: [flink-sql-connector-mysql-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-mysql-cdc/2.3-SNAPSHOT/flink-sql-connector-mysql-cdc-2.3-SNAPSHOT.jar)
-- 用于写入Elasticsearch: [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
+- 用于写入Elasticsearch: [flink-sql-connector-elasticsearch7-1.16.0.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.16.0/flink-sql-connector-elasticsearch7-1.16.0.jar)
3. 启动flink服务:
```shell
./bin/start-cluster.sh
diff --git a/docs/content/快速上手/sqlserver-tutorial-zh.md b/docs/content/快速上手/sqlserver-tutorial-zh.md
index 33966b01e..7bdbf5178 100644
--- a/docs/content/快速上手/sqlserver-tutorial-zh.md
+++ b/docs/content/快速上手/sqlserver-tutorial-zh.md
@@ -63,7 +63,7 @@ docker-compose down
```下载链接只对已发布的版本有效, SNAPSHOT 版本需要本地编译```
-- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
+- [flink-sql-connector-elasticsearch7-1.16.0.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.16.0/flink-sql-connector-elasticsearch7-1.16.0.jar)
- [flink-sql-connector-sqlserver-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-sqlserver-cdc/2.3-SNAPSHOT/flink-sql-connector-sqlserver-cdc-2.3-SNAPSHOT.jar)
diff --git a/docs/content/快速上手/tidb-tutorial-zh.md b/docs/content/快速上手/tidb-tutorial-zh.md
index 1bf42f2b2..7ec4ad91a 100644
--- a/docs/content/快速上手/tidb-tutorial-zh.md
+++ b/docs/content/快速上手/tidb-tutorial-zh.md
@@ -116,7 +116,7 @@ docker-compose down
```下载链接只对已发布的版本有效, SNAPSHOT 版本需要本地编译```
-- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
+- [flink-sql-connector-elasticsearch7-1.16.0.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.16.0/flink-sql-connector-elasticsearch7-1.16.0.jar)
- [flink-sql-connector-tidb-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-tidb-cdc/2.3-SNAPSHOT/flink-sql-connector-tidb-cdc-2.3-SNAPSHOT.jar)