{ "cells": [ { "cell_type": "markdown", "metadata": { "id": "6bYaCABobL5q" }, "source": [ "##### Copyright 2018 The TensorFlow Authors." ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "cellView": "form", "execution": { "iopub.execute_input": "2022-12-14T22:39:47.016895Z", "iopub.status.busy": "2022-12-14T22:39:47.016469Z", "iopub.status.idle": "2022-12-14T22:39:47.020088Z", "shell.execute_reply": "2022-12-14T22:39:47.019567Z" }, "id": "FlUw7tSKbtg4" }, "outputs": [], "source": [ "#@title Licensed under the Apache License, Version 2.0 (the \"License\");\n", "# you may not use this file except in compliance with the License.\n", "# You may obtain a copy of the License at\n", "#\n", "# https://www.apache.org/licenses/LICENSE-2.0\n", "#\n", "# Unless required by applicable law or agreed to in writing, software\n", "# distributed under the License is distributed on an \"AS IS\" BASIS,\n", "# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n", "# See the License for the specific language governing permissions and\n", "# limitations under the License." ] }, { "cell_type": "markdown", "metadata": { "id": "08OTcmxgqkc2" }, "source": [ "# 自动将代码升级到 TensorFlow 2\n", "\n", "
\n",
" ![]() | \n",
" \n",
" ![]() | \n",
" \n",
" ![]() | \n",
" \n",
" ![]() | \n",
"
<pre>\n", "tf_upgrade_v2 \\\n", "  --intree my_project/ \\\n", "  --outtree my_project_v2/ \\\n", "  --reportfile report.txt\n", "</pre>\n", "\n", "Converting your existing TensorFlow 1.x Python scripts to TensorFlow 2.0 will accelerate your upgrade process.\n", "\n", "The conversion script automates as much as possible, but there are still syntactical and stylistic changes that cannot be performed by the script." ] }, { "cell_type": "markdown", "metadata": { "id": "gP9v2vgptdfi" }, "source": [ "## Compatibility modules\n", "\n", "Certain API symbols cannot be upgraded simply by using a string replacement. To ensure your code is still supported in TensorFlow 2.0, the upgrade script includes a `compat.v1` module. This module replaces TF 1.x symbols like `tf.foo` with the equivalent `tf.compat.v1.foo` reference. While the compatibility module works well, we still recommend that you manually proofread the replacements and migrate them to the new APIs in the `tf.*` namespace (instead of the `tf.compat.v1` namespace) as quickly as possible.\n", "\n", "Because of TensorFlow 2.x module deprecations (for example, `tf.flags` and `tf.contrib`), some changes cannot be worked around by switching to `compat.v1`. Upgrading this code may require using an additional library (for example, [`absl.flags`](https://github.com/abseil/abseil-py)) or switching to a package in [tensorflow/addons](http://www.github.com/tensorflow/addons).\n" ] },
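{ "cell_type": "markdown", "metadata": {}, "source": [ "For instance, here is a minimal sketch of the kind of rewrite the script performs (the `tf.placeholder` line is an illustrative example, not taken from this guide's sample code):\n", "\n", "```python\n", "# TF 1.x code:\n", "x = tf.placeholder(tf.float32, shape=(None, 10))\n", "\n", "# After tf_upgrade_v2, the removed symbol is reached through the\n", "# compatibility module instead:\n", "x = tf.compat.v1.placeholder(tf.float32, shape=(None, 10))\n", "```" ] },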
{ "cell_type": "markdown", "metadata": { "id": "s78bbfjkXYb7" }, "source": [ "## Recommended upgrade process\n", "\n", "The rest of this guide demonstrates how to use the upgrade script. While the upgrade script is easy to use, it is strongly recommended that you use it as part of the following process:\n", "\n", "1. **Unit tests**: Ensure that the code you are upgrading has a unit test suite with reasonable coverage. This is Python code, and the language will not protect you from many classes of mistakes. Also ensure that all of your dependencies have already been upgraded to be compatible with TensorFlow 2.0.\n", "\n", "2. **Install TensorFlow 1.14**: Upgrade your TensorFlow to the newest TensorFlow 1.x version, at minimum 1.14. This includes the final TensorFlow 2.0 API in `tf.compat.v2`.\n", "\n", "3. **Test with 1.14**: Ensure your unit tests pass at this point. You will be running them repeatedly as you upgrade, so starting from passing tests is important.\n", "\n", "4. **Run the upgrade script**: Run `tf_upgrade_v2` on your entire source tree, tests included. This upgrades your code to a format where it only uses symbols available in TensorFlow 2.0. Deprecated symbols will be accessed through `tf.compat.v1`. These will eventually require manual attention, but not immediately.\n", "\n", "5. **Run the converted tests with TensorFlow 1.14**: Your code should still run fine in TensorFlow 1.14. Run your unit tests again. Any test error at this point means there is a bug in the upgrade script. [Please let us know](https://github.com/tensorflow/tensorflow/issues).\n", "\n", "6. **Check the upgrade report for warnings and errors**: The script writes a report file that explains any conversions you should double-check and any manual actions you need to take. For example: any remaining instances of contrib will require manual action to remove. Please consult [the RFC for more instructions](https://github.com/tensorflow/community/blob/master/rfcs/20180907-contrib-sunset.md).\n", "\n", "7. **Install TensorFlow 2.0**: At this point it should be safe to switch to TensorFlow 2.0.\n", "\n", "8. **Test with `v1.disable_v2_behavior`**: Re-running your tests with a `v1.disable_v2_behavior()` in the tests' main function should give the same results as running under 1.14 (see the sketch after this list).\n", "\n", "9. **Enable v2 behavior**: Now that your tests pass using the v2 API, you can start looking into turning on v2 behavior. Depending on how your code is written, this may require some changes. See the [Migration guide](migrate.ipynb) for details." ] },
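{ "cell_type": "markdown", "metadata": {}, "source": [ "As a minimal sketch of step 8 (the test entry point here is hypothetical, not part of this guide's sample code), disabling v2 behavior in a test's main function looks roughly like this:\n", "\n", "```python\n", "import tensorflow.compat.v1 as v1\n", "\n", "# Force TF1-style behavior even when running under TensorFlow 2.x, so the\n", "# converted code can be compared against its 1.14 baseline.\n", "v1.disable_v2_behavior()\n", "\n", "if __name__ == '__main__':\n", "    v1.test.main()  # run the unit tests with v2 behavior disabled\n", "```" ] },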
"Updating files: 59% (1634/2768)\r", "Updating files: 60% (1661/2768)\r", "Updating files: 61% (1689/2768)\r" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Updating files: 62% (1717/2768)\r", "Updating files: 63% (1744/2768)\r", "Updating files: 64% (1772/2768)\r", "Updating files: 65% (1800/2768)\r", "Updating files: 66% (1827/2768)\r", "Updating files: 67% (1855/2768)\r", "Updating files: 68% (1883/2768)\r", "Updating files: 69% (1910/2768)\r", "Updating files: 70% (1938/2768)\r", "Updating files: 71% (1966/2768)\r", "Updating files: 72% (1993/2768)\r", "Updating files: 73% (2021/2768)\r", "Updating files: 74% (2049/2768)\r", "Updating files: 75% (2076/2768)\r", "Updating files: 76% (2104/2768)\r" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Updating files: 77% (2132/2768)\r", "Updating files: 78% (2160/2768)\r" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Updating files: 79% (2187/2768)\r", "Updating files: 80% (2215/2768)\r", "Updating files: 81% (2243/2768)\r" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Updating files: 81% (2248/2768)\r", "Updating files: 82% (2270/2768)\r", "Updating files: 83% (2298/2768)\r" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Updating files: 84% (2326/2768)\r" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Updating files: 85% (2353/2768)\r", "Updating files: 86% (2381/2768)\r", "Updating files: 87% (2409/2768)\r", "Updating files: 88% (2436/2768)\r" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Updating files: 89% (2464/2768)\r", "Updating files: 90% (2492/2768)\r", "Updating files: 91% (2519/2768)\r", "Updating files: 92% (2547/2768)\r", "Updating files: 93% (2575/2768)\r" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Updating files: 94% (2602/2768)\r", "Updating files: 95% (2630/2768)\r", "Updating files: 96% (2658/2768)\r", "Updating files: 97% (2685/2768)\r", "Updating files: 98% (2713/2768)\r", "Updating files: 99% (2741/2768)\r", "Updating files: 100% (2768/2768)\r", "Updating files: 100% (2768/2768), done.\r\n" ] } ], "source": [ "!git clone --branch r1.13.0 --depth 1 https://github.com/tensorflow/models" ] }, { "cell_type": "markdown", "metadata": { "id": "wfHOhbkgvrKr" }, "source": [ "### 读取帮助\n", "\n", "脚本应当随 TensorFlow 安装。下面是内置帮助命令:" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "execution": { "iopub.execute_input": "2022-12-14T22:40:00.858988Z", "iopub.status.busy": "2022-12-14T22:40:00.858232Z", "iopub.status.idle": "2022-12-14T22:40:03.292015Z", "shell.execute_reply": "2022-12-14T22:40:03.290921Z" }, "id": "m2GF-tlntqTQ" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "2022-12-14 22:40:01.944493: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer.so.7'; dlerror: libnvinfer.so.7: cannot open shared object file: No such file or directory\r\n", "2022-12-14 22:40:01.944619: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer_plugin.so.7'; dlerror: libnvinfer_plugin.so.7: cannot open shared object file: No such file or directory\r\n", "2022-12-14 22:40:01.944639: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Cannot dlopen some TensorRT libraries. 
If you would like to use Nvidia GPU with TensorRT, please make sure the missing libraries mentioned above are installed properly.\r\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "usage: tf_upgrade_v2 [-h] [--infile INPUT_FILE] [--outfile OUTPUT_FILE]\r\n", " [--intree INPUT_TREE] [--outtree OUTPUT_TREE]\r\n", " [--copyotherfiles COPY_OTHER_FILES] [--inplace]\r\n", " [--no_import_rename] [--no_upgrade_compat_v1_import]\r\n", " [--reportfile REPORT_FILENAME] [--mode {DEFAULT,SAFETY}]\r\n", " [--print_all]\r\n", "\r\n", "Convert a TensorFlow Python file from 1.x to 2.0\r\n", "\r\n", "Simple usage:\r\n", " tf_upgrade_v2.py --infile foo.py --outfile bar.py\r\n", " tf_upgrade_v2.py --infile foo.ipynb --outfile bar.ipynb\r\n", " tf_upgrade_v2.py --intree ~/code/old --outtree ~/code/new\r\n", "\r\n", "optional arguments:\r\n", " -h, --help show this help message and exit\r\n", " --infile INPUT_FILE If converting a single file, the name of the file to\r\n", " convert\r\n", " --outfile OUTPUT_FILE\r\n", " If converting a single file, the output filename.\r\n", " --intree INPUT_TREE If converting a whole tree of files, the directory to\r\n", " read from (relative or absolute).\r\n", " --outtree OUTPUT_TREE\r\n", " If converting a whole tree of files, the output\r\n", " directory (relative or absolute).\r\n", " --copyotherfiles COPY_OTHER_FILES\r\n", " If converting a whole tree of files, whether to copy\r\n", " the other files.\r\n", " --inplace If converting a set of files, whether to allow the\r\n", " conversion to be performed on the input files.\r\n", " --no_import_rename Not to rename import to compat.v2 explicitly.\r\n", " --no_upgrade_compat_v1_import\r\n", " If specified, don't upgrade explicit imports of\r\n", " `tensorflow.compat.v1 as tf` to the v2 APIs.\r\n", " Otherwise, explicit imports of the form\r\n", " `tensorflow.compat.v1 as tf` will be upgraded.\r\n", " --reportfile REPORT_FILENAME\r\n", " The name of the file where the report log is\r\n", " stored.(default: report.txt)\r\n", " --mode {DEFAULT,SAFETY}\r\n", " Upgrade script mode. Supported modes: DEFAULT: Perform\r\n", " only straightforward conversions to upgrade to 2.0. In\r\n", " more difficult cases, switch to use compat.v1. 
SAFETY:\r\n", " Keep 1.* code intact and import compat.v1 module.\r\n", " --print_all Print full log to stdout instead of just printing\r\n", " errors\r\n" ] } ], "source": [ "!tf_upgrade_v2 -h" ] }, { "cell_type": "markdown", "metadata": { "id": "se9Leqjm1CZR" }, "source": [ "### Example TF1 code" ] }, { "cell_type": "markdown", "metadata": { "id": "whD5i36s1SuM" }, "source": [ "Here is a simple TensorFlow 1.0 script:" ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "execution": { "iopub.execute_input": "2022-12-14T22:40:03.296957Z", "iopub.status.busy": "2022-12-14T22:40:03.296320Z", "iopub.status.idle": "2022-12-14T22:40:03.419617Z", "shell.execute_reply": "2022-12-14T22:40:03.418814Z" }, "id": "mhGbYQ9HwbeU" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " # Calculate loss using mean squared error\r\n", " average_loss = tf.losses.mean_squared_error(labels, predictions)\r\n", "\r\n", " # Pre-made estimators use the total_loss instead of the average,\r\n", " # so report total_loss for compatibility.\r\n", " batch_size = tf.shape(labels)[0]\r\n", " total_loss = tf.to_float(batch_size) * average_loss\r\n", "\r\n", " if mode == tf.estimator.ModeKeys.TRAIN:\r\n", " optimizer = params.get(\"optimizer\", tf.train.AdamOptimizer)\r\n" ] } ], "source": [ "!head -n 65 models/samples/cookbook/regression/custom_regression.py | tail -n 10" ] }, { "cell_type": "markdown", "metadata": { "id": "UGO7xSyL89wX" }, "source": [ "With TensorFlow 2.0 installed, it does not run:" ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "execution": { "iopub.execute_input": "2022-12-14T22:40:03.423818Z", "iopub.status.busy": "2022-12-14T22:40:03.423166Z", "iopub.status.idle": "2022-12-14T22:40:03.543685Z", "shell.execute_reply": "2022-12-14T22:40:03.542916Z" }, "id": "TD7fFphX8_qE" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "/bin/bash: -c: line 0: syntax error near unexpected token `;&'\r\n", "/bin/bash: -c: line 0: `(cd models/samples/cookbook/regression && python custom_regression.py)'\r\n" ] } ], "source": [ "!(cd models/samples/cookbook/regression && python custom_regression.py)" ] },
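{ "cell_type": "markdown", "metadata": {}, "source": [ "Under TensorFlow 2.x, the unconverted script fails because v1 symbols such as `tf.layers.dense` and `tf.train.AdamOptimizer` no longer exist at the top level. A minimal illustration of the kind of failure, assuming a TF 2.x install (not taken from this notebook's output):\n", "\n", "```python\n", "import tensorflow as tf  # TensorFlow 2.x\n", "\n", "try:\n", "    tf.layers.dense  # removed from the top-level namespace in TF 2.x\n", "except AttributeError as e:\n", "    print(e)  # roughly: module 'tensorflow' has no attribute 'layers'\n", "```" ] },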
{ "cell_type": "markdown", "metadata": { "id": "iZZHu0H0wLRJ" }, "source": [ "### Single file\n", "\n", "The upgrade script can be run on a single Python file:" ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "execution": { "iopub.execute_input": "2022-12-14T22:40:03.547662Z", "iopub.status.busy": "2022-12-14T22:40:03.547006Z", "iopub.status.idle": "2022-12-14T22:40:06.014717Z", "shell.execute_reply": "2022-12-14T22:40:06.013644Z" }, "id": "xIBZVEjkqkc5" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "2022-12-14 22:40:04.636012: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer.so.7'; dlerror: libnvinfer.so.7: cannot open shared object file: No such file or directory\r\n", "2022-12-14 22:40:04.636133: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer_plugin.so.7'; dlerror: libnvinfer_plugin.so.7: cannot open shared object file: No such file or directory\r\n", "2022-12-14 22:40:04.636153: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Cannot dlopen some TensorRT libraries. If you would like to use Nvidia GPU with TensorRT, please make sure the missing libraries mentioned above are installed properly.\r\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "INFO line 38:8: Renamed 'tf.feature_column.input_layer' to 'tf.compat.v1.feature_column.input_layer'\r\n", "INFO line 43:10: Renamed 'tf.layers.dense' to 'tf.compat.v1.layers.dense'\r\n", "INFO line 46:17: Renamed 'tf.layers.dense' to 'tf.compat.v1.layers.dense'\r\n", "INFO line 57:17: tf.losses.mean_squared_error requires manual check. tf.losses have been replaced with object oriented versions in TF 2.0 and after. The loss function calls have been converted to compat.v1 for backward compatibility. Please update these calls to the TF 2.0 versions.\r\n", "INFO line 57:17: Renamed 'tf.losses.mean_squared_error' to 'tf.compat.v1.losses.mean_squared_error'\r\n", "INFO line 61:15: Added keywords to args of function 'tf.shape'\r\n", "INFO line 62:15: Changed tf.to_float call to tf.cast(..., dtype=tf.float32).\r\n", "INFO line 65:40: Renamed 'tf.train.AdamOptimizer' to 'tf.compat.v1.train.AdamOptimizer'\r\n", "INFO line 68:39: Renamed 'tf.train.get_global_step' to 'tf.compat.v1.train.get_global_step'\r\n", "INFO line 83:9: tf.metrics.root_mean_squared_error requires manual check. tf.metrics have been replaced with object oriented versions in TF 2.0 and after. The metric function calls have been converted to compat.v1 for backward compatibility. Please update these calls to the TF 2.0 versions.\r\n", "INFO line 83:9: Renamed 'tf.metrics.root_mean_squared_error' to 'tf.compat.v1.metrics.root_mean_squared_error'\r\n", "INFO line 142:23: Renamed 'tf.train.AdamOptimizer' to 'tf.compat.v1.train.AdamOptimizer'\r\n", "INFO line 162:2: Renamed 'tf.logging.set_verbosity' to 'tf.compat.v1.logging.set_verbosity'\r\n", "INFO line 162:27: Renamed 'tf.logging.INFO' to 'tf.compat.v1.logging.INFO'\r\n", "INFO line 163:2: Renamed 'tf.app.run' to 'tf.compat.v1.app.run'\r\n", "TensorFlow 2.0 Upgrade Script\r\n", "-----------------------------\r\n", "Converted 1 files\r\n", "Detected 0 issues that require attention\r\n", "--------------------------------------------------------------------------------\r\n", "\r\n", "\r\n", "Make sure to read the detailed log 'report.txt'\r\n", "\r\n" ] } ], "source": [ "!tf_upgrade_v2 \\\n", " --infile models/samples/cookbook/regression/custom_regression.py \\\n", " --outfile /tmp/custom_regression_v2.py" ] }, { "cell_type": "markdown", "metadata": { "id": "L9X2lxzqqkc9" }, "source": [ "The script will print errors if it cannot find a fix for the code. " ] }, { "cell_type": "markdown", "metadata": { "id": "r7zpuE1vWSlL" }, "source": [ "### Directory tree" ] }, { "cell_type": "markdown", "metadata": { "id": "2q7Gtuu8SdIC" }, "source": [ "Typical projects, including this simple example, use far more than one file. Usually you want to upgrade an entire package, so the script can also be run on a directory tree:" ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "execution": { "iopub.execute_input": "2022-12-14T22:40:06.019922Z", "iopub.status.busy": "2022-12-14T22:40:06.019183Z", "iopub.status.idle": "2022-12-14T22:40:08.561328Z", "shell.execute_reply": "2022-12-14T22:40:08.560420Z" }, "id": "XGqcdkAPqkc-" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "2022-12-14 22:40:07.095248: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer.so.7'; dlerror: libnvinfer.so.7: cannot open shared object file: No such file or directory\r\n", "2022-12-14 22:40:07.095389: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic 
library 'libnvinfer_plugin.so.7'; dlerror: libnvinfer_plugin.so.7: cannot open shared object file: No such file or directory\r\n", "2022-12-14 22:40:07.095413: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Cannot dlopen some TensorRT libraries. If you would like to use Nvidia GPU with TensorRT, please make sure the missing libraries mentioned above are installed properly.\r\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "INFO line 40:7: Renamed 'tf.test.mock' to 'tf.compat.v1.test.mock'\r\n", "WARNING line 125:15: Changing dataset.make_one_shot_iterator() to tf.compat.v1.data.make_one_shot_iterator(dataset). Please check this transformation.\r\n", "\r\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "INFO line 58:10: tf.estimator.LinearRegressor: Default value of loss_reduction has been changed to SUM_OVER_BATCH_SIZE; inserting old default value tf.keras.losses.Reduction.SUM.\r\n", "\r\n", "INFO line 101:2: Renamed 'tf.logging.set_verbosity' to 'tf.compat.v1.logging.set_verbosity'\r\n", "INFO line 101:27: Renamed 'tf.logging.INFO' to 'tf.compat.v1.logging.INFO'\r\n", "INFO line 102:2: Renamed 'tf.app.run' to 'tf.compat.v1.app.run'\r\n", "INFO line 72:10: tf.estimator.DNNRegressor: Default value of loss_reduction has been changed to SUM_OVER_BATCH_SIZE; inserting old default value tf.keras.losses.Reduction.SUM.\r\n", "\r\n", "INFO line 96:2: Renamed 'tf.logging.set_verbosity' to 'tf.compat.v1.logging.set_verbosity'\r\n", "INFO line 96:27: Renamed 'tf.logging.INFO' to 'tf.compat.v1.logging.INFO'\r\n", "INFO line 97:2: Renamed 'tf.app.run' to 'tf.compat.v1.app.run'\r\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "INFO line 82:10: tf.estimator.LinearRegressor: Default value of loss_reduction has been changed to SUM_OVER_BATCH_SIZE; inserting old default value tf.keras.losses.Reduction.SUM.\r\n", "\r\n", "INFO line 105:2: Renamed 'tf.logging.set_verbosity' to 'tf.compat.v1.logging.set_verbosity'\r\n", "INFO line 105:27: Renamed 'tf.logging.INFO' to 'tf.compat.v1.logging.INFO'\r\n", "INFO line 106:2: Renamed 'tf.app.run' to 'tf.compat.v1.app.run'\r\n", "INFO line 38:8: Renamed 'tf.feature_column.input_layer' to 'tf.compat.v1.feature_column.input_layer'\r\n", "INFO line 43:10: Renamed 'tf.layers.dense' to 'tf.compat.v1.layers.dense'\r\n", "INFO line 46:17: Renamed 'tf.layers.dense' to 'tf.compat.v1.layers.dense'\r\n", "INFO line 57:17: tf.losses.mean_squared_error requires manual check. tf.losses have been replaced with object oriented versions in TF 2.0 and after. The loss function calls have been converted to compat.v1 for backward compatibility. Please update these calls to the TF 2.0 versions.\r\n", "INFO line 57:17: Renamed 'tf.losses.mean_squared_error' to 'tf.compat.v1.losses.mean_squared_error'\r\n", "INFO line 61:15: Added keywords to args of function 'tf.shape'\r\n", "INFO line 62:15: Changed tf.to_float call to tf.cast(..., dtype=tf.float32).\r\n", "INFO line 65:40: Renamed 'tf.train.AdamOptimizer' to 'tf.compat.v1.train.AdamOptimizer'\r\n", "INFO line 68:39: Renamed 'tf.train.get_global_step' to 'tf.compat.v1.train.get_global_step'\r\n", "INFO line 83:9: tf.metrics.root_mean_squared_error requires manual check. tf.metrics have been replaced with object oriented versions in TF 2.0 and after. The metric function calls have been converted to compat.v1 for backward compatibility. 
Please update these calls to the TF 2.0 versions.\r\n", "INFO line 83:9: Renamed 'tf.metrics.root_mean_squared_error' to 'tf.compat.v1.metrics.root_mean_squared_error'\r\n", "INFO line 142:23: Renamed 'tf.train.AdamOptimizer' to 'tf.compat.v1.train.AdamOptimizer'\r\n", "INFO line 162:2: Renamed 'tf.logging.set_verbosity' to 'tf.compat.v1.logging.set_verbosity'\r\n", "INFO line 162:27: Renamed 'tf.logging.INFO' to 'tf.compat.v1.logging.INFO'\r\n", "INFO line 163:2: Renamed 'tf.app.run' to 'tf.compat.v1.app.run'\r\n", "TensorFlow 2.0 Upgrade Script\r\n", "-----------------------------\r\n", "Converted 7 files\r\n", "Detected 1 issues that require attention\r\n", "--------------------------------------------------------------------------------\r\n", "--------------------------------------------------------------------------------\r\n", "File: models/samples/cookbook/regression/automobile_data.py\r\n", "--------------------------------------------------------------------------------\r\n", "models/samples/cookbook/regression/automobile_data.py:125:15: WARNING: Changing dataset.make_one_shot_iterator() to tf.compat.v1.data.make_one_shot_iterator(dataset). Please check this transformation.\r\n", "\r\n", "\r\n", "\r\n", "Make sure to read the detailed log 'tree_report.txt'\r\n", "\r\n" ] } ], "source": [ "# upgrade the .py files and copy all the other files to the outtree\n", "!tf_upgrade_v2 \\\n", " --intree models/samples/cookbook/regression/ \\\n", " --outtree regression_v2/ \\\n", " --reportfile tree_report.txt" ] }, { "cell_type": "markdown", "metadata": { "id": "2S4j7sqbSowC" }, "source": [ "Note the single warning about the `dataset.make_one_shot_iterator` function.\n", "\n", "Now the script works with TensorFlow 2.0:\n", "\n", "Note that, thanks to the `tf.compat.v1` module, the converted script will also run in TensorFlow 1.14. " ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "execution": { "iopub.execute_input": "2022-12-14T22:40:08.565833Z", "iopub.status.busy": "2022-12-14T22:40:08.565268Z", "iopub.status.idle": "2022-12-14T22:40:08.686422Z", "shell.execute_reply": "2022-12-14T22:40:08.685695Z" }, "id": "vh0cmW3y1tX9" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "/bin/bash: -c: line 0: syntax error near unexpected token `;&'\r\n", "/bin/bash: -c: line 0: `(cd regression_v2 && python custom_regression.py 2>&1) | tail'\r\n" ] } ], "source": [ "!(cd regression_v2 && python custom_regression.py 2>&1) | tail" ] },
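{ "cell_type": "markdown", "metadata": {}, "source": [ "The flagged call is worth checking by hand: the script rewrites `dataset.make_one_shot_iterator()` to the v1 compatibility helper, while idiomatic TF 2.x code simply iterates over the dataset. A minimal sketch of the two styles (illustrative, not part of this guide's sample code):\n", "\n", "```python\n", "import tensorflow as tf\n", "\n", "dataset = tf.data.Dataset.range(4)\n", "\n", "# What the upgrade script emits (flagged for manual review):\n", "iterator = tf.compat.v1.data.make_one_shot_iterator(dataset)\n", "\n", "# Idiomatic TF 2.x: iterate over the dataset directly.\n", "for element in dataset:\n", "    print(element.numpy())\n", "```" ] },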
{ "cell_type": "markdown", "metadata": { "id": "4EgZGGkdqkdC" }, "source": [ "## Detailed report\n", "\n", "The script also reports a list of detailed changes. In this example, it found one possibly unsafe transformation and included a warning at the top of the file: " ] }, { "cell_type": "code", "execution_count": 10, "metadata": { "execution": { "iopub.execute_input": "2022-12-14T22:40:08.690386Z", "iopub.status.busy": "2022-12-14T22:40:08.689874Z", "iopub.status.idle": "2022-12-14T22:40:08.809841Z", "shell.execute_reply": "2022-12-14T22:40:08.809117Z" }, "id": "CtHaZbVaNMGV" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "TensorFlow 2.0 Upgrade Script\r\n", "-----------------------------\r\n", "Converted 7 files\r\n", "Detected 1 issues that require attention\r\n", "--------------------------------------------------------------------------------\r\n", "--------------------------------------------------------------------------------\r\n", "File: models/samples/cookbook/regression/automobile_data.py\r\n", "--------------------------------------------------------------------------------\r\n", "models/samples/cookbook/regression/automobile_data.py:125:15: WARNING: Changing dataset.make_one_shot_iterator() to tf.compat.v1.data.make_one_shot_iterator(dataset). Please check this transformation.\r\n", "\r\n", "================================================================================\r\n", "Detailed log follows:\r\n", "\r\n", "================================================================================\r\n", "================================================================================\r\n", "Input tree: 'models/samples/cookbook/regression/'\r\n", "================================================================================\r\n", "--------------------------------------------------------------------------------\r\n", "Processing file 'models/samples/cookbook/regression/__init__.py'\r\n", " outputting to 'regression_v2/__init__.py'\r\n" ] } ], "source": [ "!head -n 20 tree_report.txt" ] }, { "cell_type": "markdown", "metadata": { "id": "1-UIFXP3cFSa" }, "source": [ "Note again the one warning about the `Dataset.make_one_shot_iterator` function." ] }, { "cell_type": "markdown", "metadata": { "id": "oxQeYS1TN-jv" }, "source": [ "In other cases, the output will explain the reasoning for non-trivial changes:" ] }, { "cell_type": "code", "execution_count": 11, "metadata": { "execution": { "iopub.execute_input": "2022-12-14T22:40:08.813671Z", "iopub.status.busy": "2022-12-14T22:40:08.813161Z", "iopub.status.idle": "2022-12-14T22:40:08.818362Z", "shell.execute_reply": "2022-12-14T22:40:08.817792Z" }, "id": "WQs9kEvVN9th" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Writing dropout.py\n" ] } ], "source": [ "%%writefile dropout.py\n", "import tensorflow as tf\n", "\n", "d = tf.nn.dropout(tf.range(10), 0.2)\n", "z = tf.zeros_like(d, optimize=False)" ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "execution": { "iopub.execute_input": "2022-12-14T22:40:08.821375Z", "iopub.status.busy": "2022-12-14T22:40:08.820856Z", "iopub.status.idle": "2022-12-14T22:40:08.988718Z", "shell.execute_reply": "2022-12-14T22:40:08.987914Z" }, "id": "7uOkacZsO3XX" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "/bin/bash: gt: command not found\r\n", "/bin/bash: /dev/null: Permission denied\r\n" ] } ], "source": [ "!tf_upgrade_v2 \\\n", " --infile dropout.py \\\n", " --outfile dropout_v2.py \\\n", " --reportfile dropout_report.txt > /dev/null" ] }, { "cell_type": "code", "execution_count": 13, "metadata": { "execution": { "iopub.execute_input": "2022-12-14T22:40:08.992798Z", "iopub.status.busy": "2022-12-14T22:40:08.992226Z", "iopub.status.idle": "2022-12-14T22:40:09.113701Z", "shell.execute_reply": "2022-12-14T22:40:09.112855Z" }, "id": "m-J82-scPMGl" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "cat: dropout_report.txt: No such file or directory\r\n" ] } ], "source": [ "!cat dropout_report.txt" ] }, { "cell_type": "markdown", "metadata": { "id": "DOOLN21nTGSS" }, "source": [ "Here are the modified file contents. Note how the script adds argument names to deal with moved and renamed arguments:" ] }, { "cell_type": "code", "execution_count": 14, "metadata": { "execution": { "iopub.execute_input": "2022-12-14T22:40:09.117774Z", "iopub.status.busy": "2022-12-14T22:40:09.117256Z", "iopub.status.idle": "2022-12-14T22:40:09.240565Z", "shell.execute_reply": "2022-12-14T22:40:09.239666Z" }, "id": "SrYcJk9-TFlU" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "cat: dropout_v2.py: No such file or directory\r\n" ] } ], "source": [ "!cat dropout_v2.py" ] },
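{ "cell_type": "markdown", "metadata": {}, "source": [ "(The `cat` above failed in this run. As a sketch of what the script typically produces for this file, assuming default mode: the positional `keep_prob` argument of `tf.nn.dropout` is converted to an explicit `rate`:)\n", "\n", "```python\n", "import tensorflow as tf\n", "\n", "# tf.nn.dropout's second argument changed meaning from keep_prob to rate,\n", "# so the script rewrites the value instead of just renaming the argument:\n", "d = tf.nn.dropout(tf.range(10), rate=1 - (0.2))\n", "z = tf.zeros_like(d, optimize=False)\n", "```" ] },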
"2022-12-14T22:40:09.246628Z", "iopub.status.busy": "2022-12-14T22:40:09.246085Z", "iopub.status.idle": "2022-12-14T22:40:09.439224Z", "shell.execute_reply": "2022-12-14T22:40:09.438179Z" }, "id": "uzuY-bOvYBS7" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "/bin/bash: gt: command not found\r\n", "/bin/bash: /dev/null: Permission denied\r\n" ] } ], "source": [ "!tf_upgrade_v2 \\\n", " --intree models/research/deeplab \\\n", " --outtree deeplab_v2 \\\n", " --reportfile deeplab_report.txt > /dev/null" ] }, { "cell_type": "markdown", "metadata": { "id": "FLhw3fm8drae" }, "source": [ "它会生成输出文件:" ] }, { "cell_type": "code", "execution_count": 16, "metadata": { "execution": { "iopub.execute_input": "2022-12-14T22:40:09.444067Z", "iopub.status.busy": "2022-12-14T22:40:09.443306Z", "iopub.status.idle": "2022-12-14T22:40:09.567207Z", "shell.execute_reply": "2022-12-14T22:40:09.566314Z" }, "id": "4YYLRxWJdSvQ" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "ls: cannot access 'deeplab_v2': No such file or directory\r\n" ] } ], "source": [ "!ls deeplab_v2" ] }, { "cell_type": "markdown", "metadata": { "id": "qtTC-cAZdEBy" }, "source": [ "但是其中包含错误。该报告会帮助您找到确保代码可以正常运行所需要解决的错误。下面是前三个错误:" ] }, { "cell_type": "code", "execution_count": 17, "metadata": { "execution": { "iopub.execute_input": "2022-12-14T22:40:09.571969Z", "iopub.status.busy": "2022-12-14T22:40:09.571386Z", "iopub.status.idle": "2022-12-14T22:40:09.696393Z", "shell.execute_reply": "2022-12-14T22:40:09.695336Z" }, "id": "UVTNOohlcyVZ" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "cat: deeplab_report.txt: No such file or directory\r\n" ] } ], "source": [ "!cat deeplab_report.txt | grep -i models/research/deeplab | grep -i error | head -n 3" ] }, { "cell_type": "markdown", "metadata": { "id": "gGBeDaFVRJ5l" }, "source": [ "## “安全”模式" ] }, { "cell_type": "markdown", "metadata": { "id": "BnfCxB7SVtTO" }, "source": [ "该转换脚本还有一种介入度相对较低的 `SAFETY` 模式。在此模式下,只需更改导入来使用 `tensorflow.compat.v1` 模块:" ] }, { "cell_type": "code", "execution_count": 18, "metadata": { "execution": { "iopub.execute_input": "2022-12-14T22:40:09.700973Z", "iopub.status.busy": "2022-12-14T22:40:09.700683Z", "iopub.status.idle": "2022-12-14T22:40:09.825338Z", "shell.execute_reply": "2022-12-14T22:40:09.823531Z" }, "id": "XdaVXCPWQCC5" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "import tensorflow as tf\r\n", "\r\n", "d = tf.nn.dropout(tf.range(10), 0.2)\r\n", "z = tf.zeros_like(d, optimize=False)\r\n" ] } ], "source": [ "!cat dropout.py" ] }, { "cell_type": "code", "execution_count": 19, "metadata": { "execution": { "iopub.execute_input": "2022-12-14T22:40:09.832871Z", "iopub.status.busy": "2022-12-14T22:40:09.832111Z", "iopub.status.idle": "2022-12-14T22:40:10.017972Z", "shell.execute_reply": "2022-12-14T22:40:10.017068Z" }, "id": "c0tvRJLGRYEb" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "/bin/bash: gt: command not found\r\n", "/bin/bash: /dev/null: Permission denied\r\n" ] } ], "source": [ "!tf_upgrade_v2 --mode SAFETY --infile dropout.py --outfile dropout_v2_safe.py > /dev/null" ] }, { "cell_type": "code", "execution_count": 20, "metadata": { "execution": { "iopub.execute_input": "2022-12-14T22:40:10.022692Z", "iopub.status.busy": "2022-12-14T22:40:10.021988Z", "iopub.status.idle": "2022-12-14T22:40:10.147428Z", "shell.execute_reply": "2022-12-14T22:40:10.145479Z" }, "id": "91suN2RaRfIV" }, "outputs": [ { "name": "stdout", "output_type": "stream", 
"text": [ "cat: dropout_v2_safe.py: No such file or directory\r\n" ] } ], "source": [ "!cat dropout_v2_safe.py" ] }, { "cell_type": "markdown", "metadata": { "id": "EOzTF7xbZqqW" }, "source": [ "如您所见,这不会升级代码,但允许 TensorFlow 1 代码在 TensorFlow 2 中运行" ] }, { "cell_type": "markdown", "metadata": { "id": "jGfXVApkqkdG" }, "source": [ "## 注意事项\n", "\n", "- 在运行此脚本之前,不要手动更新代码的某些部分。尤其是更改了参数顺序的函数(如 `tf.argmax` 或 `tf.batch_to_space`),否则会导致代码无法正确添加与现有代码匹配的关键字参数。\n", "\n", "- 该脚本假定使用 `import tensorflow as tf` 导入 `tensorflow`。\n", "\n", "- 该脚本不会更改参数顺序,但是会将关键字参数添加到本身已更改参数顺序的函数。\n", "\n", "- 请查阅 [tf2up.ml](https://github.com/lc0/tf2up),找到一款方便的工具来升级 GitHub 仓库中的 Jupyter 笔记本和 Python 文件。\n", "\n", "要报告升级脚本错误或提出功能请求,请在 [GitHub](https://github.com/tensorflow/tensorflow/issues) 上提交问题。如果您在测试 TensorFlow 2.0,我们非常希望了解您的反馈意见!请加入 [TF 2.0 测试社区](https://groups.google.com/a/tensorflow.org/forum/#!forum/testing),将您的问题和讨论发送到 [testing@tensorflow.org](mailto:testing@tensorflow.org)。" ] } ], "metadata": { "colab": { "collapsed_sections": [], "name": "upgrade.ipynb", "toc_visible": true }, "kernelspec": { "display_name": "Python 3", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.9.16" } }, "nbformat": 4, "nbformat_minor": 0 }