{ "cells": [ { "cell_type": "markdown", "metadata": { "id": "6bYaCABobL5q" }, "source": [ "##### Copyright 2018 The TensorFlow Authors." ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "cellView": "form", "execution": { "iopub.execute_input": "2024-07-19T01:47:14.946467Z", "iopub.status.busy": "2024-07-19T01:47:14.946038Z", "iopub.status.idle": "2024-07-19T01:47:14.949952Z", "shell.execute_reply": "2024-07-19T01:47:14.949346Z" }, "id": "FlUw7tSKbtg4" }, "outputs": [], "source": [ "#@title Licensed under the Apache License, Version 2.0 (the \"License\");\n", "# you may not use this file except in compliance with the License.\n", "# You may obtain a copy of the License at\n", "#\n", "# https://www.apache.org/licenses/LICENSE-2.0\n", "#\n", "# Unless required by applicable law or agreed to in writing, software\n", "# distributed under the License is distributed on an \"AS IS\" BASIS,\n", "# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n", "# See the License for the specific language governing permissions and\n", "# limitations under the License." ] }, { "cell_type": "markdown", "metadata": { "id": "08OTcmxgqkc2" }, "source": [ "# Automatically rewrite TF 1.x and compat.v1 API symbols\n", "\n", "
\n",
" \n",
" ![]() | \n",
" \n",
" \n",
" ![]() | \n",
" \n",
" \n",
" ![]() | \n",
" \n",
" \n",
" ![]() | \n",
"
\n", "tf_upgrade_v2 \\\n", " --intree my_project/ \\\n", " --outtree my_project_v2/ \\\n", " --reportfile report.txt\n", "\n", "\n", "It will accelerate your upgrade process by converting existing TensorFlow 1.x Python scripts to TensorFlow 2.x.\n", "\n", "The conversion script automates many mechanical API transformations, though many APIs cannot be automatically migrated. It is also not able to fully make your code compatible with TF2 behaviors and APIs. So, it is only a part of your migration journey." ] }, { "cell_type": "markdown", "metadata": { "id": "gP9v2vgptdfi" }, "source": [ "## Compatibility modules\n", "\n", "Certain API symbols can not be upgraded simply by using a string replacement. Those that cannot be automatically upgraded will be mapped to their locations in the `compat.v1` module. This module replaces TF 1.x symbols like `tf.foo` with the equivalent `tf.compat.v1.foo` reference. If you are already using `compat.v1` APIs by importing TF via `import tensorflow.compat.v1 as tf`, the `tf_upgrade_v2` script will attempt to convert these usages to the non-compat APIs where possible. Note that while some `compat.v1` APIs are compatible with TF2.x behaviors, many are not. Therefore, it's recommended to manually proofread replacements and migrate them to new APIs in the `tf.*` namespace instead of `tf.compat.v1` namespace as quickly as possible.\n", "\n", "Because of TensorFlow 2.x module deprecations (for example, `tf.flags` and `tf.contrib`), some changes can not be worked around by switching to `compat.v1`. Upgrading this code may require using an additional library (for example, [`absl.flags`](https://github.com/abseil/abseil-py)) or switching to a package in [tensorflow/addons](http://www.github.com/tensorflow/addons).\n" ] }, { "cell_type": "markdown", "metadata": { "id": "s78bbfjkXYb7" }, "source": [ "## Recommended upgrade process\n", "\n", "The rest of this guide demonstrates how to use the symbol-rewriting script. While the script is easy to use, it is strongly recommended that you use the script as part of the following process: \n", "\n", "1. **Unit Test**: Ensure that the code you’re upgrading has a unit test suite with reasonable coverage. This is Python code, so the language won’t protect you from many classes of mistakes. Also ensure that any dependency you have has already been upgraded to be compatible with TensorFlow 2.x.\n", "\n", "1. **Install TensorFlow 1.15**: Upgrade your TensorFlow to the latest TensorFlow 1.x version, at least 1.15. This includes the final TensorFlow 2.0 API in `tf.compat.v2`.\n", "\n", "1. **Test With 1.15**: Ensure your unit tests pass at this point. You’ll be running them repeatedly as you upgrade so starting from green is important.\n", "\n", "1. **Run the upgrade script**: Run `tf_upgrade_v2` on your entire source tree, tests included. This will upgrade your code to a format where it only uses symbols available in TensorFlow 2.0. Deprecated symbols will be accessed with `tf.compat.v1`. These will eventually require manual attention, but not immediately.\n", "\n", "1. **Run the converted tests with TensorFlow 1.15**: Your code should still run fine in TensorFlow 1.15. Run your unit tests again. Any error in your tests here means there’s a bug in the upgrade script. [Please let us know](https://github.com/tensorflow/tensorflow/issues).\n", "\n", "1. **Check the upgrade report for warnings and errors**: The script writes a report file that explains any conversions you should double-check, or any manual action you need to take. 
{ "cell_type": "markdown", "metadata": { "id": "s78bbfjkXYb7" }, "source": [ "## Recommended upgrade process\n", "\n", "The rest of this guide demonstrates how to use the symbol-rewriting script. While the script is easy to use, it is strongly recommended that you use it as part of the following process:\n", "\n", "1. **Unit Test**: Ensure that the code you’re upgrading has a unit test suite with reasonable coverage. This is Python code, so the language won’t protect you from many classes of mistakes. Also ensure that any dependency you have has already been upgraded to be compatible with TensorFlow 2.x.\n", "\n", "1. **Install TensorFlow 1.15**: Upgrade your TensorFlow to the latest TensorFlow 1.x version, at least 1.15. This includes the final TensorFlow 2.0 API in `tf.compat.v2`.\n", "\n", "1. **Test with 1.15**: Ensure your unit tests pass at this point. You’ll be running them repeatedly as you upgrade, so starting from green is important.\n", "\n", "1. **Run the upgrade script**: Run `tf_upgrade_v2` on your entire source tree, tests included. This will upgrade your code to a format where it only uses symbols available in TensorFlow 2.0. Deprecated symbols will be accessed with `tf.compat.v1`. These will eventually require manual attention, but not immediately.\n", "\n", "1. **Run the converted tests with TensorFlow 1.15**: Your code should still run fine in TensorFlow 1.15. Run your unit tests again. Any error in your tests here means there’s a bug in the upgrade script. [Please let us know](https://github.com/tensorflow/tensorflow/issues).\n", "\n", "1. **Check the upgrade report for warnings and errors**: The script writes a report file that explains any conversions you should double-check, and any manual actions you need to take. For example: any remaining instances of `contrib` will require manual action to remove. Please consult [the RFC for more instructions](https://github.com/tensorflow/community/blob/master/rfcs/20180907-contrib-sunset.md).\n", "\n", "1. **Install TensorFlow 2.x**: At this point it should be safe to switch to TensorFlow 2.x binaries, even if you are running with legacy behaviors.\n", "\n", "1. **Test with `v1.disable_v2_behavior`**: Re-running your tests with `v1.disable_v2_behavior()` in the tests’ main function should give the same results as running under 1.15 (a minimal sketch follows this list).\n", "\n", "1. **Enable V2 behavior**: Now that your tests work with the TF2 binaries, you can begin migrating your code to avoid `tf.estimator` and to use only supported TF2 behaviors (with no TF2 behavior disabling). See the [Migration guides](https://tensorflow.org/guide/migrate) for details." ] },
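{ "cell_type": "markdown", "metadata": {}, "source": [ "As a minimal sketch of the `disable_v2_behavior` step, assuming your converted tests use `tf.test.TestCase` (the test class and assertion below are hypothetical, not part of this notebook):\n", "\n", "```python\n", "# Hypothetical converted test file: runs under the TF2 binary while\n", "# keeping TF 1.x behaviors, so results should match the 1.15 run.\n", "import tensorflow as tf\n", "import tensorflow.compat.v1 as v1\n", "\n", "v1.disable_v2_behavior()  # keep graph-mode/TF1.x semantics under the TF2 binary\n", "\n", "\n", "class ConvertedCodeTest(tf.test.TestCase):\n", "\n", "  def test_loss_is_non_negative(self):\n", "    # Illustrative assertion only; substitute your real test logic.\n", "    with self.cached_session() as sess:\n", "      loss = v1.losses.mean_squared_error(labels=[1.0, 2.0],\n", "                                          predictions=[1.5, 1.5])\n", "      self.assertGreaterEqual(sess.run(loss), 0.0)\n", "\n", "\n", "if __name__ == \"__main__\":\n", "  tf.test.main()\n", "```\n", "\n", "If the tests pass here but fail once `disable_v2_behavior()` is removed, the difference usually comes from TF2 behavior changes rather than from the symbol rewrite itself." ] },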
{ "cell_type": "markdown", "metadata": { "id": "6pwSAQEwvscP" }, "source": [ "## Using the symbol-rewriting `tf_upgrade_v2` script\n" ] }, { "cell_type": "markdown", "metadata": { "id": "I9NCvDt5GwX4" }, "source": [ "### Setup\n", "\n", "Before getting started ensure that TensorFlow 2.x is installed." ] },
{ "cell_type": "code", "execution_count": 2, "metadata": { "execution": { "iopub.execute_input": "2024-07-19T01:47:14.954050Z", "iopub.status.busy": "2024-07-19T01:47:14.953511Z", "iopub.status.idle": "2024-07-19T01:47:17.294247Z", "shell.execute_reply": "2024-07-19T01:47:17.293542Z" }, "id": "DWVYbvi1WCeY" }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "2024-07-19 01:47:15.203472: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:485] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered\n", "2024-07-19 01:47:15.224820: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:8454] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered\n", "2024-07-19 01:47:15.231381: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1452] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2.17.0\n" ] } ], "source": [ "import tensorflow as tf\n", "\n", "print(tf.__version__)" ] },
{ "cell_type": "markdown", "metadata": { "id": "Ycy3B5PNGutU" }, "source": [ "Clone the [tensorflow/models](https://github.com/tensorflow/models) git repository so you have some code to test on:" ] },
{ "cell_type": "code", "execution_count": 3, "metadata": { "execution": { "iopub.execute_input": "2024-07-19T01:47:17.297975Z", "iopub.status.busy": "2024-07-19T01:47:17.297337Z", "iopub.status.idle": "2024-07-19T01:47:28.872883Z", "shell.execute_reply": "2024-07-19T01:47:28.871987Z" }, "id": "jyckoWyAZEhZ" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Cloning into 'models'...\r\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "remote: Enumerating objects: 2927, done.\u001b[K\r\n", "remote: Counting objects: 100% (2927/2927), done.\u001b[K\r\n", "remote: Compressing objects: 100% (2428/2428), done.\u001b[K\r\n", "remote: Total 2927 (delta 503), reused 2114 (delta 424), pack-reused 0\u001b[K\r\n", "Receiving objects: 100% (2927/2927), 369.04 MiB | 57.59 MiB/s, done.\r\n", "Resolving deltas: 100% (503/503), done.\r\n", "Updating files: 100% (2768/2768), done.\r\n" ] } ], "source": [ "!git 
clone --branch r1.13.0 --depth 1 https://github.com/tensorflow/models" ] }, { "cell_type": "markdown", "metadata": { "id": "wfHOhbkgvrKr" }, "source": [ "### Read the help\n", "\n", "The script should be installed with TensorFlow. Here is the builtin help:" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "execution": { "iopub.execute_input": "2024-07-19T01:47:28.877440Z", "iopub.status.busy": "2024-07-19T01:47:28.876829Z", "iopub.status.idle": "2024-07-19T01:47:31.688899Z", "shell.execute_reply": "2024-07-19T01:47:31.687940Z" }, "id": "m2GF-tlntqTQ" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "2024-07-19 01:47:29.222738: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:485] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered\r\n", "2024-07-19 01:47:29.241645: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:8454] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered\r\n", "2024-07-19 01:47:29.247355: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1452] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered\r\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "usage: tf_upgrade_v2 [-h] [--infile INPUT_FILE] [--outfile OUTPUT_FILE]\r\n", " [--intree INPUT_TREE] [--outtree OUTPUT_TREE]\r\n", " [--copyotherfiles COPY_OTHER_FILES] [--inplace]\r\n", " [--no_import_rename] [--no_upgrade_compat_v1_import]\r\n", " [--reportfile REPORT_FILENAME] [--mode {DEFAULT,SAFETY}]\r\n", " [--print_all]\r\n", "\r\n", "Convert a TensorFlow Python file from 1.x to 2.0\r\n", "\r\n", "Simple usage:\r\n", " tf_upgrade_v2.py --infile foo.py --outfile bar.py\r\n", " tf_upgrade_v2.py --infile foo.ipynb --outfile bar.ipynb\r\n", " tf_upgrade_v2.py --intree ~/code/old --outtree ~/code/new\r\n", "\r\n", "optional arguments:\r\n", " -h, --help show this help message and exit\r\n", " --infile INPUT_FILE If converting a single file, the name of the file to\r\n", " convert\r\n", " --outfile OUTPUT_FILE\r\n", " If converting a single file, the output filename.\r\n", " --intree INPUT_TREE If converting a whole tree of files, the directory to\r\n", " read from (relative or absolute).\r\n", " --outtree OUTPUT_TREE\r\n", " If converting a whole tree of files, the output\r\n", " directory (relative or absolute).\r\n", " --copyotherfiles COPY_OTHER_FILES\r\n", " If converting a whole tree of files, whether to copy\r\n", " the other files.\r\n", " --inplace If converting a set of files, whether to allow the\r\n", " conversion to be performed on the input files.\r\n", " --no_import_rename Not to rename import to compat.v2 explicitly.\r\n", " --no_upgrade_compat_v1_import\r\n", " If specified, don't upgrade explicit imports of\r\n", " `tensorflow.compat.v1 as tf` to the v2 APIs.\r\n", " Otherwise, explicit imports of the form\r\n", " `tensorflow.compat.v1 as tf` will be upgraded.\r\n", " --reportfile REPORT_FILENAME\r\n", " The name of the file where the report log is\r\n", " stored.(default: report.txt)\r\n", " --mode {DEFAULT,SAFETY}\r\n", " Upgrade script mode. Supported modes: DEFAULT: Perform\r\n", " only straightforward conversions to upgrade to 2.0. In\r\n", " more difficult cases, switch to use compat.v1. 
SAFETY:\r\n", " Keep 1.* code intact and import compat.v1 module.\r\n", " --print_all Print full log to stdout instead of just printing\r\n", " errors\r\n" ] } ], "source": [ "!tf_upgrade_v2 -h" ] }, { "cell_type": "markdown", "metadata": { "id": "se9Leqjm1CZR" }, "source": [ "### Example TF1 code" ] }, { "cell_type": "markdown", "metadata": { "id": "whD5i36s1SuM" }, "source": [ "Here is a simple TensorFlow 1.0 script:" ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "execution": { "iopub.execute_input": "2024-07-19T01:47:31.693912Z", "iopub.status.busy": "2024-07-19T01:47:31.693145Z", "iopub.status.idle": "2024-07-19T01:47:31.825715Z", "shell.execute_reply": "2024-07-19T01:47:31.824917Z" }, "id": "mhGbYQ9HwbeU" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " # Calculate loss using mean squared error\r\n", " average_loss = tf.losses.mean_squared_error(labels, predictions)\r\n", "\r\n", " # Pre-made estimators use the total_loss instead of the average,\r\n", " # so report total_loss for compatibility.\r\n", " batch_size = tf.shape(labels)[0]\r\n", " total_loss = tf.to_float(batch_size) * average_loss\r\n", "\r\n", " if mode == tf.estimator.ModeKeys.TRAIN:\r\n", " optimizer = params.get(\"optimizer\", tf.train.AdamOptimizer)\r\n" ] } ], "source": [ "!head -n 65 models/samples/cookbook/regression/custom_regression.py | tail -n 10" ] }, { "cell_type": "markdown", "metadata": { "id": "UGO7xSyL89wX" }, "source": [ "With TensorFlow 2.x installed it does not run:" ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "execution": { "iopub.execute_input": "2024-07-19T01:47:31.829970Z", "iopub.status.busy": "2024-07-19T01:47:31.829260Z", "iopub.status.idle": "2024-07-19T01:47:34.620950Z", "shell.execute_reply": "2024-07-19T01:47:34.620042Z" }, "id": "TD7fFphX8_qE" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "2024-07-19 01:47:32.174663: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:485] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered\r\n", "2024-07-19 01:47:32.193806: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:8454] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered\r\n", "2024-07-19 01:47:32.199390: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1452] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered\r\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Traceback (most recent call last):\r\n", " File \"/tmpfs/src/temp/site/en/guide/migrate/models/samples/cookbook/regression/custom_regression.py\", line 162, in