source code: py/hwid/service/appengine/README.md
# HWID Service
This folder contains the files required to deploy the HWID Service on AppEngine. Most of the files are ported from the HWID Server, with modifications to adapt them to the factory repository.
## Design

See the original HWID Server Arch Overview and Design Doc.
## Important Files

- `app.yaml`: Config file for deploying the service on AppEngine.
- `cron.yaml`: Config file for deploying cron jobs on AppEngine.
- `${factory_dir}/deploy/cros_hwid_service.sh`: The main script for deploying and testing the HWID Service. Run `cros_hwid_service.sh` for usage details.
- `appengine_config.py`: The first file loaded on AppEngine.
- `app.py`: The API entry point; it defines the API handlers.
- `hwid_api.py`: The implementation of the HWID API functions.
## Run and Deploy

### Environments
There are three environments to deploy to:

**prod**

- GCP project name: chromeos-hwid
- AppEngine APP ID: s~chromeos-hwid
- AppEngine URL: https://chromeos-hwid.appspot.com
- AppEngine Management Page: https://appengine.google.com/?&app_id=s~chromeos-hwid
- Cloud Storage Bucket: https://console.developers.google.com/storage/chromeoshwid/
- Borgcron Job Sigma: http://sigma/jobs/chromeoshwid
- Endpoint URL: https://chromeos-hwid.appspot.com/api/chromeoshwid/v1/

**staging**

- GCP project name: chromeos-hwid-staging
- AppEngine APP ID: s~chromeos-hwid-staging
- AppEngine URL: https://chromeos-hwid-staging.appspot.com/
- AppEngine Management Page: https://appengine.google.com/?app_id=s~chromeos-hwid-staging
- Cloud Storage Bucket: https://console.developers.google.com/storage/chromeoshwid-staging/
- Borgcron Job Sigma: N/A (job only exists for prod)
- Endpoint URL: https://chromeos-hwid-staging.appspot.com/api/chromeoshwid/v1/

**local**

- GCP project name: N/A
- AppEngine APP ID: N/A
- AppEngine URL: N/A
- Cloud Storage Bucket: https://console.developers.google.com/storage/chromeoshwid-dev/ (note: only the server is local; the bucket is not)
- Borgcron Job Sigma: N/A
- Endpoint URL: http://localhost:8080/api/chromeoshwid/v1/
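For convenience, the per-environment endpoint URLs above can be captured in a small shell helper. This is just a sketch: `endpoint_url` is a hypothetical function name, not part of the deploy tooling; the URLs are taken verbatim from the list above.

```shell
# Hypothetical helper: map an environment name to its endpoint URL,
# using the values from the environment list above.
endpoint_url() {
  case "$1" in
    prod)    echo "https://chromeos-hwid.appspot.com/api/chromeoshwid/v1/" ;;
    staging) echo "https://chromeos-hwid-staging.appspot.com/api/chromeoshwid/v1/" ;;
    local)   echo "http://localhost:8080/api/chromeoshwid/v1/" ;;
    *)       echo "unknown environment: $1" >&2; return 1 ;;
  esac
}

endpoint_url staging
# -> https://chromeos-hwid-staging.appspot.com/api/chromeoshwid/v1/
```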
### AppEngine Deployment Flow

#### Run Local Server
Download the database:

- Redis:
  1. GCP console -> Memorystore -> Redis -> Export -> Select bucket -> Export
  2. GCP console -> Cloud Storage -> Select bucket -> Download the Redis RDB file
- Datastore:
  1. GCP console -> Datastore -> Export -> Select namespace="all namespaces" -> Select bucket -> Export
  2. GCP console -> Cloud Storage -> Select bucket -> Download the Datastore folder
- Impersonated service account (for Gerrit access):
  - GCP console -> IAM -> Add your LDAP as "Service Account Token Creator"
Run the local server:

```shell
export REDIS_RDB=${redis_rdb}
export DATASTORE=${datastore}
deploy/cros_hwid_service.sh build
deploy/cros_hwid_service.sh deploy local
```
#### Deploy Staging/Prod

Make sure the contents of the three repos are what you want. Normally we use ToT: run `repo sync` in each repo.

Make sure the endpoint config is up-to-date. If the interface has not changed, you can skip this step and the deployment script will find the latest version of the config.
```shell
# As the endpoint interface changes, you may need to regenerate the JSON
# config of the Open API settings. Note that ${endpoint_service_name} here is
# the AppEngine URL mentioned above without the `https://` scheme prefix.
cd ${appengine_dir}
PYTHONPATH=../../../../build/hwid/protobuf_out \
  python ../../../../build/hwid/lib/endpoints/endpointscfg.py \
    get_openapi_spec hwid_api.HwidApi --hostname "${endpoint_service_name}"

# You can then deploy the generated config file `chromeoshwidv1openapi.json`.
gcloud endpoints services deploy chromeoshwidv1openapi.json
```
Before deploying to `prod`, you have to deploy to `staging`:

```shell
# If you are using Google Cloud Platform for the first time, you may have to
# install the gcloud SDK (https://cloud.google.com/sdk/install). gcloud may
# ask you to register or log in; please enter your Google domain account. It
# may also ask you to register or log in to a GCP project; you can use
# 'chromeos-factory'. The deploy script will choose the right GCP project to
# deploy to.
deploy/cros_hwid_service.sh deploy staging
```
Make sure all tests pass:

```shell
# In chroot: unit tests
make test

# Out of chroot: integration tests and e2e tests
# - The integration test builds a docker image, which may take a long time the
#   first time you run this script.
# - The e2e test list is at http://go/factory-private-git
deploy/cros_hwid_service.sh test
```
If all tests pass, deploy the HWID Service to `prod`:

```shell
deploy/cros_hwid_service.sh deploy prod
```

Open the AppEngine management page and verify that traffic is not blocked.
## Invoking API

Before invoking the API, you should add your LDAP to `client_allowlist` in
`$factory-private/config/hwid/service/appengine/configurations.yaml`, and
deploy the AppEngine app again.
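The actual schema of `configurations.yaml` lives in the factory-private repo; as a rough, hypothetical illustration (the key placement and account value below are placeholders, not real entries), an allowlist entry might look like:

```yaml
# Hypothetical sketch only; check the real schema in
# $factory-private/config/hwid/service/appengine/configurations.yaml.
client_allowlist:
  - your-ldap@google.com
```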
### Local

Example request for the local server.

Shell:
```shell
HWID_API_MESSAGES_PACKAGE="cros.factory.hwid.service.appengine.proto.hwid_api_messages_pb2"
PROTO_PATH="${FACTORY_REPO}/py/hwid/service/appengine/proto"
PROTO_FILE="${PROTO_PATH}/hwid_api_messages.proto"

bom_request() {
  local request="$(echo -e "hwid: 'DRALLION360-ZZCR A3B-A3G-D4Y-Q8I-A9W'\nverbose: true" | \
    protoc --encode "${HWID_API_MESSAGES_PACKAGE}.BomRequest" \
      --proto_path="${PROTO_PATH}" "${PROTO_FILE}" | base64)"
  echo "${request}" | base64 -d | \
    curl -s -XPOST localhost:5000/_ah/stubby/HwidService.GetBom \
      --data-binary @- | \
    protoc --decode "${HWID_API_MESSAGES_PACKAGE}.BomResponse" \
      --proto_path="${PROTO_PATH}" "${PROTO_FILE}"
}
```
Python:

```python
# Copy the generated hwid_api_messages_pb2.py to local first.
import urllib.request

import hwid_api_messages_pb2


def get_bom():
  msg = hwid_api_messages_pb2.BomRequest(
      hwid='DRALLION360-ZZCR A3B-A3G-D4Y-Q8I-A9W',
      verbose=True)
  payload = msg.SerializeToString()
  req = urllib.request.Request(
      'http://localhost:5000/_ah/stubby/HwidService.GetBom', data=payload)
  with urllib.request.urlopen(req) as fp:
    resp = hwid_api_messages_pb2.BomResponse()
    resp.ParseFromString(fp.read())
    print(resp)
```
### AppEngine

Example request for the staging/e2e/prod environment:
```shell
# usage:
#   deploy/cros_hwid_service.sh request [prod|e2e|staging] ${proto_file} ${api}
#
# The input should be in prototxt format. See the definitions in
# `py/hwid/service/appengine/proto`.
$ cat > /tmp/request.txt << EOF
hwid: "AKALI C5B-A4B-E3K-62Q-A8E"
verbose: true
EOF
$ ./deploy/cros_hwid_service.sh request staging hwid_api_messages \
    HwidService.GetBom < /tmp/request.txt
```
## Test

```shell
cros_sdk make -C ../platform/factory test  # factory unit tests
./deploy/cros_hwid_service.sh test         # integration tests and e2e tests
```
## Log

To view the logs, go to the AppEngine Management Page -> Versions -> Diagnose -> Logs.
## HWID Database Ingestion Pipeline

The ingestion pipeline gives AppEngine access to the HWID Database on Gerrit. There are two stages in the pipeline:
1. **Borgcron Job Ingestion**: Uploads the HWID Database from Gerrit to the BigStore bucket `[bucket]/staging` every day. (code: http://go/chromeos-hwid-ingestion)
2. **AppEngine Cronjob Ingestion**: Validates the HWID Database files in the BigStore bucket `[bucket]/staging`. If a file passes validation, it is moved from `[bucket]/staging` to `[bucket]/live`.
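The second stage can be sketched in Python as follows. This is an illustrative sketch only, not the actual cronjob code: the function and parameter names are ours, `validate` stands in for the real HWID Database validation, and the actual Cloud Storage move is elided.

```python
def promote_validated(staged_paths, validate):
  """Sketch of stage 2 (AppEngine cronjob ingestion).

  For each HWID Database file under [bucket]/staging, run validation;
  files that pass are promoted to [bucket]/live. Returns a mapping of
  source path to destination path for the files that should move.
  """
  moves = {}
  for path in staged_paths:
    if path.startswith('staging/') and validate(path):
      moves[path] = 'live/' + path[len('staging/'):]
  return moves


# Example: only the file that passes validation is promoted.
moves = promote_validated(
    ['staging/PROJ_A', 'staging/PROJ_B'],
    validate=lambda path: path.endswith('PROJ_A'))
print(moves)  # {'staging/PROJ_A': 'live/PROJ_A'}
```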
### Borgcron Job Deployment

The borgcron job periodically (every 24 hours) uploads the latest HWID Database from Git (via gerrit-on-borg) to the cloud buckets. Since it is a borgcron job, this part is not ported to the factory repository. To modify the code and deploy it, refer to http://go/hwid-server-arch -> Run & deploy -> Deploying the borgcron job.