repo | instance_id | base_commit | patch | test_patch | problem_statement | hints_text | created_at | version | FAIL_TO_PASS | PASS_TO_PASS | environment_setup_commit | traceback | __index_level_0__
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
docker/compose | docker__compose-5588 | 872538581335ca04c78d48ea7a1404373a731fad | diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -99,5 +99,6 @@ def find_version(*file_paths):
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.4',
+ 'Programming Language :: Python :: 3.6',
],
)
| docker-compose.exe up: UnicodeDecodeError: 'ascii' codec can't decode byte 0xe4 in position 10: ordinal not in range(128) docker-compose returned -1
I'm getting following error while trying to do docker-compose up on Windows 10 machine:
```
docker-compose.exe up
Traceback (most recent call last):
File "<string>", line 3, in <module>
File "compose\cli\main.py", line 61, in main
File "compose\cli\main.py", line 110, in perform_command
File "compose\cli\command.py", line 35, in project_from_options
File "compose\cli\command.py", line 102, in get_project
File "compose\config\config.py", line 319, in load
File "compose\config\config.py", line 409, in load_services
File "compose\config\config.py", line 388, in build_services
File "compose\config\config.py", line 373, in build_service
File "compose\config\config.py", line 612, in process_service
File "compose\config\config.py", line 856, in resolve_volume_paths
File "compose\config\config.py", line 866, in resolve_volume_path
File "ntpath.py", line 311, in expanduser
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe4 in position 10: ordinal not in range(128)
docker-compose returned -1
```
Should be the latest release:
```
docker-compose.exe version
docker-compose version 1.8.0, build d988a55
docker-py version: 1.9.0
CPython version: 2.7.11
OpenSSL version: OpenSSL 1.0.2d 9 Jul 2015
```
It's probably caused by my username, which contains the character "ä". There are many similar issues that seem to have been fixed, but unfortunately not for me.
| By the sound of [this bug report](http://bugs.python.org/issue13207), it's a Python stdlib issue that's fixed in Python 3 and marked as WONTFIX for Python 2. So we can only fix this by moving Compose to Python 3.
@dnephin @shin- have we tried building Python 3 binaries yet?
Are there any workarounds or I just need to wait until Python 3 build is available?
@rang501 It may sound like a hassle, but you could change your username to contain only ASCII characters.
> have we tried building Python 3 binaries yet?
I tried it a while ago. It was working for linux, but failing for windows on appveyor. I tried to get it building, but I don't think I ever got it working. I think it was a pyinstaller issue, so it might be fixed now.
We could encode the values before passing them to the stdlib functions, we do that in other places.
I'm having a similar issue:
```
C:\dev\environment-containers\elk>docker-compose -f local-compose.yml up —force
Traceback (most recent call last):
File "<string>", line 3, in <module>
File "compose\cli\main.py", line 62, in main
File "compose\cli\main.py", line 114, in perform_command
File "compose\cli\main.py", line 835, in up
File "compose\project.py", line 379, in up
File "compose\project.py", line 177, in get_services_without_duplicate
File "compose\project.py", line 165, in get_services
File "compose\project.py", line 136, in get_service
File "compose\project.py", line 555, in __init__
UnicodeDecodeError: 'ascii' codec can't decode byte 0x97 in position 0: ordinal not in range(128)
docker-compose returned -1
```
```
C:\dev\environment-containers\elk>docker version
Client:
Version: 1.12.3
API version: 1.24
Go version: go1.6.3
Git commit: 6b644ec
Built: Wed Oct 26 23:26:11 2016
OS/Arch: windows/amd64
Server:
Version: 1.12.3
API version: 1.24
Go version: go1.6.3
Git commit: 6b644ec
Built: Wed Oct 26 23:26:11 2016
OS/Arch: linux/amd64
C:\dev\environment-containers\elk>docker-compose version
docker-compose version 1.8.1, build 004ddae
docker-py version: 1.10.3
CPython version: 2.7.12
OpenSSL version: OpenSSL 1.0.2h 3 May 2016
C:\dev\environment-containers\elk>ver
Microsoft Windows [Version 10.0.14393]
C:\dev\environment-containers\elk>python --version
Python 2.7.9
```
I have this problem too.
Everything is up to date!
Is there any workaround for this?
> Severity Code Description Project File Line Suppression State
> Error MSB4018 The "PrepareForLaunch" task failed unexpectedly.
> Microsoft.DotNet.Docker.CommandLineClientException: Traceback (most recent call last):
> File "<string>", line 3, in <module>
> File "compose\cli\main.py", line 64, in main
> File "compose\cli\main.py", line 113, in perform_command
> File "compose\cli\command.py", line 36, in project_from_options
> File "compose\cli\command.py", line 103, in get_project
> File "compose\config\config.py", line 331, in load
> File "compose\config\config.py", line 432, in load_services
> File "compose\config\config.py", line 411, in build_services
> File "compose\config\config.py", line 396, in build_service
> File "compose\config\config.py", line 641, in process_service
> File "compose\config\config.py", line 945, in resolve_volume_paths
> File "compose\config\config.py", line 955, in resolve_volume_path
> File "ntpath.py", line 311, in expanduser
> UnicodeDecodeError: 'ascii' codec can't decode byte 0xdf in position 18: ordinal not in range(128)
> docker-compose returned -1
> at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
> at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
> at Microsoft.DotNet.Docker.DockerComposeClientExtensions.<DownAsync>d__3.MoveNext()
> --- End of stack trace from previous location where exception was thrown ---
> at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
> at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
> at Microsoft.DotNet.Docker.BuildTasks.PrepareForLaunch.<ExecuteAsync>d__0.MoveNext()
> --- End of stack trace from previous location where exception was thrown ---
> at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
> at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
> at Microsoft.DotNet.Docker.BuildTasks.DockerBaseTask.Execute()
> at Microsoft.Build.BackEnd.TaskExecutionHost.Microsoft.Build.BackEnd.ITaskExecutionHost.Execute()
> at Microsoft.Build.BackEnd.TaskBuilder.<ExecuteInstantiatedTask>d__26.MoveNext() D:\DEV\Spielwiese\dotNET_core_Hello_Docker_vs15\src\dotNET_core_Hello_Docker_vs15\dotNET_core_Hello_Docker_vs15.xproj C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v14.0\DotNet\Microsoft.DotNet.Publishing\ImportAfter\Microsoft.DotNet.Docker.targets 70
> C:\Users\David.Weiß>docker version
> Client:
> Version: 1.13.0
> API version: 1.25
> Go version: go1.7.3
> Git commit: 49bf474
> Built: Wed Jan 18 16:20:26 2017
> OS/Arch: windows/amd64
>
> Server:
> Version: 1.13.0
> API version: 1.25 (minimum version 1.12)
> Go version: go1.7.3
> Git commit: 49bf474
> Built: Wed Jan 18 16:20:26 2017
> OS/Arch: linux/amd64
> Experimental: true
> C:\Users\David.Weiß>docker-compose version
> docker-compose version 1.10.0, build 4bd6f1a0
> docker-py version: 2.0.1
> CPython version: 2.7.13
> OpenSSL version: OpenSSL 1.0.2j 26 Sep 2016
> C:\Users\David.Weiß>ver
>
> Microsoft Windows [Version 10.0.14393]
@davweiss It's [a bug in Python 2.7](http://bugs.python.org/issue13207). Your options are:
- Install docker-compose in a Python 3 environment
- Change your Windows username to use only ASCII characters.
Hi @shin-
I have the same problem, and I've installed Python 3 in my Windows PATH, with no result.
@LeG3nDz It's probably a different error then. Can you share the stacktrace and which command you were running?
Hi @shin-
Finally, I changed my Windows username :( I recreated a new user with only ASCII characters.
Thank you for the solution.
Because you can choose a different install path when installing Docker Toolbox, and a non-default path can affect docker-compose.exe's execution, you MUST MODIFY %yourpath\Docker Toolbox\start.sh:
export PATH="/yourpath/Docker Toolbox:$PATH"
I had changed the language of Windows from English to Swedish (where the folder "Users" => "Anv**ä**ndare"). When I changed back to English it worked again. I guess the whole path needs to use only ASCII characters.
| 2018-01-19T01:22:15Z | [] | [] |
Traceback (most recent call last):
File "<string>", line 3, in <module>
File "compose\cli\main.py", line 61, in main
File "compose\cli\main.py", line 110, in perform_command
File "compose\cli\command.py", line 35, in project_from_options
File "compose\cli\command.py", line 102, in get_project
File "compose\config\config.py", line 319, in load
File "compose\config\config.py", line 409, in load_services
File "compose\config\config.py", line 388, in build_services
File "compose\config\config.py", line 373, in build_service
File "compose\config\config.py", line 612, in process_service
File "compose\config\config.py", line 856, in resolve_volume_paths
File "compose\config\config.py", line 866, in resolve_volume_path
File "ntpath.py", line 311, in expanduser
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe4 in position 10: ordinal not in range(128)
| 5,025 |
|||
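The failure in the row above comes from Python 2's `ntpath.expanduser` implicitly decoding a byte-string path containing "ä" with the ASCII codec. The hint "we could encode the values before passing them to the stdlib functions" can be sketched as below; `safe_expanduser` is a hypothetical helper written for illustration, not part of Compose:

```python
import os

def safe_expanduser(path, encoding="utf-8"):
    """Expand '~' after normalising the path to text, so the stdlib
    never has to guess an encoding (the implicit ASCII decode is what
    raised UnicodeDecodeError under Python 2)."""
    if isinstance(path, bytes):
        path = path.decode(encoding)
    return os.path.expanduser(path)

print(safe_expanduser("docs/\u00e4".encode("utf-8")))
```

Under Python 3, where `str` is Unicode throughout, the decode branch is a no-op for normal input, which is why shipping Python 3 builds (hence the `Python :: 3.6` classifier in the patch) resolves the issue.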
docker/compose | docker__compose-5603 | b64cfd1982ad0841e4813b43ff587bec7e380abc | diff --git a/compose/service.py b/compose/service.py
--- a/compose/service.py
+++ b/compose/service.py
@@ -1322,7 +1322,6 @@ def get_container_data_volumes(container, volumes_option, tmpfs_option, mounts_o
a mapping of volume bindings for those volumes.
Anonymous volume mounts are updated in place instead.
"""
-
volumes = []
volumes_option = volumes_option or []
@@ -1366,7 +1365,7 @@ def get_container_data_volumes(container, volumes_option, tmpfs_option, mounts_o
continue
ctnr_mount = container_mounts.get(mount.target)
- if not ctnr_mount.get('Name'):
+ if not ctnr_mount or not ctnr_mount.get('Name'):
continue
mount.source = ctnr_mount['Name']
| ERROR: for nginx 'NoneType' object has no attribute 'get'
```
➜ projectname docker-compose -v
docker-compose version 1.18.0, build 8dd22a9
```
Having a strange problem when simply changing a volume's mount name.
My configuration looks like this:
```yaml
version: '3.4'
services:
cygnetise:
image: someorg/projectname
volumes:
- type: volume
source: projectname_client
target: /app
nginx:
image: someorg/projectname-nginx
ports:
- target: 80
published: 8080
volumes:
- type: volume
source: projectname_client
target: /mnt/projectname_client
read_only: true
volumes:
projectname_client:
projectname_admin:
```
If I change the nginx application's target location to ``` /home/app/projectname_client``` I get the following error:
```
➜ projectname docker-compose -f docker-compose.local.yml up
Recreating 12396073747d_projectname_nginx_1 ...
projectname_projectname_1 is up-to-date
ERROR: for 12396073747d_projectname_nginx_1 'NoneType' object has no attribute 'get'
ERROR: for nginx 'NoneType' object has no attribute 'get'
Traceback (most recent call last):
File "docker-compose", line 6, in <module>
File "compose/cli/main.py", line 71, in main
File "compose/cli/main.py", line 124, in perform_command
File "compose/cli/main.py", line 959, in up
File "compose/project.py", line 479, in up
File "compose/parallel.py", line 80, in parallel_execute
AttributeError: 'NoneType' object has no attribute 'get'
Failed to execute script docker-compose
```
Not sure why this would happen, and the error message doesn't really provide any useful pointers unfortunately.
| Thanks for the report, I'll look into it asap. | 2018-01-23T23:26:32Z | [] | [] |
Traceback (most recent call last):
File "docker-compose", line 6, in <module>
File "compose/cli/main.py", line 71, in main
File "compose/cli/main.py", line 124, in perform_command
File "compose/cli/main.py", line 959, in up
File "compose/project.py", line 479, in up
File "compose/parallel.py", line 80, in parallel_execute
AttributeError: 'NoneType' object has no attribute 'get'
| 5,030 |
|||
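The one-line fix in the patch above guards against `container_mounts.get(mount.target)` returning `None` when the mount target changed between container versions. A minimal standalone sketch of the pattern (the dict layout mimics Docker's mount metadata; names are illustrative):

```python
def resolve_volume_name(container_mounts, target):
    """Return the named volume backing `target`, or None when the old
    container has no mount at that path (or only an anonymous one)."""
    ctnr_mount = container_mounts.get(target)
    # Check the mapping itself before calling .get() on it -- calling
    # .get() on the None returned for a missing target is exactly the
    # AttributeError in the traceback above.
    if not ctnr_mount or not ctnr_mount.get("Name"):
        return None
    return ctnr_mount["Name"]

mounts = {"/mnt/projectname_client": {"Name": "projectname_client"}}
print(resolve_volume_name(mounts, "/mnt/projectname_client"))
print(resolve_volume_name(mounts, "/home/app/projectname_client"))
```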
docker/compose | docker__compose-5697 | e71664385756a75f4cb14c9447138ac3c4120f3a | diff --git a/compose/cli/docker_client.py b/compose/cli/docker_client.py
--- a/compose/cli/docker_client.py
+++ b/compose/cli/docker_client.py
@@ -9,6 +9,7 @@
from docker.errors import TLSParameterError
from docker.tls import TLSConfig
from docker.utils import kwargs_from_env
+from docker.utils.config import home_dir
from ..config.environment import Environment
from ..const import HTTP_TIMEOUT
@@ -19,6 +20,10 @@
log = logging.getLogger(__name__)
+def default_cert_path():
+ return os.path.join(home_dir(), '.docker')
+
+
def get_tls_version(environment):
compose_tls_version = environment.get('COMPOSE_TLS_VERSION', None)
if not compose_tls_version:
@@ -56,6 +61,12 @@ def tls_config_from_options(options, environment=None):
key = os.path.join(cert_path, 'key.pem')
ca_cert = os.path.join(cert_path, 'ca.pem')
+ if verify and not any((ca_cert, cert, key)):
+ # Default location for cert files is ~/.docker
+ ca_cert = os.path.join(default_cert_path(), 'ca.pem')
+ cert = os.path.join(default_cert_path(), 'cert.pem')
+ key = os.path.join(default_cert_path(), 'key.pem')
+
tls_version = get_tls_version(environment)
advanced_opts = any([ca_cert, cert, key, verify, tls_version])
| SSL error with remote host
I have CI running docker-compose and my configuration was working until recently. Not sure on which version of docker-compose my image was before (I've updated the image), but now I have following:
Image is base on `docker:latest`, docker-compose is installed through:
```
RUN apk add --no-cache py-pip
RUN pip install docker-compose
```
```sh
$ docker version
Client:
Version: 18.02.0-ce
API version: 1.32 (downgraded from 1.36)
Go version: go1.9.3
Git commit: fc4de44
Built: Wed Feb 7 21:12:37 2018
OS/Arch: linux/amd64
Experimental: false
Orchestrator: swarm
Server:
Engine:
Version: 17.09.0-ce
API version: 1.32 (minimum version 1.12)
Go version: go1.8.3
Git commit: afdb6d4
Built: Tue Sep 26 22:42:49 2017
OS/Arch: linux/amd64
Experimental: false
```
```sh
$ docker-compose version
docker-compose version 1.19.0, build 9e633ef
docker-py version: 2.7.0
CPython version: 2.7.14
OpenSSL version: LibreSSL 2.6.3
```
Following environment variables are populated: `DOCKER_HOST`, `DOCKER_TLS_VERIFY="1"` and `ca.pem`, `cert.pem` and `key.pem` are copied to `~/.docker` folder. (P.S. tried also `DOCKER_TLS_VERIFY=1` and `DOCKER_TLS_VERIFY="True"`)
On any docker-compose command (ps, pull) I get following error (some values are replaced with `xxxxxx`):
```sh
$ docker-compose --verbose pull
compose.config.config.find: Using configuration files: xxxxxx
docker.auth.find_config_file: Trying paths: ['/root/.docker/config.json', '/root/.dockercfg']
docker.auth.find_config_file: Found file at path: /root/.docker/config.json
docker.auth.load_config: Found 'auths' section
docker.auth.parse_auth: Found entry (registry=u'xxxxxx', username=u'xxxxxx')
docker.auth.load_config: Found 'HttpHeaders' section
urllib3.connectionpool._new_conn: Starting new HTTPS connection (1): xxxxxx
Traceback (most recent call last):
File "/usr/bin/docker-compose", line 11, in <module>
sys.exit(main())
File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 71, in main
command()
File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 121, in perform_command
project = project_from_options('.', options)
File "/usr/lib/python2.7/site-packages/compose/cli/command.py", line 40, in project_from_options
override_dir=options.get('--project-directory'),
File "/usr/lib/python2.7/site-packages/compose/cli/command.py", line 118, in get_project
host=host, environment=environment
File "/usr/lib/python2.7/site-packages/compose/cli/command.py", line 93, in get_client
version_info = six.iteritems(client.version())
File "/usr/lib/python2.7/site-packages/docker/api/daemon.py", line 177, in version
return self._result(self._get(url), json=True)
File "/usr/lib/python2.7/site-packages/docker/utils/decorators.py", line 46, in inner
return f(self, *args, **kwargs)
File "/usr/lib/python2.7/site-packages/docker/api/client.py", line 191, in _get
return self.get(url, **self._set_request_timeout(kwargs))
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 521, in get
return self.request('GET', url, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 508, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 618, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/adapters.py", line 506, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='xxxxxx', port=2376): Max retries exceeded with url: /v1.25/version (Caused by SSLError(SSLError(1, u'[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:661)'),))
```
As you see `docker` connects to host as it shows daemon version and the same configuration was working until updating the image (and as a result docker-compose).
The workaround for now is to pass all the flags to docker-compose directly (if I understand it correctly, it then just ignores the environment variables):
```sh
$ docker-compose -H xxxxxx --tlsverify --tlscacert ~/.docker/ca.pem --tlscert ~/.docker/cert.pem --tlskey ~/.docker/key.pem ps
```
It may be related to following issues/PRs: #5632, #5593, #5634
I'm almost sure that this is some kind of regression that was introduced in `1.19.0`
P.S. Not sure if it is important but the CI is docker in docker setup.
| The workaround I've mentioned (to use docker-compose with all tls arguments specified) has one downside though. Now `docker-compose [args] pull` and `docker-compose [args] up -d` are working fine (where `[args]` is `-H xxxxxx --tlsverify --tlscacert ~/.docker/ca.pem --tlscert ~/.docker/cert.pem --tlskey ~/.docker/key.pem`), but `docker-compose [args] exec service sh -c 'somecommand'` is not working and gives following error:
```sh
docker-compose -H xxxxxx --tlsverify --tlscacert ~/.docker/ca.pem --tlscert ~/.docker/cert.pem --tlskey ~/.docker/key.pem exec service sh -c 'command_here'
the input device is not a TTY
```
and I need to also pass `-T` flag to docker-compose to disable pseudo-tty allocation. Not sure what's the problem here either, as previously it was working fine without that flag.
Are you setting `DOCKER_CERT_PATH` as well?
@shin- Nope, I don't set it, so it is on default (per documentation `~/.docker`)
There's a chance we're not honoring the default. Does it work if you set it?
@shin- I can confirm that setting `DOCKER_CERT_PATH` is working, and all other env vars are taken into the consideration in that case, so the changed behaviour is only that you've started to not honor the default value for it.
It's maybe another issue, but I still don't get, why
```sh
docker-compose exec service sh -c 'somecommand'
```
works, but:
```sh
docker-compose -H xxxxxx --tlsverify --tlscacert ~/.docker/ca.pem --tlscert ~/.docker/cert.pem --tlskey ~/.docker/key.pem exec service sh -c 'somecommand'
```
doesn't work, and I need to specify `-T` flag like:
```sh
docker-compose -H xxxxxx --tlsverify --tlscacert ~/.docker/ca.pem --tlscert ~/.docker/cert.pem --tlskey ~/.docker/key.pem exec -T service sh -c 'somecommand'
```
@shin- Nope, sorry, for the second issue I mentioned, it's still not working, so it is also some kind of regression I think. `exec` command used to work without `-T` flag on my CI builds and are not working now with docker-compose 1.19.0.
Can I somehow install previous version of docker-compose with `pip install` method to confirm also that (and open separate issue for it?).
Thank you for confirming. I'll make sure we honor the default cert path in the next release.
You can install any version of Compose using `pip install -U docker-compose==<VERSION>`
@shin- Can confirm also the second issue, see #5696 | 2018-02-21T21:25:54Z | [] | [] |
Traceback (most recent call last):
File "/usr/bin/docker-compose", line 11, in <module>
sys.exit(main())
File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 71, in main
command()
File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 121, in perform_command
project = project_from_options('.', options)
File "/usr/lib/python2.7/site-packages/compose/cli/command.py", line 40, in project_from_options
override_dir=options.get('--project-directory'),
File "/usr/lib/python2.7/site-packages/compose/cli/command.py", line 118, in get_project
host=host, environment=environment
File "/usr/lib/python2.7/site-packages/compose/cli/command.py", line 93, in get_client
version_info = six.iteritems(client.version())
File "/usr/lib/python2.7/site-packages/docker/api/daemon.py", line 177, in version
return self._result(self._get(url), json=True)
File "/usr/lib/python2.7/site-packages/docker/utils/decorators.py", line 46, in inner
return f(self, *args, **kwargs)
File "/usr/lib/python2.7/site-packages/docker/api/client.py", line 191, in _get
return self.get(url, **self._set_request_timeout(kwargs))
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 521, in get
return self.request('GET', url, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 508, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 618, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/adapters.py", line 506, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='xxxxxx', port=2376): Max retries exceeded with url: /v1.25/version (Caused by SSLError(SSLError(1, u'[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:661)'),))
| 5,038 |
|||
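The patch above restores the documented default of `~/.docker` when TLS verification is requested but `DOCKER_CERT_PATH` is unset. The fallback logic can be sketched as follows (a simplified standalone version: the `home` parameter is an assumption for testability, where the real code uses `docker.utils.config.home_dir()`):

```python
import os

def tls_cert_files(cert_path=None, verify=True, home=None):
    """Return (ca, cert, key) paths, falling back to ~/.docker when
    verification is requested but no explicit cert path was given."""
    if not cert_path and verify:
        cert_path = os.path.join(home or os.path.expanduser("~"), ".docker")
    if not cert_path:
        return (None, None, None)
    return tuple(os.path.join(cert_path, f) for f in ("ca.pem", "cert.pem", "key.pem"))

print(tls_cert_files(home="/home/ci"))
```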
docker/compose | docker__compose-5793 | 273985aee74deffa9cd703b63f7b262aee3c5e38 | diff --git a/compose/progress_stream.py b/compose/progress_stream.py
--- a/compose/progress_stream.py
+++ b/compose/progress_stream.py
@@ -8,6 +8,14 @@ class StreamOutputError(Exception):
pass
+def write_to_stream(s, stream):
+ try:
+ stream.write(s)
+ except UnicodeEncodeError:
+ encoding = getattr(stream, 'encoding', 'ascii')
+ stream.write(s.encode(encoding, errors='replace').decode(encoding))
+
+
def stream_output(output, stream):
is_terminal = hasattr(stream, 'isatty') and stream.isatty()
stream = utils.get_output_stream(stream)
@@ -34,18 +42,18 @@ def stream_output(output, stream):
if image_id not in lines:
lines[image_id] = len(lines)
- stream.write("\n")
+ write_to_stream("\n", stream)
diff = len(lines) - lines[image_id]
# move cursor up `diff` rows
- stream.write("%c[%dA" % (27, diff))
+ write_to_stream("%c[%dA" % (27, diff), stream)
print_output_event(event, stream, is_terminal)
if 'id' in event:
# move cursor back down
- stream.write("%c[%dB" % (27, diff))
+ write_to_stream("%c[%dB" % (27, diff), stream)
stream.flush()
@@ -60,36 +68,36 @@ def print_output_event(event, stream, is_terminal):
if is_terminal and 'stream' not in event:
# erase current line
- stream.write("%c[2K\r" % 27)
+ write_to_stream("%c[2K\r" % 27, stream)
terminator = "\r"
elif 'progressDetail' in event:
return
if 'time' in event:
- stream.write("[%s] " % event['time'])
+ write_to_stream("[%s] " % event['time'], stream)
if 'id' in event:
- stream.write("%s: " % event['id'])
+ write_to_stream("%s: " % event['id'], stream)
if 'from' in event:
- stream.write("(from %s) " % event['from'])
+ write_to_stream("(from %s) " % event['from'], stream)
status = event.get('status', '')
if 'progress' in event:
- stream.write("%s %s%s" % (status, event['progress'], terminator))
+ write_to_stream("%s %s%s" % (status, event['progress'], terminator), stream)
elif 'progressDetail' in event:
detail = event['progressDetail']
total = detail.get('total')
if 'current' in detail and total:
percentage = float(detail['current']) / float(total) * 100
- stream.write('%s (%.1f%%)%s' % (status, percentage, terminator))
+ write_to_stream('%s (%.1f%%)%s' % (status, percentage, terminator), stream)
else:
- stream.write('%s%s' % (status, terminator))
+ write_to_stream('%s%s' % (status, terminator), stream)
elif 'stream' in event:
- stream.write("%s%s" % (event['stream'], terminator))
+ write_to_stream("%s%s" % (event['stream'], terminator), stream)
else:
- stream.write("%s%s\n" % (status, terminator))
+ write_to_stream("%s%s\n" % (status, terminator), stream)
def get_digest_from_pull(events):
| 1.20.0 output character error
I recently upgraded to using version 1.20.0-rc2 and encountered this issue I'd never seen before. From the looks of it, it looks like it's an encoding issue... any other context needed?
```
Traceback (most recent call last):
File "bin/docker-compose", line 6, in <module>
File "compose/cli/main.py", line 71, in main
File "compose/cli/main.py", line 127, in perform_command
File "compose/cli/main.py", line 280, in build
File "compose/project.py", line 372, in build
File "compose/service.py", line 1003, in build
File "compose/progress_stream.py", line 23, in stream_output
File "compose/progress_stream.py", line 90, in print_output_event
UnicodeEncodeError: 'ascii' codec can't encode character '\u2013' in position 151: ordinal not in range(128)
[96] Failed to execute script docker-compose
Exited with code 255
```
| Thanks for the report! I'll take a closer look, but is there a possibility your shell isn't configured to output utf-8 encoded characters?
I've run these same scripts using docker 1.12 and I didn't have any issues with it before.
@helsont I'm having trouble reproducing this locally. If you don't mind answering those questions, that would be super helpful:
1. How did you install Compose?
2. What is the output of `docker-compose version`?
3. What command are you executing that causes the error to appear?
4. Are you redirecting the output of the program to a file or device other than the console?
Thanks!
To prefix this: I'm running docker-compose from within circleci's local testing CLI tool. It runs by using my local docker to emulate what's going on in their build servers.
1. I installed compose with the following commands:
```
curl -L https://github.com/docker/compose/releases/download/1.20.0-rc1/docker-compose-`uname -s`-`uname -m` > ~/docker-compose
chmod +x ~/docker-compose
```
2. docker-compose version:
```docker-compose version 1.20.0-rc1, build 86428af
docker-py version: 3.1.0
CPython version: 3.6.4
OpenSSL version: OpenSSL 1.0.1t 3 May 2016
```
3. The command that triggers it is `composer install` in a PHP laravel project.
Expected output:
```
...
namshi/jose suggests installing phpseclib/phpseclib (Allows to use Phpseclib as crypto engine, use version ^2.0.)
php-amqplib/php-amqplib suggests installing ext-sockets (Use AMQPSocketConnection)
voku/portable-utf8 suggests installing ext-intl (Use Intl for best performance)
phpspec/phpspec suggests installing phpspec/nyan-formatters (~1.0 – Adds Nyan formatters)
sebastian/global-state suggests installing ext-uopz (*)
```
Actual output:
```
...
namshi/jose suggests installing phpseclib/phpseclib (Allows to use Phpseclib as crypto engine, use version ^2.0.)
php-amqplib/php-amqplib suggests installing ext-sockets (Use AMQPSocketConnection)
voku/portable-utf8 suggests installing ext-intl (Use Intl for best performance)
Traceback (most recent call last):
File "bin/docker-compose", line 6, in <module>
File "compose/cli/main.py", line 71, in main
File "compose/cli/main.py", line 127, in perform_command
File "compose/cli/main.py", line 280, in build
File "compose/project.py", line 372, in build
File "compose/service.py", line 1004, in build
File "compose/progress_stream.py", line 23, in stream_output
File "compose/progress_stream.py", line 90, in print_output_event
UnicodeEncodeError: 'ascii' codec can't encode character '\u2013' in position 71: ordinal not in range(128)
[39] Failed to execute script docker-compose
```
4. It's being run inside a docker container managed by circleci. | 2018-03-15T23:23:54Z | [] | [] |
Traceback (most recent call last):
File "bin/docker-compose", line 6, in <module>
File "compose/cli/main.py", line 71, in main
File "compose/cli/main.py", line 127, in perform_command
File "compose/cli/main.py", line 280, in build
File "compose/project.py", line 372, in build
File "compose/service.py", line 1003, in build
File "compose/progress_stream.py", line 23, in stream_output
File "compose/progress_stream.py", line 90, in print_output_event
UnicodeEncodeError: 'ascii' codec can't encode character '\u2013' in position 151: ordinal not in range(128)
| 5,053 |
|||
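The fix above wraps every `stream.write` so that a character the stream cannot encode degrades to a replacement character instead of aborting the build output. The pattern can be reproduced in isolation with an ASCII-only stream; the en dash (U+2013) is the exact character from the traceback:

```python
import io

def write_to_stream(s, stream):
    try:
        stream.write(s)
    except UnicodeEncodeError:
        # Fall back to the stream's own codec with errors='replace',
        # so unrepresentable characters become '?' rather than a crash.
        encoding = getattr(stream, "encoding", "ascii")
        stream.write(s.encode(encoding, errors="replace").decode(encoding))

raw = io.BytesIO()
ascii_stream = io.TextIOWrapper(raw, encoding="ascii")
write_to_stream("nyan \u2013 formatters\n", ascii_stream)
ascii_stream.flush()
print(raw.getvalue())
```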
docker/compose | docker__compose-5858 | 2975f06ca2301f2b1d392c4dcfb43ddbc1baff39 | diff --git a/compose/parallel.py b/compose/parallel.py
--- a/compose/parallel.py
+++ b/compose/parallel.py
@@ -279,9 +279,7 @@ def add_object(self, msg, obj_index):
def write_initial(self, msg, obj_index):
if msg is None:
return
- self.stream.write("{:<{width}} ... \r\n".format(
- msg + ' ' + obj_index, width=self.width))
- self.stream.flush()
+ return self._write_noansi(msg, obj_index, '')
def _write_ansi(self, msg, obj_index, status):
self.lock.acquire()
@@ -299,8 +297,11 @@ def _write_ansi(self, msg, obj_index, status):
self.lock.release()
def _write_noansi(self, msg, obj_index, status):
- self.stream.write("{:<{width}} ... {}\r\n".format(msg + ' ' + obj_index,
- status, width=self.width))
+ self.stream.write(
+ "{:<{width}} ... {}\r\n".format(
+ msg + ' ' + obj_index, status, width=self.width
+ )
+ )
self.stream.flush()
def write(self, msg, obj_index, status, color_func):
diff --git a/compose/project.py b/compose/project.py
--- a/compose/project.py
+++ b/compose/project.py
@@ -556,7 +556,11 @@ def pull_service(service):
limit=5,
)
if len(errors):
- raise ProjectError(b"\n".join(errors.values()))
+ combined_errors = '\n'.join([
+ e.decode('utf-8') if isinstance(e, six.binary_type) else e for e in errors.values()
+ ])
+ raise ProjectError(combined_errors)
+
else:
for service in services:
service.pull(ignore_pull_failures, silent=silent)
| docker-compose pull --parallel crash with "expected a bytes-like object, str found"
## Description of the issue
When running `docker-compose pull --parallel` with a compose file version 2.1 or higher, docker-compose will fail with error message `TypeError: sequence item 0: expected a bytes-like object, str found`.
## Context information (for bug reports)
```
docker-compose version 1.20.1, build 5d8c71b
docker-py version: 3.1.4
CPython version: 3.6.4
OpenSSL version: OpenSSL 1.0.1t 3 May 2016
```
```
Client:
Version: 17.12.1-ce
API version: 1.35
Go version: go1.9.4
Git commit: 7390fc6
Built: Tue Feb 27 22:15:20 2018
OS/Arch: linux/amd64
Server:
Engine:
Version: 17.12.1-ce
API version: 1.35 (minimum version 1.12)
Go version: go1.9.4
Git commit: 7390fc6
Built: Tue Feb 27 22:17:54 2018
OS/Arch: linux/amd64
Experimental: false
```
```
version: '2.1'
services:
test:
image: alpine:does-not-exist
```
## Steps to reproduce the issue
1. Copy the above docker-compose.yaml file
2. Run `docker-compose pull --parallel`
### Observed result
Exception with string-operations-related traceback.
### Expected result
No misleading exception, just clean error message.
### Stacktrace / full error message
```
Traceback (most recent call last):
File "bin/docker-compose", line 6, in <module>
File "compose/cli/main.py", line 71, in main
File "compose/cli/main.py", line 127, in perform_command
File "compose/cli/main.py", line 716, in pull
File "compose/project.py", line 558, in pull
TypeError: sequence item 0: expected a bytes-like object, str found
```
## Additional information
RedHat, docker-compose installed by downloading https://github.com/docker/compose/releases/download/1.20.1/docker-compose-Linux-x86_64.
| Thanks for the report! | 2018-04-07T00:47:27Z | [] | [] |
Traceback (most recent call last):
File "bin/docker-compose", line 6, in <module>
File "compose/cli/main.py", line 71, in main
File "compose/cli/main.py", line 127, in perform_command
File "compose/cli/main.py", line 716, in pull
File "compose/project.py", line 558, in pull
TypeError: sequence item 0: expected a bytes-like object, str found
| 5,065 |
|||
docker/compose | docker__compose-6312 | 69fe42027adfcdb8c348982c7a66e12b44db3734 | diff --git a/compose/container.py b/compose/container.py
--- a/compose/container.py
+++ b/compose/container.py
@@ -96,6 +96,8 @@ def number(self):
@property
def slug(self):
+ if not self.full_slug:
+ return None
return truncate_id(self.full_slug)
@property
| docker-compose fails with TypeError: argument of type 'NoneType' is not iterable
<!--
Welcome to the docker-compose issue tracker! Before creating an issue, please heed the following:
1. This tracker should only be used to report bugs and request features / enhancements to docker-compose
- For questions and general support, use https://forums.docker.com
- For documentation issues, use https://github.com/docker/docker.github.io
- For issues with the `docker stack` commands and the version 3 of the Compose file, use
https://github.com/docker/cli
2. Use the search function before creating a new issue. Duplicates will be closed and directed to
the original discussion.
3. When making a bug report, make sure you provide all required information. The easier it is for
maintainers to reproduce, the faster it'll be fixed.
-->
## Description of the issue
After a recent update of docker-compose to version 1.23.0, builds suddenly started to fail with an error (yesterday was ok):
`TypeError: argument of type 'NoneType' is not iterable`. It appears to me that the underlying Python libraries are failing during the docker-compose process.
## Context information (for bug reports)
**Output of `docker-compose version`**
```
docker-compose version 1.23.0, build c8524dc
docker-py version: 3.5.1
CPython version: 2.7.15
OpenSSL version: LibreSSL 2.7.4
```
**Output of `docker version`**
```
Client:
Version: 18.06.1-ce
API version: 1.37 (downgraded from 1.38)
Go version: go1.10.3
Git commit: e68fc7a
Built: Tue Aug 21 17:20:43 2018
OS/Arch: linux/amd64
Experimental: false
Server:
Engine:
Version: 18.03.0-ce
API version: 1.37 (minimum version 1.12)
Go version: go1.9.4
Git commit: 0520e24
Built: Wed Mar 21 23:08:31 2018
OS/Arch: linux/amd64
Experimental: false
```
**Output of `docker-compose config`**
(Make sure to add the relevant `-f` and other flags)
```
docker-compose -f docker/docker-compose.yml -f docker/docker-compose.dev.yml config
networks:
backend: {}
frontend:
external:
name: nginxproxy_default
services:
front:
build:
context: /builds/project/project_name
dockerfile: docker/nginx/Dockerfile
depends_on:
- php
image: localrepo:5005/project/front
links:
- php
networks:
backend: null
frontend: null
restart: always
php:
build:
context: /builds/project/project_name
dockerfile: docker/php/Dockerfile
image: localrepo:5005/project/php
networks:
backend: null
version: '3.0'
```
### Observed result
docker-compose fails with TypeError: argument of type 'NoneType' is not iterable
### Expected result
Build is successful
### Stacktrace / full error message
command line where it fails (see log, I've included installations also):
```
docker-compose -f docker/docker-compose.yml -f docker/docker-compose.dev.yml run --no-deps php bin/console doctrine:migrations:migrate --allow-no-migration --no-interaction
```
```
$ pip install --no-cache-dir docker-compose
Collecting docker-compose
Downloading https://files.pythonhosted.org/packages/23/e7/3702078bb674d36e607c48177f4e7d93d6fecb13c32a8889d1172236848d/docker_compose-1.23.0-py2.py3-none-any.whl (131kB)
Collecting websocket-client<1.0,>=0.32.0 (from docker-compose)
Downloading https://files.pythonhosted.org/packages/14/d4/6a8cd4e7f67da465108c7cc0a307a1c5da7e2cdf497330b682069b1d4758/websocket_client-0.53.0-py2.py3-none-any.whl (198kB)
Collecting PyYAML<4,>=3.10 (from docker-compose)
Downloading https://files.pythonhosted.org/packages/9e/a3/1d13970c3f36777c583f136c136f804d70f500168edc1edea6daa7200769/PyYAML-3.13.tar.gz (270kB)
Collecting dockerpty<0.5,>=0.4.1 (from docker-compose)
Downloading https://files.pythonhosted.org/packages/8d/ee/e9ecce4c32204a6738e0a5d5883d3413794d7498fe8b06f44becc028d3ba/dockerpty-0.4.1.tar.gz
Collecting backports.ssl-match-hostname>=3.5; python_version < "3.5" (from docker-compose)
Downloading https://files.pythonhosted.org/packages/76/21/2dc61178a2038a5cb35d14b61467c6ac632791ed05131dda72c20e7b9e23/backports.ssl_match_hostname-3.5.0.1.tar.gz
Collecting docopt<0.7,>=0.6.1 (from docker-compose)
Downloading https://files.pythonhosted.org/packages/a2/55/8f8cab2afd404cf578136ef2cc5dfb50baa1761b68c9da1fb1e4eed343c9/docopt-0.6.2.tar.gz
Collecting ipaddress>=1.0.16; python_version < "3.3" (from docker-compose)
Downloading https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting enum34<2,>=1.0.4; python_version < "3.4" (from docker-compose)
Downloading https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting requests!=2.11.0,!=2.12.2,!=2.18.0,<2.21,>=2.6.1 (from docker-compose)
Downloading https://files.pythonhosted.org/packages/f1/ca/10332a30cb25b627192b4ea272c351bce3ca1091e541245cccbace6051d8/requests-2.20.0-py2.py3-none-any.whl (60kB)
Collecting texttable<0.10,>=0.9.0 (from docker-compose)
Downloading https://files.pythonhosted.org/packages/02/e1/2565e6b842de7945af0555167d33acfc8a615584ef7abd30d1eae00a4d80/texttable-0.9.1.tar.gz
Collecting docker<4.0,>=3.5.0 (from docker-compose)
Downloading https://files.pythonhosted.org/packages/c2/76/b8091dc6d9db038af62ae88f228da656a84632cf5d7a84dcf54c613d3fd0/docker-3.5.1-py2.py3-none-any.whl (126kB)
Collecting jsonschema<3,>=2.5.1 (from docker-compose)
Downloading https://files.pythonhosted.org/packages/77/de/47e35a97b2b05c2fadbec67d44cfcdcd09b8086951b331d82de90d2912da/jsonschema-2.6.0-py2.py3-none-any.whl
Collecting cached-property<2,>=1.2.0 (from docker-compose)
Downloading https://files.pythonhosted.org/packages/3b/86/85c1be2e8db9e13ef9a350aecd6dea292bd612fa288c2f40d035bb750ded/cached_property-1.5.1-py2.py3-none-any.whl
Collecting six<2,>=1.3.0 (from docker-compose)
Downloading https://files.pythonhosted.org/packages/67/4b/141a581104b1f6397bfa78ac9d43d8ad29a7ca43ea90a2d863fe3056e86a/six-1.11.0-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests!=2.11.0,!=2.12.2,!=2.18.0,<2.21,>=2.6.1->docker-compose)
Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
Collecting certifi>=2017.4.17 (from requests!=2.11.0,!=2.12.2,!=2.18.0,<2.21,>=2.6.1->docker-compose)
Downloading https://files.pythonhosted.org/packages/56/9d/1d02dd80bc4cd955f98980f28c5ee2200e1209292d5f9e9cc8d030d18655/certifi-2018.10.15-py2.py3-none-any.whl (146kB)
Collecting urllib3<1.25,>=1.21.1 (from requests!=2.11.0,!=2.12.2,!=2.18.0,<2.21,>=2.6.1->docker-compose)
Downloading https://files.pythonhosted.org/packages/8c/4b/5cbc4cb46095f369117dcb751821e1bef9dd86a07c968d8757e9204c324c/urllib3-1.24-py2.py3-none-any.whl (117kB)
Collecting idna<2.8,>=2.5 (from requests!=2.11.0,!=2.12.2,!=2.18.0,<2.21,>=2.6.1->docker-compose)
Downloading https://files.pythonhosted.org/packages/4b/2a/0276479a4b3caeb8a8c1af2f8e4355746a97fab05a372e4a2c6a6b876165/idna-2.7-py2.py3-none-any.whl (58kB)
Collecting docker-pycreds>=0.3.0 (from docker<4.0,>=3.5.0->docker-compose)
Downloading https://files.pythonhosted.org/packages/ea/bf/7e70aeebc40407fbdb96fa9f79fc8e4722ea889a99378303e3bcc73f4ab5/docker_pycreds-0.3.0-py2.py3-none-any.whl
Collecting functools32; python_version == "2.7" (from jsonschema<3,>=2.5.1->docker-compose)
Downloading https://files.pythonhosted.org/packages/c5/60/6ac26ad05857c601308d8fb9e87fa36d0ebf889423f47c3502ef034365db/functools32-3.2.3-2.tar.gz
Installing collected packages: six, websocket-client, PyYAML, dockerpty, backports.ssl-match-hostname, docopt, ipaddress, enum34, chardet, certifi, urllib3, idna, requests, texttable, docker-pycreds, docker, functools32, jsonschema, cached-property, docker-compose
Running setup.py install for PyYAML: started
Running setup.py install for PyYAML: finished with status 'done'
Running setup.py install for dockerpty: started
Running setup.py install for dockerpty: finished with status 'done'
Running setup.py install for backports.ssl-match-hostname: started
Running setup.py install for backports.ssl-match-hostname: finished with status 'done'
Running setup.py install for docopt: started
Running setup.py install for docopt: finished with status 'done'
Running setup.py install for texttable: started
Running setup.py install for texttable: finished with status 'done'
Running setup.py install for functools32: started
Running setup.py install for functools32: finished with status 'done'
Successfully installed PyYAML-3.13 backports.ssl-match-hostname-3.5.0.1 cached-property-1.5.1 certifi-2018.10.15 chardet-3.0.4 docker-3.5.1 docker-compose-1.23.0 docker-pycreds-0.3.0 dockerpty-0.4.1 docopt-0.6.2 enum34-1.1.6 functools32-3.2.3.post2 idna-2.7 ipaddress-1.0.22 jsonschema-2.6.0 requests-2.20.0 six-1.11.0 texttable-0.9.1 urllib3-1.24 websocket-client-0.53.0
You are using pip version 10.0.1, however version 18.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
$ docker-compose -f docker/docker-compose.yml -f docker/docker-compose.dev.yml run --no-deps php bin/console doctrine:migrations:migrate --allow-no-migration --no-interaction
Traceback (most recent call last):
File "/usr/bin/docker-compose", line 11, in <module>
sys.exit(main())
File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 71, in main
command()
File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 127, in perform_command
handler(command, command_options)
File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 873, in run
self.toplevel_options, self.project_dir
File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 1328, in run_one_off_container
**container_options)
File "/usr/lib/python2.7/site-packages/compose/service.py", line 326, in create_container
previous_container=previous_container,
File "/usr/lib/python2.7/site-packages/compose/service.py", line 895, in _get_container_create_options
one_off=one_off)
File "/usr/lib/python2.7/site-packages/compose/service.py", line 969, in _get_container_host_config
links=self._get_links(link_to_self=one_off),
File "/usr/lib/python2.7/site-packages/compose/service.py", line 800, in _get_links
links[container.name_without_project] = container.name
File "/usr/lib/python2.7/site-packages/compose/container.py", line 85, in name_without_project
return '{0}_{1}{2}'.format(self.service, self.number, '_' + self.slug if self.slug else '')
File "/usr/lib/python2.7/site-packages/compose/container.py", line 99, in slug
return truncate_id(self.full_slug)
File "/usr/lib/python2.7/site-packages/compose/utils.py", line 168, in truncate_id
if ':' in value:
TypeError: argument of type 'NoneType' is not iterable
ERROR: Job failed: exit code 1
```
Previous successful builds:
```
Successfully installed PyYAML-3.13 backports.ssl-match-hostname-3.5.0.1 cached-property-1.5.1 certifi-2018.10.15 chardet-3.0.4 docker-3.5.1 docker-compose-1.22.0 docker-pycreds-0.3.0 dockerpty-0.4.1 docopt-0.6.2 enum34-1.1.6 functools32-3.2.3.post2 idna-2.6 ipaddress-1.0.22 jsonschema-2.6.0 requests-2.18.4 six-1.11.0 texttable-0.9.1 urllib3-1.22 websocket-client-0.53.0
You are using pip version 10.0.1, however version 18.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
$ docker-compose -f docker/docker-compose.yml -f docker/docker-compose.dev.yml run --no-deps php bin/console doctrine:migrations:migrate --allow-no-migration --no-interaction
Application Migrations
No migrations to execute.
```
| +1
Also seeing this after upgrade.
1. `docker-compose stop`
2. Upgrade to `docker-compose` 1.23.0 using:
```
curl -L https://github.com/docker/compose/releases/download/1.23.0/docker-compose-`uname -s`-`uname -m` -o /usr/local/bin/docker-compose
chmod +x /usr/local/bin/docker-compose
```
3. `docker-compose start` to restart services after upgrade. This succeeds.
4. `docker-compose logs -f`:
```
Traceback (most recent call last):
File "bin/docker-compose", line 6, in <module>
File "compose/cli/main.py", line 71, in main
File "compose/cli/main.py", line 127, in perform_command
File "compose/cli/main.py", line 655, in logs
File "compose/cli/log_printer.py", line 87, in run
File "compose/cli/log_printer.py", line 255, in consume_queue
File "compose/cli/log_printer.py", line 161, in tail_container_logs
File "compose/cli/log_printer.py", line 27, in present
File "compose/container.py", line 85, in name_without_project
File "compose/container.py", line 99, in slug
File "compose/utils.py", line 168, in truncate_id
TypeError: argument of type 'NoneType' is not iterable
[1523] Failed to execute script docker-compose
```
I was seeing this error, but things seemed to work for me after removing the old containers.
Thanks for the report!
We'll make sure to get this fixed promptly.
I can confirm that removing the containers created before the upgrade to 1.23.0 fixes it. | 2018-10-31T20:56:16Z | [] | [] |
Traceback (most recent call last):
File "/usr/bin/docker-compose", line 11, in <module>
sys.exit(main())
File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 71, in main
command()
File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 127, in perform_command
handler(command, command_options)
File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 873, in run
self.toplevel_options, self.project_dir
File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 1328, in run_one_off_container
**container_options)
File "/usr/lib/python2.7/site-packages/compose/service.py", line 326, in create_container
previous_container=previous_container,
File "/usr/lib/python2.7/site-packages/compose/service.py", line 895, in _get_container_create_options
one_off=one_off)
File "/usr/lib/python2.7/site-packages/compose/service.py", line 969, in _get_container_host_config
links=self._get_links(link_to_self=one_off),
File "/usr/lib/python2.7/site-packages/compose/service.py", line 800, in _get_links
links[container.name_without_project] = container.name
File "/usr/lib/python2.7/site-packages/compose/container.py", line 85, in name_without_project
return '{0}_{1}{2}'.format(self.service, self.number, '_' + self.slug if self.slug else '')
File "/usr/lib/python2.7/site-packages/compose/container.py", line 99, in slug
return truncate_id(self.full_slug)
File "/usr/lib/python2.7/site-packages/compose/utils.py", line 168, in truncate_id
if ':' in value:
TypeError: argument of type 'NoneType' is not iterable
| 5,104 |
|||
docker/compose | docker__compose-6326 | f009de025c1afff70e2feb34f77257f162509ced | diff --git a/compose/config/validation.py b/compose/config/validation.py
--- a/compose/config/validation.py
+++ b/compose/config/validation.py
@@ -330,7 +330,10 @@ def handle_generic_error(error, path):
def parse_key_from_error_msg(error):
- return error.message.split("'")[1]
+ try:
+ return error.message.split("'")[1]
+ except IndexError:
+ return error.message.split('(')[1].split(' ')[0].strip("'")
def path_string(path):
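The patch above adds a fallback for validation messages that contain no quoted key, where `message.split("'")[1]` has nothing at index 1. A sketch with two hypothetical jsonschema-style messages (the exact wording of real messages varies by jsonschema version):

```python
def parse_key_from_error_msg(message):
    # primary path: messages such as
    # "Additional properties are not allowed ('ports' was unexpected)"
    # quote the offending key
    try:
        return message.split("'")[1]
    except IndexError:
        # fallback: no quotes at all -- take the first token inside the
        # parenthesis instead
        return message.split('(')[1].split(' ')[0].strip("'")

key = parse_key_from_error_msg(
    "Additional properties are not allowed ('ports' was unexpected)"
)
```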
| [probably duplicate?] IndexError from Validation
Stack trace below. The compose file content, docker version info, and stack trace are given below, respectively.
``` yaml
{
"services": {
"postgres": {
"image": "postgres:9.6"
},
"ports": {
5432: 5432
},
},
"version": "3.1"
}
```
``` bash
$ docker-compose version
docker-compose version 1.22.0, build f46880f
docker-py version: 3.5.0
CPython version: 3.5.2
OpenSSL version: OpenSSL 1.0.2p 14 Aug 2018
```
Stack trace
``` bash
$ docker-compose up
Traceback (most recent call last):
File "/Users/andy/virtualenvs/kode-venv/bin/docker-compose", line 11, in <module>
sys.exit(main())
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/cli/main.py", line 71, in main
command()
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/cli/main.py", line 124, in perform_command
project = project_from_options('.', options)
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/cli/command.py", line 41, in project_from_options
compatibility=options.get('--compatibility'),
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/cli/command.py", line 113, in get_project
config_data = config.load(config_details, compatibility)
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/config.py", line 385, in load
for config_file in config_details.config_files
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/config.py", line 385, in <listcomp>
for config_file in config_details.config_files
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/config.py", line 552, in process_config_file
validate_against_config_schema(config_file)
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/validation.py", line 437, in validate_against_config_schema
config_file.filename)
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/validation.py", line 498, in handle_errors
error_msg = '\n'.join(format_error_func(error) for error in errors)
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/validation.py", line 498, in <genexpr>
error_msg = '\n'.join(format_error_func(error) for error in errors)
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/validation.py", line 420, in process_config_schema_errors
error_msg = handle_error_for_schema_with_id(error, path)
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/validation.py", line 274, in handle_error_for_schema_with_id
invalid_config_key = parse_key_from_error_msg(error)
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/validation.py", line 333, in parse_key_from_error_msg
return error.message.split("'")[1]
IndexError: list index out of range
```
| @andy-d Thanks for the report! We'll make sure that gets fixed.
FWIW, your Compose file should look like this instead:
```yaml
{
"services": {
"postgres": {
"image": "postgres:9.6",
"ports": ["5432:5432"]
},
},
"version": "3.1"
}
```
(see the [Compose file reference](https://docs.docker.com/compose/compose-file) for details)
Thanks @shin-, yep, I corrected that. The issue was the stack trace, as you clearly know. | 2018-11-05T21:46:29Z | [] | [] |
Traceback (most recent call last):
File "/Users/andy/virtualenvs/kode-venv/bin/docker-compose", line 11, in <module>
sys.exit(main())
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/cli/main.py", line 71, in main
command()
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/cli/main.py", line 124, in perform_command
project = project_from_options('.', options)
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/cli/command.py", line 41, in project_from_options
compatibility=options.get('--compatibility'),
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/cli/command.py", line 113, in get_project
config_data = config.load(config_details, compatibility)
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/config.py", line 385, in load
for config_file in config_details.config_files
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/config.py", line 385, in <listcomp>
for config_file in config_details.config_files
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/config.py", line 552, in process_config_file
validate_against_config_schema(config_file)
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/validation.py", line 437, in validate_against_config_schema
config_file.filename)
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/validation.py", line 498, in handle_errors
error_msg = '\n'.join(format_error_func(error) for error in errors)
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/validation.py", line 498, in <genexpr>
error_msg = '\n'.join(format_error_func(error) for error in errors)
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/validation.py", line 420, in process_config_schema_errors
error_msg = handle_error_for_schema_with_id(error, path)
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/validation.py", line 274, in handle_error_for_schema_with_id
invalid_config_key = parse_key_from_error_msg(error)
File "/Users/andy/virtualenvs/kode-venv/lib/python3.5/site-packages/compose/config/validation.py", line 333, in parse_key_from_error_msg
return error.message.split("'")[1]
IndexError: list index out of range
| 5,107 |
|||
docker/compose | docker__compose-6379 | eedbb28d5e577eb25595241592ea7b7bb97c902c | diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -36,7 +36,7 @@ def find_version(*file_paths):
'requests >= 2.6.1, != 2.11.0, != 2.12.2, != 2.18.0, < 2.21',
'texttable >= 0.9.0, < 0.10',
'websocket-client >= 0.32.0, < 1.0',
- 'docker >= 3.5.0, < 4.0',
+ 'docker >= 3.6.0, < 4.0',
'dockerpty >= 0.4.1, < 0.5',
'six >= 1.3.0, < 2',
'jsonschema >= 2.5.1, < 3',
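The version bump above pulls in the docker-py release that fixed `process_dockerfile` on Windows. The underlying problem is visible in the traceback's path: an extended-length path (the `\\?\` prefix) disables the normalization that would otherwise resolve `..` segments, so joining it with a relative Dockerfile path yields a path Windows cannot open. A sketch of the difference, using the paths from this report (runs on any platform via `ntpath`):

```python
import ntpath

context = r"C:\SertifiKiln\__BuildAndRunInDocker\Volume\DevBuildWorkspace"
dockerfile = "../../Scripts/Compose/DevWebDockerfile"

# naive join against a \\?\ path keeps the literal ".." and "/" segments --
# this is the shape of the path the OSError complained about
broken = "\\\\?\\" + context + "\\" + dockerfile

# resolving the relative segments first yields a path Windows can open
fixed = ntpath.normpath(ntpath.join(context, dockerfile))
```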
| compose crashes on relative paths after release
## Description of the issue
Receive a stack trace on build when relative paths are used in the build config. The release prior to this worked fine. I didn't get the version number, but I kept updating with every version that was being automatically downloaded by Docker CE on Windows 10.
## Context information (for bug reports)
**Output of `docker-compose version`**
```
docker-compose version 1.23.1, build b02f1306
docker-py version: 3.5.0
CPython version: 3.6.6
OpenSSL version: OpenSSL 1.0.2o 27 Mar 2018
```
**Output of `docker version`**
```
Client: Docker Engine - Community
Version: 18.09.0
API version: 1.39
Go version: go1.10.4
Git commit: 4d60db4
Built: Wed Nov 7 00:47:51 2018
OS/Arch: windows/amd64
Experimental: false
Server: Docker Engine - Community
Engine:
Version: 18.09.0
API version: 1.39 (minimum version 1.24)
Go version: go1.10.4
Git commit: 4d60db4
Built: Wed Nov 7 00:56:41 2018
OS/Arch: windows/amd64
Experimental: false
```
**Output of `docker-compose config`**
(Make sure to add the relevant `-f` and other flags)
```
networks:
devtestnet:
ipam:
config:
- subnet: 172.28.28.0/24
driver: default
services:
web:
build:
context: C:\SertifiKiln\__BuildAndRunInDocker\Volume\DevBuildWorkspace
dockerfile: ../../Scripts/Compose/DevWebDockerfile
container_name: c_devtest_com
expose:
- 80
- 443
image: devtest_com
networks:
devtestnet:
aliases:
- devtest.com
- login.devtest.com
volumes:
- C:\testKiln\__SandboxQAReleases\latest\Web:C:/inetpub/Applications/dotNet2.0/testEsign/Program:ro
- C:\SertifiKiln\__BuildAndRunInDocker\Volume\NAS:c:/NAS:rw
version: '3.7'
```
## Steps to reproduce the issue
1. docker-compose build
### Observed result
Stack trace
### Expected result
Build completes
### Stacktrace / full error message
```
docker-compose build
Building web
Traceback (most recent call last):
File "docker-compose", line 6, in <module>
File "compose\cli\main.py", line 71, in main
File "compose\cli\main.py", line 127, in perform_command
File "compose\cli\main.py", line 287, in build
File "compose\project.py", line 384, in build
File "compose\project.py", line 366, in build_service
File "compose\service.py", line 1080, in build
File "site-packages\docker\api\build.py", line 152, in build
File "site-packages\docker\api\build.py", line 346, in process_dockerfile
OSError: [Errno 22] Invalid argument: '\\\\?\\C:\\SertifiKiln\\__BuildAndRunInDocker\\Volume\\DevBuildWorkspace\\../../Scripts/Compose/DevWebDockerfile'
[18796] Failed to execute script docker-compose
```
## Additional information
OS version / distribution, `docker-compose` install method, etc.
Windows 10. Installed by going to https://store.docker.com/editions/community/docker-ce-desktop-windows, downloading the executable, and installing.
Tried every combination of enclosing the path in single quotes, enclosing it in double quotes, reversing the slashes, and using \\ in the ../../Scripts/Compose/DevWebDockerfile path.
 | Just checked: version 1.22.0 of docker-compose works fine.
So it is an issue in the latest version only. I downloaded https://github.com/docker/compose/releases/download/1.22.0/docker-compose-Windows-x86_64.exe, placed it into C:\Program Files\Docker\Docker\resources\bin, and ran that instead. The build worked great.
Output of different version commands
```
C:\SertifiKiln\__BuildAndRunInDocker\Volume\DevBuildWorkspace>docker version
Client: Docker Engine - Community
Version: 18.09.0
API version: 1.39
Go version: go1.10.4
Git commit: 4d60db4
Built: Wed Nov 7 00:47:51 2018
OS/Arch: windows/amd64
Experimental: false
Server: Docker Engine - Community
Engine:
Version: 18.09.0
API version: 1.39 (minimum version 1.24)
Go version: go1.10.4
Git commit: 4d60db4
Built: Wed Nov 7 00:56:41 2018
OS/Arch: windows/amd64
Experimental: false
C:\SertifiKiln\__BuildAndRunInDocker\Volume\DevBuildWorkspace>docker-compose version
docker-compose version 1.22.0, build f46880fe
docker-py version: 3.4.1
CPython version: 3.6.6
OpenSSL version: OpenSSL 1.0.2o 27 Mar 2018
```
Thanks for the report! We'll take a look asap.
+1, having the same issue; reverting back to Docker for Windows 18.06.1-ce (with built-in docker-compose version 1.22.0) seems to have fixed it. | 2018-11-28T20:14:50Z | [] | [] |
Traceback (most recent call last):
File "docker-compose", line 6, in <module>
File "compose\cli\main.py", line 71, in main
File "compose\cli\main.py", line 127, in perform_command
File "compose\cli\main.py", line 287, in build
File "compose\project.py", line 384, in build
File "compose\project.py", line 366, in build_service
File "compose\service.py", line 1080, in build
File "site-packages\docker\api\build.py", line 152, in build
File "site-packages\docker\api\build.py", line 346, in process_dockerfile
OSError: [Errno 22] Invalid argument: '\\\\?\\C:\\SertifiKiln\\__BuildAndRunInDocker\\Volume\\DevBuildWorkspace\\../../Scripts/Compose/DevWebDockerfile'
| 5,117 |
|||
docker/compose | docker__compose-6388 | cfa5d02b52b9bff3a07aae8eca407d03da375c69 | diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -36,7 +36,7 @@ def find_version(*file_paths):
'requests >= 2.6.1, != 2.11.0, != 2.12.2, != 2.18.0, < 2.21',
'texttable >= 0.9.0, < 0.10',
'websocket-client >= 0.32.0, < 1.0',
- 'docker >= 3.6.0, < 4.0',
+ 'docker[ssh] >= 3.6.0, < 4.0',
'dockerpty >= 0.4.1, < 0.5',
'six >= 1.3.0, < 2',
'jsonschema >= 2.5.1, < 3',
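The `docker[ssh]` extra above makes docker-py install `paramiko`, and docker-py ≥ 3.6.0 adds `ssh` to the protocols its `parse_host` accepts. A minimal re-implementation sketch of that validation step (not docker-py's actual code) showing why older versions rejected the URL:

```python
def parse_host(addr, valid_protos=("tcp", "unix", "npipe", "ssh")):
    # docker-py < 3.6.0 effectively had no "ssh" in its accepted set,
    # hence "Invalid bind address protocol: ssh://..."
    proto, sep, host = addr.partition("://")
    if not sep or proto not in valid_protos:
        raise ValueError("Invalid bind address protocol: " + addr)
    return proto, host

proto, host = parse_host("ssh://xfoxy.secinternal.local")
```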
| Docker Compose doesn't work when SSH connection used to remote Docker Engine
## Description of the issue
Just trying out the new SSH connection introduced in Docker 18.09 and I noticed an error when attempting to do `docker-compose up` whilst targeting a remote Docker Engine instance.
The error message below appears to indicate that Compose isn't aware of the SSH protocol for this purpose:
```
docker.errors.DockerException: Invalid bind address protocol: ssh://xfoxy.secinternal.local
[486] Failed to execute script docker-compose
```
## Context information (for bug reports)
**Output of `docker-compose version`**
```
docker-compose version 1.23.1, build b02f1306
docker-py version: 3.5.0
CPython version: 3.6.7
OpenSSL version: OpenSSL 1.1.0f 25 May 2017
```
**Output of `docker version`**
```
Client:
Version: 18.09.0
API version: 1.39
Go version: go1.10.4
Git commit: 4d60db4
Built: Wed Nov 7 00:49:01 2018
OS/Arch: linux/amd64
Experimental: false
Server: Docker Engine - Community
Engine:
Version: 18.09.0
API version: 1.39 (minimum version 1.12)
Go version: go1.10.4
Git commit: 4d60db4
Built: Wed Nov 7 00:16:44 2018
OS/Arch: linux/amd64
Experimental: false
```
**Output of `docker-compose config`**
(Make sure to add the relevant `-f` and other flags)
```
networks:
testnet: {}
services:
dradis:
image: raesene/dradis
networks:
testnet: null
ports:
- 3000/tcp
volumes:
- data:/data:rw
sectest:
image: raesene/sectest
networks:
testnet: null
ports:
- 22/tcp
volumes:
- data:/data:rw
version: '3.0'
volumes:
data: {}
```
## Steps to reproduce the issue
1. Configure a Docker client (18.09) to connect to a remote Docker engine instance via SSH
2. Run `docker-compose up` in a directory with a docker-compose.yml file.
3. Error occurs.
### Observed result
Error occurs
### Expected result
Docker compose contacts the remote docker engine instance to create the containers.
### Stacktrace / full error message
```
Traceback (most recent call last):
File "bin/docker-compose", line 6, in <module>
File "compose/cli/main.py", line 71, in main
File "compose/cli/main.py", line 124, in perform_command
File "compose/cli/command.py", line 42, in project_from_options
File "compose/cli/command.py", line 123, in get_project
File "compose/cli/command.py", line 94, in get_client
File "compose/cli/docker_client.py", line 127, in docker_client
File "site-packages/docker/api/client.py", line 118, in __init__
File "site-packages/docker/utils/utils.py", line 256, in parse_host
docker.errors.DockerException: Invalid bind address protocol: ssh://xfoxy.secinternal.local
[486] Failed to execute script docker-compose
```
## Additional information
Client is WSL (Ubuntu 18.04); server is Ubuntu 18.04 running Docker 18.09.
| Support for the SSH protocol will be added in the next version of Compose. https://github.com/docker/docker-py/issues/2159
Cool, thanks for the info. :) | 2018-12-01T00:25:22Z | [] | [] |
Traceback (most recent call last):
File "bin/docker-compose", line 6, in <module>
File "compose/cli/main.py", line 71, in main
File "compose/cli/main.py", line 124, in perform_command
File "compose/cli/command.py", line 42, in project_from_options
File "compose/cli/command.py", line 123, in get_project
File "compose/cli/command.py", line 94, in get_client
File "compose/cli/docker_client.py", line 127, in docker_client
File "site-packages/docker/api/client.py", line 118, in __init__
File "site-packages/docker/utils/utils.py", line 256, in parse_host
docker.errors.DockerException: Invalid bind address protocol: ssh://xfoxy.secinternal.local
| 5,118 |
|||
docker/compose | docker__compose-6914 | d7c7e21921fba349f2fc2fa702c07d87166d80c9 | diff --git a/compose/service.py b/compose/service.py
--- a/compose/service.py
+++ b/compose/service.py
@@ -1787,7 +1787,7 @@ def build(self, path, tag=None, quiet=False, fileobj=None,
command_builder.add_flag("--force-rm", forcerm)
command_builder.add_arg("--memory", container_limits.get("memory"))
command_builder.add_flag("--no-cache", nocache)
- command_builder.add_flag("--progress", self._progress)
+ command_builder.add_arg("--progress", self._progress)
command_builder.add_flag("--pull", pull)
command_builder.add_arg("--tag", tag)
command_builder.add_arg("--target", target)
| [BUG] --progress is not a flag but an argument
## Description of the issue
Because line [1790](https://github.com/docker/compose/blob/master/compose/service.py#L1790) adds `--progress` as a flag
```
command_builder.add_flag("--progress", self._progress)
```
instead of as an argument,
```
command_builder.add_arg("--progress", self._progress)
```
the stack trace below is produced.
Possible solution:
Replace `add_flag` with `add_arg` and the stack trace no longer occurs.
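The difference can be shown with a tiny stand-in for compose's command builder (hypothetical class, mirroring only the `add_flag`/`add_arg` distinction): a flag emits just the option name, so the value `tty` is dropped and the resulting `docker build` invocation is misparsed.

```python
class CommandBuilder:
    """Hypothetical stand-in for compose's internal CLI command builder."""

    def __init__(self, *executable):
        self.parts = list(executable)

    def add_flag(self, name, enabled):
        # A flag is a bare switch: only the option name is appended.
        if enabled:
            self.parts.append(name)

    def add_arg(self, name, value):
        # An argument carries its value alongside the option name.
        if value is not None:
            self.parts.extend([name, str(value)])

wrong = CommandBuilder('docker', 'build')
wrong.add_flag('--progress', 'tty')        # value silently dropped
print(' '.join(wrong.parts))               # docker build --progress

fixed = CommandBuilder('docker', 'build')
fixed.add_arg('--progress', 'tty')         # value preserved
print(' '.join(fixed.parts))               # docker build --progress tty
```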
## Context information (for bug reports)
**Output of `docker-compose version`**
```
docker-compose version 1.25.0dev, build d7c7e219
docker-py version: 4.0.1
CPython version: 3.7.4
OpenSSL version: OpenSSL 1.1.1c 28 May 2019
```
**Output of `docker version`**
```
Client: Docker Engine - Community
Version: 19.03.2
API version: 1.40
Go version: go1.12.8
Git commit: 6a30dfc
Built: Thu Aug 29 05:26:49 2019
OS/Arch: darwin/amd64
Experimental: false
Server: Docker Engine - Community
Engine:
Version: 19.03.2
API version: 1.40 (minimum version 1.12)
Go version: go1.12.8
Git commit: 6a30dfc
Built: Thu Aug 29 05:32:21 2019
OS/Arch: linux/amd64
Experimental: false
containerd:
Version: v1.2.6
GitCommit: 894b81a4b802e4eb2a91d1ce216b8817763c29fb
runc:
Version: 1.0.0-rc8
GitCommit: 425e105d5a03fabd737a126ad93d62a9eeede87f
docker-init:
Version: 0.18.0
GitCommit: fec3683
```
**Output of `docker-compose config`**
(Make sure to add the relevant `-f` and other flags)
```
services:
redis:
image: redis:alpine
web:
build:
context: /Users/lukas.hettwer/workspace/compose
ports:
- 5000:5000/tcp
version: '3.0'
```
## Steps to reproduce the issue
1. COMPOSE_DOCKER_CLI_BUILD=1 DOCKER_BUILDKIT=1 docker-compose build --progress tty
### Observed result
```
redis uses an image, skipping
WARNING: Native build is an experimental feature and could change at any time
Building web
"docker build" requires exactly 1 argument.
See 'docker build --help'.
Usage: docker build [OPTIONS] PATH | URL | -
Build an image from a Dockerfile
Traceback (most recent call last):
File "docker-compose", line 6, in <module>
File "compose/cli/main.py", line 71, in main
command()
File "compose/cli/main.py", line 127, in perform_command
handler(command, command_options)
File "compose/cli/main.py", line 303, in build
progress=options.get('--progress'),
File "compose/project.py", line 396, in build
build_service(service)
File "compose/project.py", line 379, in build_service
service.build(no_cache, pull, force_rm, memory, build_args, gzip, rm, silent, cli, progress)
File "compose/service.py", line 1105, in build
platform=self.platform,
File "compose/progress_stream.py", line 25, in stream_output
for event in utils.json_stream(output):
File "compose/utils.py", line 61, in split_buffer
for data in stream_as_text(stream):
File "compose/utils.py", line 37, in stream_as_text
for data in stream:
File "compose/service.py", line 1808, in build
while p.poll() is None:
FileNotFoundError: [Errno 2] No such file or directory: '/var/folders/n4/8ll6pclx5j9glg4wn6478tjr0000gp/T/tmpkvkcguz1'
[84845] Failed to execute script docker-compose
```
## Additional information
OS version / distribution, `docker-compose` install method, etc.
Darwin macbook-pro-092.local 18.7.0 Darwin Kernel Version 18.7.0: Tue Aug 20 16:57:14 PDT 2019; root:xnu-4903.271.2~2/RELEASE_X86_64 x86_64
| 2019-09-24T14:12:36Z | [] | [] |
Traceback (most recent call last):
File "docker-compose", line 6, in <module>
File "compose/cli/main.py", line 71, in main
command()
File "compose/cli/main.py", line 127, in perform_command
handler(command, command_options)
File "compose/cli/main.py", line 303, in build
progress=options.get('--progress'),
File "compose/project.py", line 396, in build
build_service(service)
File "compose/project.py", line 379, in build_service
service.build(no_cache, pull, force_rm, memory, build_args, gzip, rm, silent, cli, progress)
File "compose/service.py", line 1105, in build
platform=self.platform,
File "compose/progress_stream.py", line 25, in stream_output
for event in utils.json_stream(output):
File "compose/utils.py", line 61, in split_buffer
for data in stream_as_text(stream):
File "compose/utils.py", line 37, in stream_as_text
for data in stream:
File "compose/service.py", line 1808, in build
while p.poll() is None:
FileNotFoundError: [Errno 2] No such file or directory: '/var/folders/n4/8ll6pclx5j9glg4wn6478tjr0000gp/T/tmpkvkcguz1'
| 5,159 |
||||
docker/compose | docker__compose-7037 | 2887d82d1649825ab21ed1f455b3d5d33a8739d5 | diff --git a/compose/cli/command.py b/compose/cli/command.py
--- a/compose/cli/command.py
+++ b/compose/cli/command.py
@@ -159,15 +159,25 @@ def get_project(project_dir, config_path=None, project_name=None, verbose=False,
def execution_context_labels(config_details, environment_file):
extra_labels = [
- '{0}={1}'.format(LABEL_WORKING_DIR, os.path.abspath(config_details.working_dir)),
- '{0}={1}'.format(LABEL_CONFIG_FILES, config_files_label(config_details)),
+ '{0}={1}'.format(LABEL_WORKING_DIR, os.path.abspath(config_details.working_dir))
]
+
+ if not use_config_from_stdin(config_details):
+ extra_labels.append('{0}={1}'.format(LABEL_CONFIG_FILES, config_files_label(config_details)))
+
if environment_file is not None:
extra_labels.append('{0}={1}'.format(LABEL_ENVIRONMENT_FILE,
os.path.normpath(environment_file)))
return extra_labels
+def use_config_from_stdin(config_details):
+ for c in config_details.config_files:
+ if not c.filename:
+ return True
+ return False
+
+
def config_files_label(config_details):
return ",".join(
map(str, (os.path.normpath(c.filename) for c in config_details.config_files)))
| When piping compose config cannot run `sh`
## Description of the issue
I have a simple config file that I pipe to `docker-compose` through `stdin`. In my case it's JSON, since I create it dynamically with a Node.js script:
```json
{
"version": "3.3",
"services": {
"app": {
"image": "node:10",
"tty": true,
"stdin_open": true
}
}
}
```
This works flawlessly for `run` or `up`, as long as I don't try to use a shell. The following commands exit silently:
```sh
cat docker-compose.json | docker-compose -f- run --rm app sh
cat docker-compose.json | docker-compose -f- run --rm --entrypoint=/bin/sh app
```
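A plausible mechanism (my assumption — the issue itself doesn't diagnose it): `-f -` consumes stdin to read the config, so an interactive `run ... sh` afterwards finds the stream already exhausted. The contention is easy to demonstrate without Docker:

```python
import io

def consume_config_then_prompt(stdin):
    # First consumer: read the piped compose config (as `-f -` does).
    config = stdin.read()
    # Second consumer: the interactive shell now sees EOF immediately.
    interactive = stdin.readline()
    return config, interactive

config, interactive = consume_config_then_prompt(
    io.StringIO('{"version": "3.3"}\n'))
print(config.strip())       # {"version": "3.3"}
print(repr(interactive))    # '' -- nothing left for the interactive session
```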
## Context information (for bug reports)
**Output of `docker-compose version`**
```
docker-compose version 1.24.1, build 4667896b
docker-py version: 3.7.3
CPython version: 3.6.8
OpenSSL version: OpenSSL 1.1.0j 20 Nov 2018
```
**Output of `docker version`**
```
Client: Docker Engine - Community
Version: 19.03.2
API version: 1.40
Go version: go1.12.8
Git commit: 6a30dfca03
Built: Thu Aug 29 05:29:49 2019
OS/Arch: linux/amd64
Experimental: false
Server: Docker Engine - Community
Engine:
Version: 19.03.2
API version: 1.40 (minimum version 1.12)
Go version: go1.12.8
Git commit: 6a30dfc
Built: Thu Aug 29 05:32:21 2019
OS/Arch: linux/amd64
Experimental: false
containerd:
Version: v1.2.6
GitCommit: 894b81a4b802e4eb2a91d1ce216b8817763c29fb
runc:
Version: 1.0.0-rc8
GitCommit: 425e105d5a03fabd737a126ad93d62a9eeede87f
docker-init:
Version: 0.18.0
GitCommit: fec3683
```
**Output of `docker-compose config`**
(Make sure to add the relevant `-f` and other flags)
cat docker-compose.json | docker-compose -f- config
```yaml
services:
app:
image: node:10
stdin_open: true
tty: true
version: '3.3'
```
docker-compose -f docker-compose.json config
```yaml
services:
app:
image: node:10
stdin_open: true
tty: true
version: '3.3'
```
Piped compose files failing in 1.25.0 release
## Description of the issue
Similar to the bug reported in #6981 during pre-release, it appears that all piping of compose files into `docker-compose` is resulting in an error.
## Context information (for bug reports)
**Output of `docker-compose version`**
```
docker-compose version 1.25.0, build unknown
```
**Output of `docker version`**
```
Client:
Version: 19.03.5-ce
API version: 1.40
Go version: go1.13.4
Git commit: 633a0ea838
Built: Fri Nov 15 03:19:09 2019
OS/Arch: linux/amd64
Experimental: false
Server:
Engine:
Version: 19.03.4-ce
API version: 1.40 (minimum version 1.12)
Go version: go1.13.1
Git commit: 9013bf583a
Built: Sat Oct 19 04:39:38 2019
OS/Arch: linux/amd64
Experimental: false
containerd:
Version: v1.3.0.m
GitCommit: d50db0a42053864a270f648048f9a8b4f24eced3.m
runc:
Version: 1.0.0-rc9
GitCommit: d736ef14f0288d6993a1845745d6756cfc9ddd5a
docker-init:
Version: 0.18.0
GitCommit: fec3683
```
**Output of `docker-compose config`**
(Make sure to add the relevant `-f` and other flags)
```
services:
foo:
image: busybox
version: '3.3'
```
## Steps to reproduce the issue
`cat docker-compose.yaml | docker-compose -f - up` (which works in 1.24.x when `docker-compose.yaml` is a valid compose file)
### Observed result
The compose process fails (trace included below)
### Expected result
The compose process should be the same as `docker-compose -f docker-compose.yaml up` (which still works as expected)
### Stacktrace / full error message
```
Traceback (most recent call last):
File "/usr/bin/docker-compose", line 11, in <module>
load_entry_point('docker-compose==1.25.0', 'console_scripts', 'docker-compose')()
File "/usr/lib/python3.8/site-packages/compose/cli/main.py", line 72, in main
command()
File "/usr/lib/python3.8/site-packages/compose/cli/main.py", line 125, in perform_command
project = project_from_options('.', options)
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 53, in project_from_options
return get_project(
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 156, in get_project
execution_context_labels(config_details, environment_file),
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 163, in execution_context_labels
'{0}={1}'.format(LABEL_CONFIG_FILES, config_files_label(config_details)),
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 172, in config_files_label
return ",".join(
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 173, in <genexpr>
map(str, (os.path.normpath(c.filename) for c in config_details.config_files)))
File "/usr/lib/python3.8/posixpath.py", line 336, in normpath
path = os.fspath(path)
TypeError: expected str, bytes or os.PathLike object, not NoneType
```
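The failing expression is easy to reproduce in isolation: when the config is piped via `-f -`, the parsed config file has no filename, and `os.path.normpath(None)` raises. A minimal sketch of the failure, plus the guard the fix introduces (skip the label when any config came from stdin):

```python
import os

def config_files_label(filenames):
    # Mirrors the failing expression from compose/cli/command.py.
    return ','.join(os.path.normpath(f) for f in filenames)

def safe_config_files_label(filenames):
    # The fix: skip the label entirely when any config came from stdin
    # (its filename is None).
    if any(f is None for f in filenames):
        return None
    return config_files_label(filenames)

print(config_files_label(['./docker-compose.yaml']))  # docker-compose.yaml

try:
    config_files_label([None])   # the `-f -` case
except TypeError as exc:
    print('TypeError:', exc)

print(safe_config_files_label([None]))  # None
```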
## Additional information
**OS version**: Arch Linux
**install method**: `community/docker-compose` package (installed via `pacman`)
I think all piping is now broken as of one of the newer commits.
I used to be able to do `echo "$composefile" | docker-compose -f- up -d`,
but now I get a traceback:
```
Traceback (most recent call last):
File "/home/atomi/.local/bin/docker-compose", line 8, in <module>
sys.exit(main())
File "/home/atomi/.local/lib/python3.6/site-packages/compose/cli/main.py", line 71, in main
command()
File "/home/atomi/.local/lib/python3.6/site-packages/compose/cli/main.py", line 124, in perform_command
project = project_from_options('.', options)
File "/home/atomi/.local/lib/python3.6/site-packages/compose/cli/command.py", line 64, in project_from_options
environment_file=environment_file
File "/home/atomi/.local/lib/python3.6/site-packages/compose/cli/command.py", line 156, in get_project
execution_context_labels(config_details, environment_file),
File "/home/atomi/.local/lib/python3.6/site-packages/compose/cli/command.py", line 163, in execution_context_labels
'{0}={1}'.format(LABEL_CONFIG_FILES, config_files_label(config_details)),
File "/home/atomi/.local/lib/python3.6/site-packages/compose/cli/command.py", line 173, in config_files_label
map(str, (os.path.normpath(c.filename) for c in config_details.config_files)))
File "/home/atomi/.local/lib/python3.6/site-packages/compose/cli/command.py", line 173, in <genexpr>
map(str, (os.path.normpath(c.filename) for c in config_details.config_files)))
File "/usr/lib/python3.6/posixpath.py", line 340, in normpath
path = os.fspath(path)
TypeError: expected str, bytes or os.PathLike object, not NoneType
```
I hope I'm not hijacking this issue as I think they are likely related.
I've tested this command `echo "$composefile" | docker-compose -f- up -d` through the various tags.
tag | status
--- | ---
bump-1.24.0 | works
bump-1.24.1 | works
bump-1.25-rc1 | works
bump-1.25-rc2 | works
bump-1.25-rc3 | fails
bump-1.25-rc4 | fails
I was able to reproduce this issue, will investigate tomorrow
(bug was introduced by https://github.com/docker/compose/commit/dbe4d7323eb8d38c98cec2810469bcd65266edb0)
@ndeloof this was a pressing enough issue for my team that I went ahead and pushed up PR #7035. This is a large enough regression, IMO, that it warrants a patch-level release as soon as possible, so let me know if there's anything that I can do to help! | 2019-11-20T08:01:51Z | [] | [] |
Traceback (most recent call last):
File "/usr/bin/docker-compose", line 11, in <module>
load_entry_point('docker-compose==1.25.0', 'console_scripts', 'docker-compose')()
File "/usr/lib/python3.8/site-packages/compose/cli/main.py", line 72, in main
command()
File "/usr/lib/python3.8/site-packages/compose/cli/main.py", line 125, in perform_command
project = project_from_options('.', options)
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 53, in project_from_options
return get_project(
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 156, in get_project
execution_context_labels(config_details, environment_file),
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 163, in execution_context_labels
'{0}={1}'.format(LABEL_CONFIG_FILES, config_files_label(config_details)),
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 172, in config_files_label
return ",".join(
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 173, in <genexpr>
map(str, (os.path.normpath(c.filename) for c in config_details.config_files)))
File "/usr/lib/python3.8/posixpath.py", line 336, in normpath
path = os.fspath(path)
TypeError: expected str, bytes or os.PathLike object, not NoneType
| 5,171 |
|||
docker/compose | docker__compose-7071 | 4038169d96ca176542c5265e21f4b18ee64b1dd9 | diff --git a/compose/__init__.py b/compose/__init__.py
--- a/compose/__init__.py
+++ b/compose/__init__.py
@@ -1,4 +1,4 @@
from __future__ import absolute_import
from __future__ import unicode_literals
-__version__ = '1.25.0'
+__version__ = '1.25.1-rc1'
diff --git a/compose/cli/command.py b/compose/cli/command.py
--- a/compose/cli/command.py
+++ b/compose/cli/command.py
@@ -159,15 +159,25 @@ def get_project(project_dir, config_path=None, project_name=None, verbose=False,
def execution_context_labels(config_details, environment_file):
extra_labels = [
- '{0}={1}'.format(LABEL_WORKING_DIR, os.path.abspath(config_details.working_dir)),
- '{0}={1}'.format(LABEL_CONFIG_FILES, config_files_label(config_details)),
+ '{0}={1}'.format(LABEL_WORKING_DIR, os.path.abspath(config_details.working_dir))
]
+
+ if not use_config_from_stdin(config_details):
+ extra_labels.append('{0}={1}'.format(LABEL_CONFIG_FILES, config_files_label(config_details)))
+
if environment_file is not None:
extra_labels.append('{0}={1}'.format(LABEL_ENVIRONMENT_FILE,
os.path.normpath(environment_file)))
return extra_labels
+def use_config_from_stdin(config_details):
+ for c in config_details.config_files:
+ if not c.filename:
+ return True
+ return False
+
+
def config_files_label(config_details):
return ",".join(
map(str, (os.path.normpath(c.filename) for c in config_details.config_files)))
diff --git a/compose/progress_stream.py b/compose/progress_stream.py
--- a/compose/progress_stream.py
+++ b/compose/progress_stream.py
@@ -114,3 +114,13 @@ def get_digest_from_push(events):
if digest:
return digest
return None
+
+
+def read_status(event):
+ status = event['status'].lower()
+ if 'progressDetail' in event:
+ detail = event['progressDetail']
+ if 'current' in detail and 'total' in detail:
+ percentage = float(detail['current']) / float(detail['total'])
+ status = '{} ({:.1%})'.format(status, percentage)
+ return status
diff --git a/compose/project.py b/compose/project.py
--- a/compose/project.py
+++ b/compose/project.py
@@ -11,6 +11,8 @@
import enum
import six
from docker.errors import APIError
+from docker.errors import ImageNotFound
+from docker.errors import NotFound
from docker.utils import version_lt
from . import parallel
@@ -25,6 +27,7 @@
from .network import build_networks
from .network import get_networks
from .network import ProjectNetworks
+from .progress_stream import read_status
from .service import BuildAction
from .service import ContainerNetworkMode
from .service import ContainerPidMode
@@ -619,49 +622,68 @@ def _get_convergence_plans(self, services, strategy, always_recreate_deps=False)
def pull(self, service_names=None, ignore_pull_failures=False, parallel_pull=False, silent=False,
include_deps=False):
services = self.get_services(service_names, include_deps)
- images_to_build = {service.image_name for service in services if service.can_be_built()}
- services_to_pull = [service for service in services if service.image_name not in images_to_build]
-
- msg = not silent and 'Pulling' or None
if parallel_pull:
- def pull_service(service):
- strm = service.pull(ignore_pull_failures, True, stream=True)
- if strm is None: # Attempting to pull service with no `image` key is a no-op
- return
+ self.parallel_pull(services, silent=silent)
- writer = parallel.get_stream_writer()
+ else:
+ must_build = []
+ for service in services:
+ try:
+ service.pull(ignore_pull_failures, silent=silent)
+ except (ImageNotFound, NotFound):
+ if service.can_be_built():
+ must_build.append(service.name)
+ else:
+ raise
+
+ if len(must_build):
+ log.warning('Some service image(s) must be built from source by running:\n'
+ ' docker-compose build {}'
+ .format(' '.join(must_build)))
+
+ def parallel_pull(self, services, ignore_pull_failures=False, silent=False):
+ msg = 'Pulling' if not silent else None
+ must_build = []
+ def pull_service(service):
+ strm = service.pull(ignore_pull_failures, True, stream=True)
+
+ if strm is None: # Attempting to pull service with no `image` key is a no-op
+ return
+
+ try:
+ writer = parallel.get_stream_writer()
for event in strm:
if 'status' not in event:
continue
- status = event['status'].lower()
- if 'progressDetail' in event:
- detail = event['progressDetail']
- if 'current' in detail and 'total' in detail:
- percentage = float(detail['current']) / float(detail['total'])
- status = '{} ({:.1%})'.format(status, percentage)
-
+ status = read_status(event)
writer.write(
msg, service.name, truncate_string(status), lambda s: s
)
+ except (ImageNotFound, NotFound):
+ if service.can_be_built():
+ must_build.append(service.name)
+ else:
+ raise
- _, errors = parallel.parallel_execute(
- services_to_pull,
- pull_service,
- operator.attrgetter('name'),
- msg,
- limit=5,
- )
- if len(errors):
- combined_errors = '\n'.join([
- e.decode('utf-8') if isinstance(e, six.binary_type) else e for e in errors.values()
- ])
- raise ProjectError(combined_errors)
+ _, errors = parallel.parallel_execute(
+ services,
+ pull_service,
+ operator.attrgetter('name'),
+ msg,
+ limit=5,
+ )
- else:
- for service in services_to_pull:
- service.pull(ignore_pull_failures, silent=silent)
+ if len(must_build):
+ log.warning('Some service image(s) must be built from source by running:\n'
+ ' docker-compose build {}'
+ .format(' '.join(must_build)))
+ if len(errors):
+ combined_errors = '\n'.join([
+ e.decode('utf-8') if isinstance(e, six.binary_type) else e for e in errors.values()
+ ])
+ raise ProjectError(combined_errors)
def push(self, service_names=None, ignore_push_failures=False):
unique_images = set()
diff --git a/script/release/release/downloader.py b/script/release/release/downloader.py
--- a/script/release/release/downloader.py
+++ b/script/release/release/downloader.py
@@ -55,6 +55,7 @@ def _download(self, url, full_dest):
def download_all(self, version):
files = {
+ 'docker-compose-Darwin-x86_64.tgz': None,
'docker-compose-Darwin-x86_64': None,
'docker-compose-Linux-x86_64': None,
'docker-compose-Windows-x86_64.exe': None,
| Piped compose files failing in 1.25.0 release
## Description of the issue
Similar to the bug reported in #6981 during pre-release, it appears that all piping of compose files into `docker-compose` is resulting in an error.
## Context information (for bug reports)
**Output of `docker-compose version`**
```
docker-compose version 1.25.0, build unknown
```
**Output of `docker version`**
```
Client:
Version: 19.03.5-ce
API version: 1.40
Go version: go1.13.4
Git commit: 633a0ea838
Built: Fri Nov 15 03:19:09 2019
OS/Arch: linux/amd64
Experimental: false
Server:
Engine:
Version: 19.03.4-ce
API version: 1.40 (minimum version 1.12)
Go version: go1.13.1
Git commit: 9013bf583a
Built: Sat Oct 19 04:39:38 2019
OS/Arch: linux/amd64
Experimental: false
containerd:
Version: v1.3.0.m
GitCommit: d50db0a42053864a270f648048f9a8b4f24eced3.m
runc:
Version: 1.0.0-rc9
GitCommit: d736ef14f0288d6993a1845745d6756cfc9ddd5a
docker-init:
Version: 0.18.0
GitCommit: fec3683
```
**Output of `docker-compose config`**
(Make sure to add the relevant `-f` and other flags)
```
services:
foo:
image: busybox
version: '3.3'
```
## Steps to reproduce the issue
`cat docker-compose.yaml | docker-compose -f - up` (which works in 1.24.x when `docker-compose.yaml` is a valid compose file)
### Observed result
The compose process fails (trace included below)
### Expected result
The compose process should be the same as `docker-compose -f docker-compose.yaml up` (which still works as expected)
### Stacktrace / full error message
```
Traceback (most recent call last):
File "/usr/bin/docker-compose", line 11, in <module>
load_entry_point('docker-compose==1.25.0', 'console_scripts', 'docker-compose')()
File "/usr/lib/python3.8/site-packages/compose/cli/main.py", line 72, in main
command()
File "/usr/lib/python3.8/site-packages/compose/cli/main.py", line 125, in perform_command
project = project_from_options('.', options)
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 53, in project_from_options
return get_project(
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 156, in get_project
execution_context_labels(config_details, environment_file),
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 163, in execution_context_labels
'{0}={1}'.format(LABEL_CONFIG_FILES, config_files_label(config_details)),
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 172, in config_files_label
return ",".join(
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 173, in <genexpr>
map(str, (os.path.normpath(c.filename) for c in config_details.config_files)))
File "/usr/lib/python3.8/posixpath.py", line 336, in normpath
path = os.fspath(path)
TypeError: expected str, bytes or os.PathLike object, not NoneType
```
## Additional information
**OS version**: Arch Linux
**install method**: `community/docker-compose` package (installed via `pacman`)
| 2019-11-29T18:21:21Z | [] | [] |
Traceback (most recent call last):
File "/usr/bin/docker-compose", line 11, in <module>
load_entry_point('docker-compose==1.25.0', 'console_scripts', 'docker-compose')()
File "/usr/lib/python3.8/site-packages/compose/cli/main.py", line 72, in main
command()
File "/usr/lib/python3.8/site-packages/compose/cli/main.py", line 125, in perform_command
project = project_from_options('.', options)
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 53, in project_from_options
return get_project(
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 156, in get_project
execution_context_labels(config_details, environment_file),
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 163, in execution_context_labels
'{0}={1}'.format(LABEL_CONFIG_FILES, config_files_label(config_details)),
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 172, in config_files_label
return ",".join(
File "/usr/lib/python3.8/site-packages/compose/cli/command.py", line 173, in <genexpr>
map(str, (os.path.normpath(c.filename) for c in config_details.config_files)))
File "/usr/lib/python3.8/posixpath.py", line 336, in normpath
path = os.fspath(path)
TypeError: expected str, bytes or os.PathLike object, not NoneType
| 5,175 |
||||
docker/compose | docker__compose-7684 | 25d773c924b6337b63802c0d9649f800b542bf49 | diff --git a/compose/service.py b/compose/service.py
--- a/compose/service.py
+++ b/compose/service.py
@@ -1855,7 +1855,9 @@ def build(self, path, tag=None, quiet=False, fileobj=None,
magic_word = "Successfully built "
appear = False
- with subprocess.Popen(args, stdout=subprocess.PIPE, universal_newlines=True) as p:
+ with subprocess.Popen(args, stdout=subprocess.PIPE,
+ stderr=subprocess.PIPE,
+ universal_newlines=True) as p:
while True:
line = p.stdout.readline()
if not line:
@@ -1864,6 +1866,10 @@ def build(self, path, tag=None, quiet=False, fileobj=None,
appear = True
yield json.dumps({"stream": line})
+ err = p.stderr.readline().strip()
+ if err:
+ raise StreamOutputError(err)
+
with open(iidfile) as f:
line = f.readline()
image_id = line.split(":")[1].strip()
| Attempt to open non-existing file when building with docker-cli fails.
## Description of the issue
When building images with the Docker CLI (`COMPOSE_DOCKER_CLI_BUILD=1`) instead of the Docker API, a failure in the build stage causes docker-compose to try to open a file that does not exist.
**Output of `docker-compose version`**
```
docker-compose version 1.25.5, build 8a1c60f6
docker-py version: 4.1.0
CPython version: 3.7.5
OpenSSL version: OpenSSL 1.1.1f 31 Mar 2020
```
**Output of `docker version`**
```
Client: Docker Engine - Community
Version: 19.03.8
API version: 1.40
Go version: go1.12.17
Git commit: afacb8b
Built: Wed Mar 11 01:21:11 2020
OS/Arch: darwin/amd64
Experimental: false
Server: Docker Engine - Community
Engine:
Version: 19.03.8
API version: 1.40 (minimum version 1.12)
Go version: go1.12.17
Git commit: afacb8b
Built: Wed Mar 11 01:29:16 2020
OS/Arch: linux/amd64
Experimental: false
containerd:
Version: v1.2.13
GitCommit: 7ad184331fa3e55e52b890ea95e65ba581ae3429
runc:
Version: 1.0.0-rc10
GitCommit: dc9208a3303feef5b3839f4323d9beb36df0a9dd
docker-init:
Version: 0.18.0
GitCommit: fec3683
```
**Output of `docker-compose config`**
(Make sure to add the relevant `-f` and other flags)
```
services:
buggy-image:
build:
context: /Users/erfan/Workspace/bug
image: buggy-image
version: '3.2'
```
## Steps to reproduce the issue
1. Make a Dockerfile which fails to build like this:
```
FROM python:3.6-slim
RUN exit 2
```
2. Make a docker-compose.yml file:
```
version: '3.2'
services:
buggy-image:
image: buggy-image
build:
context: .
```
3. Run this command:
```
COMPOSE_DOCKER_CLI_BUILD=1 docker-compose build
```
### Observed result
```
Building buggy-image
Sending build context to Docker daemon 3.072kB
Step 1/2 : FROM python:3.6-slim
---> 42ab29dc5406
Step 2/2 : RUN exit 2
---> Running in 4d92c34d14d0
The command '/bin/sh -c exit 2' returned a non-zero code: 2
Traceback (most recent call last):
File "docker-compose", line 6, in <module>
File "compose/cli/main.py", line 72, in main
File "compose/cli/main.py", line 128, in perform_command
File "compose/cli/main.py", line 303, in build
File "compose/project.py", line 403, in build
File "compose/project.py", line 385, in build_service
File "compose/service.py", line 1110, in build
File "compose/progress_stream.py", line 25, in stream_output
File "compose/utils.py", line 61, in split_buffer
File "compose/utils.py", line 37, in stream_as_text
File "compose/service.py", line 1818, in build
FileNotFoundError: [Errno 2] No such file or directory: '/var/folders/t5/vy9hz8rd33v1cdlr07lb8d0r0000gn/T/tmpysvdrdqh'
[17620] Failed to execute script docker-compose
```
### Expected result
When using the Docker API, the output would be:
```
ERROR: Service 'buggy-image' failed to build: The command '/bin/sh -c exit 2' returned a non-zero code: 2
```
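The patch in this record addresses the problem by also capturing stderr from the `docker build` subprocess and surfacing it as an error, instead of proceeding to read an iidfile that was never written. A minimal sketch of that behaviour (using `sh` to stand in for `docker build`, so the example runs without Docker; assumes a POSIX shell is available):

```python
import json
import subprocess

def run_build(args):
    # Stream stdout as build progress; if the subprocess wrote to stderr,
    # raise instead of silently continuing to a missing iidfile.
    with subprocess.Popen(args, stdout=subprocess.PIPE,
                          stderr=subprocess.PIPE,
                          universal_newlines=True) as proc:
        for line in proc.stdout:
            yield json.dumps({'stream': line})
        err = proc.stderr.read().strip()
        if err:
            raise RuntimeError(err)

# Simulate a failing build step.
try:
    for event in run_build(['sh', '-c', "echo 'Step 1/2 : FROM ...'; "
                                        "echo 'build failed' >&2; exit 2"]):
        print(event)
    print('iidfile would be read here')
except RuntimeError as exc:
    print('ERROR:', exc)
```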
| 2020-08-18T15:52:44Z | [] | [] |
Traceback (most recent call last):
File "docker-compose", line 6, in <module>
File "compose/cli/main.py", line 72, in main
File "compose/cli/main.py", line 128, in perform_command
File "compose/cli/main.py", line 303, in build
File "compose/project.py", line 403, in build
File "compose/project.py", line 385, in build_service
File "compose/service.py", line 1110, in build
File "compose/progress_stream.py", line 25, in stream_output
File "compose/utils.py", line 61, in split_buffer
File "compose/utils.py", line 37, in stream_as_text
File "compose/service.py", line 1818, in build
FileNotFoundError: [Errno 2] No such file or directory: '/var/folders/t5/vy9hz8rd33v1cdlr07lb8d0r0000gn/T/tmpysvdrdqh'
| 5,192 |
||||
docker/compose | docker__compose-897 | d79dc85fa56c26bfa67e4ec3f229c44caa8dbab7 | diff --git a/compose/service.py b/compose/service.py
--- a/compose/service.py
+++ b/compose/service.py
@@ -95,6 +95,10 @@ def __init__(self, name, client=None, project='default', links=None, external_li
if 'image' in options and 'build' in options:
raise ConfigError('Service %s has both an image and build path specified. A service can either be built to image or use an existing image, not both.' % name)
+ for filename in get_env_files(options):
+ if not os.path.exists(filename):
+ raise ConfigError("Couldn't find env file for service %s: %s" % (name, filename))
+
supported_options = DOCKER_CONFIG_KEYS + ['build', 'expose',
'external_links']
@@ -617,15 +621,18 @@ def split_port(port):
return internal_port, (external_ip, external_port or None)
+def get_env_files(options):
+ env_files = options.get('env_file', [])
+ if not isinstance(env_files, list):
+ env_files = [env_files]
+ return env_files
+
+
def merge_environment(options):
env = {}
- if 'env_file' in options:
- if isinstance(options['env_file'], list):
- for f in options['env_file']:
- env.update(env_vars_from_file(f))
- else:
- env.update(env_vars_from_file(options['env_file']))
+ for f in get_env_files(options):
+ env.update(env_vars_from_file(f))
if 'environment' in options:
if isinstance(options['environment'], list):
| Stack trace and unfriendly error when an env file doesn't exist
```
Traceback (most recent call last):
File "<string>", line 3, in <module>
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.cli.main", line 31, in main
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.cli.docopt_command", line 21, in sys_dispatch
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.cli.command", line 28, in dispatch
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.cli.docopt_command", line 24, in dispatch
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.cli.command", line 60, in perform_command
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.cli.main", line 445, in up
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.project", line 183, in up
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.service", line 262, in recreate_containers
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.service", line 293, in recreate_container
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.service", line 222, in create_container
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.service", line 448, in _get_container_create_options
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.service", line 628, in merge_environment
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.service", line 660, in env_vars_from_file
IOError: [Errno 2] No such file or directory: 'web.env'
```
| 2015-01-28T21:22:49Z | [] | [] |
Traceback (most recent call last):
File "<string>", line 3, in <module>
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.cli.main", line 31, in main
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.cli.docopt_command", line 21, in sys_dispatch
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.cli.command", line 28, in dispatch
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.cli.docopt_command", line 24, in dispatch
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.cli.command", line 60, in perform_command
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.cli.main", line 445, in up
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.project", line 183, in up
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.service", line 262, in recreate_containers
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.service", line 293, in recreate_container
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.service", line 222, in create_container
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.service", line 448, in _get_container_create_options
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.service", line 628, in merge_environment
File "/Users/aanand/work/docker/fig/build/docker-compose/out00-PYZ.pyz/compose.service", line 660, in env_vars_from_file
IOError: [Errno 2] No such file or directory: 'web.env'
| 5,212 |
||||
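The patch above validates env files up front in `Service.__init__`, turning the late `IOError` into a readable `ConfigError`. A minimal standalone sketch of that pattern (with `ConfigError` standing in for compose's real exception class):

```python
import os


class ConfigError(ValueError):
    """Stands in for compose's ConfigError in this sketch."""


def get_env_files(options):
    """Normalize the `env_file` option: it may be a single path or a list."""
    env_files = options.get("env_file", [])
    if not isinstance(env_files, list):
        env_files = [env_files]
    return env_files


def validate_env_files(name, options):
    """Fail at config-load time with a friendly message, instead of letting
    container creation die later with a bare IOError."""
    for filename in get_env_files(options):
        if not os.path.exists(filename):
            raise ConfigError(
                "Couldn't find env file for service %s: %s" % (name, filename)
            )
```

Normalizing to a list once in `get_env_files` also removes the duplicated `isinstance` branching that `merge_environment` previously carried.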
docker/compose | docker__compose-970 | 8610adcaf37babb29d1fc347f8c310186c4f4534 | diff --git a/compose/cli/main.py b/compose/cli/main.py
--- a/compose/cli/main.py
+++ b/compose/cli/main.py
@@ -1,26 +1,25 @@
from __future__ import print_function
from __future__ import unicode_literals
+from inspect import getdoc
+from operator import attrgetter
import logging
-import sys
import re
import signal
-from operator import attrgetter
+import sys
-from inspect import getdoc
+from docker.errors import APIError
import dockerpty
from .. import __version__
from ..project import NoSuchService, ConfigurationError
-from ..service import BuildError, CannotBeScaledError
+from ..service import BuildError, CannotBeScaledError, parse_environment
from .command import Command
+from .docopt_command import NoSuchCommand
+from .errors import UserError
from .formatter import Formatter
from .log_printer import LogPrinter
from .utils import yesno
-from docker.errors import APIError
-from .errors import UserError
-from .docopt_command import NoSuchCommand
-
log = logging.getLogger(__name__)
@@ -316,11 +315,10 @@ def run(self, project, options):
}
if options['-e']:
- for option in options['-e']:
- if 'environment' not in service.options:
- service.options['environment'] = {}
- k, v = option.split('=', 1)
- service.options['environment'][k] = v
+ # Merge environment from config with -e command line
+ container_options['environment'] = dict(
+ parse_environment(service.options.get('environment')),
+ **parse_environment(options['-e']))
if options['--entrypoint']:
container_options['entrypoint'] = options.get('--entrypoint')
diff --git a/compose/service.py b/compose/service.py
--- a/compose/service.py
+++ b/compose/service.py
@@ -8,6 +8,7 @@
import sys
from docker.errors import APIError
+import six
from .container import Container, get_container_name
from .progress_stream import stream_output, StreamOutputError
@@ -450,7 +451,7 @@ def _get_container_create_options(self, override_options, one_off=False):
(parse_volume_spec(v).internal, {})
for v in container_options['volumes'])
- container_options['environment'] = merge_environment(container_options)
+ container_options['environment'] = build_environment(container_options)
if self.can_be_built():
container_options['image'] = self.full_name
@@ -629,19 +630,28 @@ def get_env_files(options):
return env_files
-def merge_environment(options):
+def build_environment(options):
env = {}
for f in get_env_files(options):
env.update(env_vars_from_file(f))
- if 'environment' in options:
- if isinstance(options['environment'], list):
- env.update(dict(split_env(e) for e in options['environment']))
- else:
- env.update(options['environment'])
+ env.update(parse_environment(options.get('environment')))
+ return dict(resolve_env(k, v) for k, v in six.iteritems(env))
+
+
+def parse_environment(environment):
+ if not environment:
+ return {}
+
+ if isinstance(environment, list):
+ return dict(split_env(e) for e in environment)
+
+ if isinstance(environment, dict):
+ return environment
- return dict(resolve_env(k, v) for k, v in env.items())
+ raise ConfigError("environment \"%s\" must be a list or mapping," %
+ environment)
def split_env(env):
| Setting environment variable on CLI raises exception when service has environment definition
It looks like I can't set an environment variable in the fig.yml and on the console at the same time.
fig version: `1.0.1`
python version: `2.7 (homebrew)`
OS: OSX `10.10.2`
_fig.yml:_
```yaml
bliep:
  image: ubuntu:14.04
  environment:
    - SECRET_KEY
```
_full output:_
```
$ fig run -e BLA=bla bliep
Traceback (most recent call last):
File "/usr/local/bin/fig", line 9, in <module>
load_entry_point('fig==1.0.1', 'console_scripts', 'fig')()
File "/usr/local/lib/python2.7/site-packages/fig/cli/main.py", line 31, in main
command.sys_dispatch()
File "/usr/local/lib/python2.7/site-packages/fig/cli/docopt_command.py", line 21, in sys_dispatch
self.dispatch(sys.argv[1:], None)
File "/usr/local/lib/python2.7/site-packages/fig/cli/command.py", line 28, in dispatch
super(Command, self).dispatch(*args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/fig/cli/docopt_command.py", line 24, in dispatch
self.perform_command(*self.parse(argv, global_options))
File "/usr/local/lib/python2.7/site-packages/fig/cli/command.py", line 56, in perform_command
handler(project, command_options)
File "/usr/local/lib/python2.7/site-packages/fig/cli/main.py", line 312, in run
service.options['environment'][k] = v
TypeError: list indices must be integers, not unicode
```
| hmm, I thought this was fixed. Maybe it's in 1.1 or still in a PR somewhere.
I believe the workaround for now is to use the mapping form in the fig.yml
``` yaml
image: ubuntu:14.04
environment:
SECRET_KEY:
```
sorry, nope:
```
$ cat fig.yml
bliep:
image: ubuntu:14.04
environment:
- BLOP:
$ fig run -e BLOP=bla bliep
Traceback (most recent call last):
File "/usr/local/bin/fig", line 9, in <module>
load_entry_point('fig==1.0.1', 'console_scripts', 'fig')()
File "/usr/local/lib/python2.7/site-packages/fig/cli/main.py", line 31, in main
command.sys_dispatch()
File "/usr/local/lib/python2.7/site-packages/fig/cli/docopt_command.py", line 21, in sys_dispatch
self.dispatch(sys.argv[1:], None)
File "/usr/local/lib/python2.7/site-packages/fig/cli/command.py", line 28, in dispatch
super(Command, self).dispatch(*args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/fig/cli/docopt_command.py", line 24, in dispatch
self.perform_command(*self.parse(argv, global_options))
File "/usr/local/lib/python2.7/site-packages/fig/cli/command.py", line 56, in perform_command
handler(project, command_options)
File "/usr/local/lib/python2.7/site-packages/fig/cli/main.py", line 312, in run
service.options['environment'][k] = v
TypeError: list indices must be integers, not unicode
```
You did not make the change I suggested. You still have a list in `environment`. You need to remove the dash.
sorry, you are right and yes, that workaround works.
| 2015-02-14T19:21:28Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/bin/fig", line 9, in <module>
load_entry_point('fig==1.0.1', 'console_scripts', 'fig')()
File "/usr/local/lib/python2.7/site-packages/fig/cli/main.py", line 31, in main
command.sys_dispatch()
File "/usr/local/lib/python2.7/site-packages/fig/cli/docopt_command.py", line 21, in sys_dispatch
self.dispatch(sys.argv[1:], None)
File "/usr/local/lib/python2.7/site-packages/fig/cli/command.py", line 28, in dispatch
super(Command, self).dispatch(*args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/fig/cli/docopt_command.py", line 24, in dispatch
self.perform_command(*self.parse(argv, global_options))
File "/usr/local/lib/python2.7/site-packages/fig/cli/command.py", line 56, in perform_command
handler(project, command_options)
File "/usr/local/lib/python2.7/site-packages/fig/cli/main.py", line 312, in run
service.options['environment'][k] = v
TypeError: list indices must be integers, not unicode
| 5,214 |
|||
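The `parse_environment` helper introduced in the patch above accepts the three shapes `environment` can take (absent, list of `KEY=value` strings, mapping) and lets CLI `-e` values win via a plain dict merge. A standalone sketch of the same logic:

```python
def split_env(env):
    """Split 'KEY=value'; a bare 'KEY' maps to None (inherit from host)."""
    if "=" in env:
        key, value = env.split("=", 1)
        return key, value
    return env, None


def parse_environment(environment):
    """Accept the three shapes the `environment` option may take."""
    if not environment:
        return {}
    if isinstance(environment, list):
        return dict(split_env(e) for e in environment)
    if isinstance(environment, dict):
        return environment
    raise ValueError('environment "%s" must be a list or mapping' % environment)


# The `run -e` fix above merges config environment with CLI flags,
# with the CLI winning on conflicts:
merged = dict(
    parse_environment(["SECRET_KEY", "DEBUG=1"]),  # from fig.yml
    **parse_environment(["DEBUG=0"])               # from `-e DEBUG=0`
)
print(merged)  # {'SECRET_KEY': None, 'DEBUG': '0'}
```

Normalizing both sides to a dict first is what removes the original `TypeError: list indices must be integers` — the old code indexed into whatever shape the config happened to use.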
explosion/spaCy | explosion__spaCy-1448 | 490ad3eaf070f2e210869c37b70edf3fcd504da7 | diff --git a/examples/training/train_new_entity_type.py b/examples/training/train_new_entity_type.py
--- a/examples/training/train_new_entity_type.py
+++ b/examples/training/train_new_entity_type.py
@@ -56,8 +56,7 @@ def train_ner(nlp, train_data, output_dir):
losses = {}
for batch in minibatch(get_gold_parses(nlp.make_doc, train_data), size=3):
docs, golds = zip(*batch)
- nlp.update(docs, golds, losses=losses, sgd=optimizer, update_shared=True,
- drop=0.35)
+ nlp.update(docs, golds, losses=losses, sgd=optimizer, drop=0.35)
print(losses)
if not output_dir:
return
@@ -100,9 +99,10 @@ def main(model_name, output_directory=None):
)
]
- nlp.pipeline.append(TokenVectorEncoder(nlp.vocab))
- nlp.pipeline.append(NeuralEntityRecognizer(nlp.vocab))
- nlp.pipeline[-1].add_label('ANIMAL')
+ nlp.add_pipe(TokenVectorEncoder(nlp.vocab))
+ ner = NeuralEntityRecognizer(nlp.vocab)
+ ner.add_label('ANIMAL')
+ nlp.add_pipe(ner)
train_ner(nlp, train_data, output_directory)
# Test that the entity is recognized
| TokenVectorEncoder object is not iterable when running example in 2.0 alpha
I'm trying to run one of the examples in 2.0.0 alpha, for extending a pre-existing model with
custom NER tags, available here [1].
Here is the error I get:
```
$ python train_new_entity_type.py en othersame
Creating initial model en
Traceback (most recent call last):
File "train_new_entity_type.py", line 124, in <module>
plac.call(main)
File "/home/data/experim/spc/sp2env/lib/python2.7/site-packages/plac_core.py", line 328, in call
cmd, result = parser.consume(arglist)
File "/home/data/experim/spc/sp2env/lib/python2.7/site-packages/plac_core.py", line 207, in consume
return cmd, self.func(*(args + varargs + extraopts), **kwargs)
File "train_new_entity_type.py", line 106, in main
train_ner(nlp, train_data, output_directory)
File "train_new_entity_type.py", line 53, in train_ner
optimizer = nlp.begin_training(lambda: [])
File "/home/data/experim/spc/sp2env/lib/python2.7/site-packages/spacy/language.py", line 410, in begin_training
for name, proc in self.pipeline:
TypeError: 'TokenVectorEncoder' object is not iterable
```
I expected this to work, as it's already documented here [2];
the models and the spaCy install are both recent, fresh installs (21st October).
## Your Environment
```
Info about spaCy
Python version 2.7.13
Platform Linux-4.11.12-100.fc24.x86_64-x86_64-with-fedora-24-Twenty_Four
spaCy version 2.0.0a17
Location /home/data/experim/spc/sp2env/lib/python2.7/site-packages/spacy
Models en_core_web_sm, en_core_web_lg
```
* Operating System: Fedora Linux
* Python Version Used: Python 2.7.13 reproducible with 3.5.3
* spaCy Version Used: 2.0.0a17
* Environment Information:
[ 1] https://github.com/explosion/spaCy/blob/develop/examples/training/train_new_entity_type.py
[ 2] https://alpha.spacy.io/usage/training#example-new-entity-type
| I think you might be using an outdated model that still has the `tensorizer` in the pipeline. The latest alpha version now has a handy command that lets you check that all models are compatible and up to date, and shows you which ones need to be upgraded:
```bash
spacy validate
```
So simply downloading the latest `en_core_web_sm` or `en_core_web_lg` model should hopefully fix this.
I hope this is the case, but here is the output of validate:
```
$ spacy validate
Installed models (spaCy v2.0.0a17)
/home/data/experim/spc/sp2env/lib/python2.7/site-packages/spacy
TYPE NAME MODEL VERSION
package en-core-web-sm en_core_web_sm 2.0.0a7 ✔
package en-core-web-lg en_core_web_lg 2.0.0a1 ✔
link en_core_web_lg en_core_web_lg 2.0.0a1 ✔
link en_core_web_sm en_core_web_sm 2.0.0a7 ✔
```
I have been really looking forward to adding custom NER tags, so I'm eager
to get it working.
Thanks for updating! I think I found the issue – try removing this line:
https://github.com/explosion/spaCy/blob/490ad3eaf070f2e210869c37b70edf3fcd504da7/examples/training/train_new_entity_type.py#L103
I think we may have forgotten to push the updated version of the example for the latest alpha release and models, sorry about that.
**Edit:** Since the [pipeline architecture has changed](https://alpha.spacy.io/usage/processing-pipelines#pipelines) and `nlp.pipeline` entries are now `(name, func)` tuples, this line also has to be adjusted:
https://github.com/explosion/spaCy/blob/490ad3eaf070f2e210869c37b70edf3fcd504da7/examples/training/train_new_entity_type.py#L104
```python
nlp.add_pipe(NeuralEntityRecognizer(nlp.vocab))
```
Will test this as soon as we have time and adjust it accordingly!
Thanks @ines
I also had to change the add_label line to:
```
nlp.pipeline[nlp.pipe_names.index('ner')][1].add_label('ANIMAL')
```
Not quite sure that's how it's supposed to be done, but it works for me.
Ah, thanks! 👍 You should be able to simply use `nlp.get_pipe()` to get a pipeline component, e.g.:
```python
ner = nlp.get_pipe('ner')
ner.add_label('ANIMAL')
```
Or, probably cleaner:
```python
ner = NeuralEntityRecognizer(nlp.vocab)
ner.add_label('ANIMAL')
nlp.add_pipe(ner)
```
That works too, thanks!
Yay! Thanks for your help and feedback. If it's all working for you now, feel free to submit a PR to `develop` btw (otherwise, we're happy to take care of this later as well). | 2017-10-22T13:20:10Z | [] | [] |
Traceback (most recent call last):
File "train_new_entity_type.py", line 124, in <module>
plac.call(main)
File "/home/data/experim/spc/sp2env/lib/python2.7/site-packages/plac_core.py", line 328, in call
cmd, result = parser.consume(arglist)
File "/home/data/experim/spc/sp2env/lib/python2.7/site-packages/plac_core.py", line 207, in consume
return cmd, self.func(*(args + varargs + extraopts), **kwargs)
File "train_new_entity_type.py", line 106, in main
train_ner(nlp, train_data, output_directory)
File "train_new_entity_type.py", line 53, in train_ner
optimizer = nlp.begin_training(lambda: [])
File "/home/data/experim/spc/sp2env/lib/python2.7/site-packages/spacy/language.py", line 410, in begin_training
for name, proc in self.pipeline:
TypeError: 'TokenVectorEncoder' object is not iterable
| 5,217 |
|||
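The root cause discussed above is that spaCy v2's pipeline became a list of `(name, component)` tuples rather than bare components, so `for name, proc in self.pipeline` fails on a raw component. A toy model of that layout (hypothetical classes, not the real spaCy API) shows why `nlp.add_pipe(...)` plus `nlp.get_pipe('ner')` is the clean pattern:

```python
class MiniPipeline:
    """Toy model of spaCy v2's pipeline: a list of (name, component) tuples."""

    def __init__(self):
        self.pipeline = []

    def add_pipe(self, component, name=None):
        # Registering via add_pipe stores the (name, component) pair,
        # so iteration like `for name, proc in self.pipeline` works.
        name = name or type(component).__name__.lower()
        self.pipeline.append((name, component))

    @property
    def pipe_names(self):
        return [name for name, _ in self.pipeline]

    def get_pipe(self, name):
        for pipe_name, component in self.pipeline:
            if pipe_name == name:
                return component
        raise KeyError(name)


class Ner:
    """Stand-in component with an add_label method."""

    def __init__(self):
        self.labels = []

    def add_label(self, label):
        self.labels.append(label)


nlp = MiniPipeline()
nlp.add_pipe(Ner(), name="ner")
nlp.get_pipe("ner").add_label("ANIMAL")
print(nlp.pipe_names)  # ['ner']
```

Appending a bare component with `nlp.pipeline.append(component)`, v1-style, is exactly what produced the `'TokenVectorEncoder' object is not iterable` error when v2 tried to unpack it.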
explosion/spaCy | explosion__spaCy-2949 | 02fc73ca53effae501c97879e1e95dc156d24f0d | diff --git a/spacy/compat.py b/spacy/compat.py
--- a/spacy/compat.py
+++ b/spacy/compat.py
@@ -1,6 +1,7 @@
# coding: utf8
from __future__ import unicode_literals
+import os
import sys
import ujson
import itertools
@@ -80,12 +81,18 @@ def getattr_(obj, name, *default):
def symlink_to(orig, dest):
- if is_python2 and is_windows:
+ if is_windows:
import subprocess
subprocess.call(['mklink', '/d', path2str(orig), path2str(dest)], shell=True)
else:
orig.symlink_to(dest)
+def symlink_remove(link):
+ # https://stackoverflow.com/questions/26554135/cant-delete-unlink-a-symlink-to-directory-in-python-windows
+ if( os.path.isdir(path2str(link)) and is_windows ): # this should only be on Py2.7 and windows
+ os.rmdir(path2str(link))
+ else:
+ os.unlink(path2str(link))
def is_config(python2=None, python3=None, windows=None, linux=None, osx=None):
return (python2 in (None, is_python2) and
| Provide compat that works with on Windows with Py 3.7.1
## Feature description
The script that creates the symbolic link doesn't work by default when running `python -m spacy link en_core_web_sm en` -- normally, if DOS is the default shell, a call to `mklink` just works. Via this module, however, it does not, and it is not a permission issue nor a system policy restriction, as running `mklink /d ...` directly from the same shell/command line works.
This fails regardless of whether it is run under a virtualenv.
Instead a user is presented with the following error message.
```
(venv) C:\g\py\spacy> python -m spacy link en_core_web_sm en
C:\Program Files\Python37\lib\importlib\_bootstrap.py:219: RuntimeWarning: cymem.cymem.Pool size changed, may indicate binary incompatibility. Expected 48 from C header, got 64 from PyObject
return f(*args, **kwds)
C:\Program Files\Python37\lib\importlib\_bootstrap.py:219: RuntimeWarning: cymem.cymem.Address size changed, may indicate binary incompatibility. Expected 24 from C header, got 40 from PyObject
return f(*args, **kwds)
Error: Couldn't link model to 'en'
Creating a symlink in spacy/data failed. Make sure you have the required
permissions and try re-running the command as admin, or use a
virtualenv. You can still import the model as a module and call its
load() method, or create the symlink manually.
C:\g\py\spacy\venv\lib\site-packages\en_core_web_sm -->
C:\g\py\spacy\venv\lib\site-packages\spacy\data\en
Traceback (most recent call last):
File "C:\Program Files\Python37\lib\runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "C:\Program Files\Python37\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "C:\g\py\spacy\venv\lib\site-packages\spacy\__main__.py", line 31, in <module>
plac.call(commands[command], sys.argv[1:])
File "C:\g\py\spacy\venv\lib\site-packages\plac_core.py", line 328, in call
cmd, result = parser.consume(arglist)
File "C:\g\py\spacy\venv\lib\site-packages\plac_core.py", line 207, in consume
return cmd, self.func(*(args + varargs + extraopts), **kwargs)
File "C:\g\py\spacy\venv\lib\site-packages\spacy\cli\link.py", line 48, in link
symlink_to(link_path, model_path)
File "C:\g\py\spacy\venv\lib\site-packages\spacy\compat.py", line 87, in symlink_to
orig.symlink_to(dest)
File "C:\Program Files\Python37\lib\pathlib.py", line 1320, in symlink_to
self._accessor.symlink(target, self, target_is_directory)
OSError: symbolic link privilege not held
```
The following command works regardless, with no permission or policy issue. The error message indicating `symbolic link privilege not held` is misleading.
```
cmd /c mklink /d c:\path\to\symlink c:\target\file
```
A suggestion is to change the respective lines in `venv\Lib\site-packages\spacy\compat.py` (`symlink_to`)
```
def symlink_to(orig, dest):
if is_python2 and is_windows:
import subprocess
subprocess.call(['mklink', '/d', path2str(orig), path2str(dest)], shell=True)
else:
orig.symlink_to(dest)
```
to
```
def symlink_to(orig, dest):
if (is_python2 or is_python3) and is_windows:
import subprocess
subprocess.call(['mklink', '/d', path2str(orig), path2str(dest)], shell=True)
else:
orig.symlink_to(dest)
```
| 2018-11-20T04:47:55Z | [] | [] |
Traceback (most recent call last):
File "C:\Program Files\Python37\lib\runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "C:\Program Files\Python37\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "C:\g\py\spacy\venv\lib\site-packages\spacy\__main__.py", line 31, in <module>
plac.call(commands[command], sys.argv[1:])
File "C:\g\py\spacy\venv\lib\site-packages\plac_core.py", line 328, in call
cmd, result = parser.consume(arglist)
File "C:\g\py\spacy\venv\lib\site-packages\plac_core.py", line 207, in consume
return cmd, self.func(*(args + varargs + extraopts), **kwargs)
File "C:\g\py\spacy\venv\lib\site-packages\spacy\cli\link.py", line 48, in link
symlink_to(link_path, model_path)
File "C:\g\py\spacy\venv\lib\site-packages\spacy\compat.py", line 87, in symlink_to
orig.symlink_to(dest)
File "C:\Program Files\Python37\lib\pathlib.py", line 1320, in symlink_to
self._accessor.symlink(target, self, target_is_directory)
OSError: symbolic link privilege not held
| 5,231 |
||||
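The `symlink_remove` helper added in the patch above encodes a Windows quirk: a symlink pointing at a directory must be removed with `rmdir`, while `unlink` is correct elsewhere. A standalone sketch of that fix, with the platform check exposed as a parameter purely for illustration (the real helper checks `is_windows` itself):

```python
import os


def symlink_remove(link, is_windows=(os.name == "nt")):
    """Remove a symlink.

    On Windows (notably under Python 2.7), a symlink to a directory must
    be removed with os.rmdir(); os.unlink() raises. On POSIX, os.unlink()
    is always the right call for a symlink.
    """
    if os.path.isdir(link) and is_windows:
        os.rmdir(link)
    else:
        os.unlink(link)
```

Note that `os.path.isdir()` follows the link, so on POSIX the `is_windows` guard is what keeps `rmdir` from ever being applied to a symlink there.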
explosion/spaCy | explosion__spaCy-3038 | b1c8731b4d72aea588eb843769e20aa6749593cc | diff --git a/examples/training/train_textcat.py b/examples/training/train_textcat.py
--- a/examples/training/train_textcat.py
+++ b/examples/training/train_textcat.py
@@ -44,6 +44,7 @@ def main(model=None, output_dir=None, n_iter=20, n_texts=2000):
# add label to text classifier
textcat.add_label("POSITIVE")
+ textcat.add_label("NEGATIVE")
# load the IMDB dataset
print("Loading IMDB data...")
@@ -64,7 +65,7 @@ def main(model=None, output_dir=None, n_iter=20, n_texts=2000):
for i in range(n_iter):
losses = {}
# batch up the examples using spaCy's minibatch
- batches = minibatch(train_data, size=compounding(4.0, 32.0, 1.001))
+ batches = minibatch(train_data, size=compounding(4.0, 16.0, 1.001))
for batch in batches:
texts, annotations = zip(*batch)
nlp.update(texts, annotations, sgd=optimizer, drop=0.2, losses=losses)
@@ -106,22 +107,24 @@ def load_data(limit=0, split=0.8):
random.shuffle(train_data)
train_data = train_data[-limit:]
texts, labels = zip(*train_data)
- cats = [{"POSITIVE": bool(y)} for y in labels]
+ cats = [{"POSITIVE": bool(y), "NEGATIVE": not bool(y)} for y in labels]
split = int(len(train_data) * split)
return (texts[:split], cats[:split]), (texts[split:], cats[split:])
def evaluate(tokenizer, textcat, texts, cats):
docs = (tokenizer(text) for text in texts)
- tp = 1e-8 # True positives
+ tp = 0.0 # True positives
fp = 1e-8 # False positives
fn = 1e-8 # False negatives
- tn = 1e-8 # True negatives
+ tn = 0.0 # True negatives
for i, doc in enumerate(textcat.pipe(docs)):
gold = cats[i]
for label, score in doc.cats.items():
if label not in gold:
continue
+ if label == "NEGATIVE":
+ continue
if score >= 0.5 and gold[label] >= 0.5:
tp += 1.0
elif score >= 0.5 and gold[label] < 0.5:
diff --git a/spacy/_ml.py b/spacy/_ml.py
--- a/spacy/_ml.py
+++ b/spacy/_ml.py
@@ -5,7 +5,7 @@
from thinc.v2v import Model, Maxout, Softmax, Affine, ReLu
from thinc.i2v import HashEmbed, StaticVectors
from thinc.t2t import ExtractWindow, ParametricAttention
-from thinc.t2v import Pooling, sum_pool
+from thinc.t2v import Pooling, sum_pool, mean_pool
from thinc.misc import Residual
from thinc.misc import LayerNorm as LN
from thinc.misc import FeatureExtracter
@@ -575,6 +575,32 @@ def build_text_classifier(nr_class, width=64, **cfg):
return model
+def build_simple_cnn_text_classifier(tok2vec, nr_class, exclusive_classes=True, **cfg):
+ """
+ Build a simple CNN text classifier, given a token-to-vector model as inputs.
+ If exclusive_classes=True, a softmax non-linearity is applied, so that the
+ outputs sum to 1. If exclusive_classes=False, a logistic non-linearity
+ is applied instead, so that outputs are in the range [0, 1].
+ """
+ with Model.define_operators({">>": chain}):
+ if exclusive_classes:
+ output_layer = Softmax(nr_class, tok2vec.nO)
+ else:
+ output_layer = (
+ zero_init(Affine(nr_class, tok2vec.nO))
+ >> logistic
+ )
+ model = (
+ tok2vec
+ >> flatten_add_lengths
+ >> Pooling(mean_pool)
+ >> output_layer
+ )
+ model.tok2vec = chain(tok2vec, flatten)
+ model.nO = nr_class
+ return model
+
+
@layerize
def flatten(seqs, drop=0.0):
ops = Model.ops
| problem training text categorizer on GPU
Great work!
I installed spaCy in an isolated virtualenv using `pip install spacy[cuda90]`. Everything works well and
training starts normally, but when GPU memory starts to be used (as reported by `nvidia-smi`) I get the error below and the process ends. The environment is well configured, and I'm able to use my GPUs with TensorFlow.
```
Traceback (most recent call last):
File "core_nlp/spacy/text_categorizer.py", line 168, in <module>
nlp = TextCategorizer.train(filter_by_support(category_data, 60), model, None, use_averages=True, epochs=50, debug=False)
File "core_nlp/spacy/text_categorizer.py", line 80, in train
nlp_.update(texts, annotations, sgd=optimizer, drop=dropout, losses=losses)
File "/home/neo/pyenvs/spacyp3/lib/python3.5/site-packages/spacy/language.py", line 421, in update
proc.update(docs, golds, drop=drop, sgd=get_grads, losses=losses)
File "pipeline.pyx", line 876, in spacy.pipeline.TextCategorizer.update
File "/home/neo/pyenvs/spacyp3/lib/python3.5/site-packages/thinc/api.py", line 61, in begin_update
X, inc_layer_grad = layer.begin_update(X, drop=drop)
File "/home/neo/pyenvs/spacyp3/lib/python3.5/site-packages/thinc/api.py", line 176, in begin_update
values = [fwd(X, *a, **k) for fwd in forward]
File "/home/neo/pyenvs/spacyp3/lib/python3.5/site-packages/thinc/api.py", line 176, in <listcomp>
values = [fwd(X, *a, **k) for fwd in forward]
File "/home/neo/pyenvs/spacyp3/lib/python3.5/site-packages/thinc/api.py", line 258, in wrap
output = func(*args, **kwargs)
File "/home/neo/pyenvs/spacyp3/lib/python3.5/site-packages/thinc/api.py", line 61, in begin_update
X, inc_layer_grad = layer.begin_update(X, drop=drop)
File "/home/neo/pyenvs/spacyp3/lib/python3.5/site-packages/spacy/_ml.py", line 102, in _preprocess_doc
keys = ops.xp.concatenate(keys)
File "/home/neo/pyenvs/spacyp3/lib/python3.5/site-packages/cupy/manipulation/join.py", line 49, in concatenate
return core.concatenate_method(tup, axis)
File "cupy/core/core.pyx", line 2740, in cupy.core.core.concatenate_method
File "cupy/core/core.pyx", line 2753, in cupy.core.core.concatenate_method
TypeError: Only cupy arrays can be concatenated
```
This is a copy of my requirements.txt
```
certifi==2018.10.15
chardet==3.0.4
cupy-cuda90==5.0.0
cycler==0.10.0
cymem==2.0.2
cytoolz==0.9.0.1
dill==0.2.8.2
fastrlock==0.4
idna==2.7
kiwisolver==1.0.1
matplotlib==3.0.2
msgpack==0.5.6
msgpack-numpy==0.4.3.2
murmurhash==1.0.1
numpy==1.15.4
pandas==0.23.4
plac==0.9.6
preshed==2.0.1
pyparsing==2.3.0
python-dateutil==2.7.5
pytz==2018.7
regex==2018.1.10
requests==2.20.1
scikit-learn==0.20.0
scipy==1.1.0
six==1.11.0
spacy==2.0.16
thinc==6.12.0
thinc-gpu-ops==0.0.3
toolz==0.9.0
tqdm==4.28.1
twitter-text-python==1.1.0
ujson==1.35
urllib3==1.24.1
wrapt==1.10.11
```
## Your Environment
* Operating System: Ubuntu 16.04
* Python Version Used: 3.5.2
* spaCy Version Used: 2.0.16
* Environment Information: CUDA 9.0 driver working with TensorFlow
| 2018-12-10T12:33:18Z | [] | [] |
Traceback (most recent call last):
File "core_nlp/spacy/text_categorizer.py", line 168, in <module>
nlp = TextCategorizer.train(filter_by_support(category_data, 60), model, None, use_averages=True, epochs=50, debug=False)
File "core_nlp/spacy/text_categorizer.py", line 80, in train
nlp_.update(texts, annotations, sgd=optimizer, drop=dropout, losses=losses)
File "/home/neo/pyenvs/spacyp3/lib/python3.5/site-packages/spacy/language.py", line 421, in update
proc.update(docs, golds, drop=drop, sgd=get_grads, losses=losses)
File "pipeline.pyx", line 876, in spacy.pipeline.TextCategorizer.update
File "/home/neo/pyenvs/spacyp3/lib/python3.5/site-packages/thinc/api.py", line 61, in begin_update
X, inc_layer_grad = layer.begin_update(X, drop=drop)
File "/home/neo/pyenvs/spacyp3/lib/python3.5/site-packages/thinc/api.py", line 176, in begin_update
values = [fwd(X, *a, **k) for fwd in forward]
File "/home/neo/pyenvs/spacyp3/lib/python3.5/site-packages/thinc/api.py", line 176, in <listcomp>
values = [fwd(X, *a, **k) for fwd in forward]
File "/home/neo/pyenvs/spacyp3/lib/python3.5/site-packages/thinc/api.py", line 258, in wrap
output = func(*args, **kwargs)
File "/home/neo/pyenvs/spacyp3/lib/python3.5/site-packages/thinc/api.py", line 61, in begin_update
X, inc_layer_grad = layer.begin_update(X, drop=drop)
File "/home/neo/pyenvs/spacyp3/lib/python3.5/site-packages/spacy/_ml.py", line 102, in _preprocess_doc
keys = ops.xp.concatenate(keys)
File "/home/neo/pyenvs/spacyp3/lib/python3.5/site-packages/cupy/manipulation/join.py", line 49, in concatenate
return core.concatenate_method(tup, axis)
File "cupy/core/core.pyx", line 2740, in cupy.core.core.concatenate_method
File "cupy/core/core.pyx", line 2753, in cupy.core.core.concatenate_method
TypeError: Only cupy arrays can be concatenated
| 5,234 |
||||
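The `build_simple_cnn_text_classifier` patch above picks its output non-linearity by `exclusive_classes`: a `Softmax` layer when labels compete (POSITIVE vs NEGATIVE), a zero-initialized `Affine >> logistic` when each label is judged independently. The numerical difference between the two can be sketched in plain Python (illustrative only, not thinc code):

```python
import math


def softmax(scores):
    """Mutually exclusive classes: one distribution, outputs sum to 1."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]


def logistic(scores):
    """Independent (non-exclusive) classes: each score squashed to [0, 1]."""
    return [1.0 / (1.0 + math.exp(-s)) for s in scores]


print(softmax([0.0, 0.0]))   # [0.5, 0.5] -- forced to sum to 1
print(logistic([0.0, 0.0]))  # [0.5, 0.5] -- each label scored on its own
```

With softmax, raising one label's score necessarily lowers the others' probabilities; with logistic, several labels can be near 1.0 at once — which is why the example gains an explicit `NEGATIVE` label once the classes are exclusive.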
explosion/spaCy | explosion__spaCy-3065 | 52f3c950044756e325659514a3600965c137aed1 | diff --git a/spacy/lang/ja/__init__.py b/spacy/lang/ja/__init__.py
--- a/spacy/lang/ja/__init__.py
+++ b/spacy/lang/ja/__init__.py
@@ -31,7 +31,7 @@ def resolve_pos(token):
Under Universal Dependencies, sometimes the same Unidic POS tag can
be mapped differently depending on the literal token or its context
- in the sentence. This function adds information to the POS tag to
+ in the sentence. This function adds information to the POS tag to
resolve ambiguous mappings.
"""
@@ -74,6 +74,7 @@ def __init__(self, cls, nlp=None):
MeCab = try_mecab_import()
self.tokenizer = MeCab.Tagger()
+ self.tokenizer.parseToNode('') # see #2901
def __call__(self, text):
dtokens = detailed_tokens(self.tokenizer, text)
| Japanese (MeCab): the very first call of `nlp` fails
## How to reproduce the behaviour
<!-- Include a code example or the steps that led to the problem. Please try to be as specific as possible. -->
The very first call of `nlp` (probably for any string) results in an error:
```python
>>> import spacy
>>> nlp = spacy.blank('ja')
>>> nlp('pythonが大好きです')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python3.6/site-packages/spacy/language.py", line 340, in __call__
doc = self.make_doc(text)
File "/usr/local/lib/python3.6/site-packages/spacy/lang/ja/__init__.py", line 117, in make_doc
return self.tokenizer(text)
File "/usr/local/lib/python3.6/site-packages/spacy/lang/ja/__init__.py", line 81, in __call__
doc = Doc(self.vocab, words=words, spaces=[False]*len(words))
File "doc.pyx", line 176, in spacy.tokens.doc.Doc.__init__
File "doc.pyx", line 559, in spacy.tokens.doc.Doc.push_back
ValueError: [E031] Invalid token: empty string ('') at position 0.
>>> nlp('pythonが大好きです')
pythonが大好きです
>>>
```
MeCab is installed according to https://github.com/SamuraiT/mecab-python3#user-content-installation-and-usage. The example from there works:
```python
>>> import MeCab
>>> mecab = MeCab.Tagger ("-Ochasen")
>>> print(mecab.parse("pythonが大好きです"))
python python python 名詞-固有名詞-組織
が ガ が 助詞-格助詞-一般
大好き ダイスキ 大好き 名詞-形容動詞語幹
です デス です 助動詞 特殊・デス 基本形
EOS
>>>
```
## Info about spaCy
* **spaCy version:** 2.0.16
* **Platform:** Linux-3.16.0-4-amd64-x86_64-with-debian-8.10
* **Python version:** 3.6.4
| I can't reproduce this, but it seems like you might not have Unidic installed. That wouldn't explain why it fails once and then works afterwards, but could you verify that Unidic is installed? It's required because the Universal Dependencies mappings are based on it. The documentation should be clearer about this soon (see #3001).
(I assume you are using IPAdic because IPAdic has a `chasen` output format in the config file and Unidic does not.)
@polm Thanks for the reply. I'm using IPAdic indeed: as I said, MeCab is installed according to https://github.com/SamuraiT/mecab-python3#user-content-installation-and-usage, so the command `apt-get install mecab mecab-ipadic-utf8 libmecab-dev swig` was used before `pip install mecab-python3`.
Unfortunately, using `unidic-mecab` instead of `mecab-ipadic-utf8` doesn't help. The error is the same:
```python
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/usr/local/lib/python3.7/site-packages/spacy/language.py", line 340, in __call__
doc = self.make_doc(text)
File "/usr/local/lib/python3.7/site-packages/spacy/lang/ja/__init__.py", line 117, in make_doc
return self.tokenizer(text)
File "/usr/local/lib/python3.7/site-packages/spacy/lang/ja/__init__.py", line 81, in __call__
doc = Doc(self.vocab, words=words, spaces=[False]*len(words))
File "doc.pyx", line 176, in spacy.tokens.doc.Doc.__init__
File "doc.pyx", line 566, in spacy.tokens.doc.Doc.push_back
ValueError: [E031] Invalid token: empty string ('') at position 0.
```
Here's a Dockerfile to reproduce (run `docker build .` from the directory with it):
```Dockerfile
FROM python@sha256:a837aefef8f2553789bc70436621160af4da65d95b0fb02d5032557f887d0ca5
# python:3.7.1 (Debian Stretch), Sun Nov 11 17:29:43 +05 2018
RUN apt-get update
RUN apt-get install -y --no-install-recommends \
mecab=0.996-3.1 \
unidic-mecab=2.1.2~dfsg-6 \
libmecab-dev=0.996-3.1 \
swig=3.0.10-1.1
RUN pip install spacy==2.0.18 mecab-python3==0.996.1
RUN python -c 'import spacy; nlp = spacy.blank("ja"); \
doc = nlp("pythonが大好きです"); print(doc)'
```
Thanks for the Dockerfile and sorry for my late reply - I was hoping this was related to the recent issue with the mecab-python3 upgrade. I tried downgrading the version of that used in the Dockerfile and it didn't help though, so it seems it's unrelated.
The good news is I found a minimal example of the issue; the bad news is it makes no sense. Here's code that shows the issue:
```
import MeCab
tagger = MeCab.Tagger()
def print_tokens(text):
node = tagger.parseToNode(text).next
while node.posid != 0:
print(node.surface[:])
node = node.next
print('-----')
print_tokens("日本語だよ")
print_tokens("日本語だよ")
```
Output
```


だ
よ
-----
日本
語
だ
よ
-----
```
Please note the first two lines are blank, which is not only wrong but very strange. Poking around internally, data besides the literal token (like POS and lemma) seem OK, just the surface disappears. This issue does not affect the `.parse` function, only `.parseToNode`.
As the sample code above indicates, this isn't a problem with spaCy - it's an issue with either the Python library or MeCab itself. Since neither of those is updated often, and I can't reproduce it on my Arch machine, it might be Debian-specific.
I'll keep looking into this and see if I can figure out what's up.
I think I found the cause of the issue - it looks like it's the same as SamuraiT/mecab-python3#3, which is caused by taku910/mecab#5. The problem is in Mecab itself and is fixed in the latest git version, but not in the latest release (which is from 2013).
@kbulygin For you individually I think the best solution is to install Mecab from source. Sorry I can't provide a better solution, but given how long Mecab has been without a release it's hard to say when the next one would be or how long it would take distribution packages to be updated.
For spaCy we might want to post a warning somewhere... not sure about the best place to do that. Since this is an issue in the C++ lib we can't really deal with the version in `requirements.txt` or something.
I will poke the main Mecab project and maybe some distribution maintainers.
@polm Thanks for the investigation. Personally, I'm not much affected by the error now: I just thought the behaviour was worth reporting, as the error seemed related to how initialization is done on the spaCy side.
As far as I understand from https://github.com/taku910/mecab/issues/5#issuecomment-80528189, another safe solution is probably just to warm the tokenizer up, like:
```python
import spacy
nlp = spacy.blank('ja')
nlp('') # no exception is raised
# `nlp` is ready now.
```
This can also be done in spaCy itself, of course, unless it's considered too hacky.
Thanks for getting to the bottom of this!
Running warm-up code on initialization of the `Japanese` language class sounds fine. It could process an empty string, or just make the most minimal call into Mecab required to prevent this issue. We can then add a comment pointing to this issue and remove the hack once it's fixed in Mecab itself.
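A minimal sketch of that warm-up idea, assuming the `mecab-python3` API. The class name and structure here are illustrative, not spaCy's actual Japanese tokenizer, and the import is guarded because MeCab may not be installed:

```python
# Illustrative sketch of the proposed warm-up, not spaCy's real tokenizer.
class WarmedUpJapaneseTokenizer:
    def __init__(self):
        try:
            import MeCab  # third-party; may be absent
        except ImportError:
            self.tagger = None
            return
        self.tagger = MeCab.Tagger()
        # Throwaway call so the first real parse isn't affected by the
        # empty-surface bug in released MeCab (taku910/mecab#5).
        self.tagger.parseToNode("")

    def ready(self):
        return self.tagger is not None
```

If MeCab is available, the first real `parseToNode` call after construction should then return correct surfaces even on affected builds.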
The next release of python3-mecab from PyPI will include pre-built wheels that bundle a version of MeCab with this bug fixed. Unfortunately I cannot promise exactly when that will happen, but I'm shooting for "before the end of December". | 2018-12-18T12:55:54Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python3.6/site-packages/spacy/language.py", line 340, in __call__
doc = self.make_doc(text)
File "/usr/local/lib/python3.6/site-packages/spacy/lang/ja/__init__.py", line 117, in make_doc
return self.tokenizer(text)
File "/usr/local/lib/python3.6/site-packages/spacy/lang/ja/__init__.py", line 81, in __call__
doc = Doc(self.vocab, words=words, spaces=[False]*len(words))
File "doc.pyx", line 176, in spacy.tokens.doc.Doc.__init__
File "doc.pyx", line 559, in spacy.tokens.doc.Doc.push_back
ValueError: [E031] Invalid token: empty string ('') at position 0.
| 5,235 |
|||
explosion/spaCy | explosion__spaCy-3100 | cc95167b6de7fcfa557b61f3edddfe35349f44e0 | diff --git a/spacy/cli/package.py b/spacy/cli/package.py
--- a/spacy/cli/package.py
+++ b/spacy/cli/package.py
@@ -102,6 +102,7 @@ def generate_meta(model_path, existing_meta, msg):
"width": nlp.vocab.vectors_length,
"vectors": len(nlp.vocab.vectors),
"keys": nlp.vocab.vectors.n_keys,
+ "name": nlp.vocab.vectors.name,
}
msg.divider("Generating meta.json")
msg.text(
diff --git a/spacy/cli/train.py b/spacy/cli/train.py
--- a/spacy/cli/train.py
+++ b/spacy/cli/train.py
@@ -279,6 +279,7 @@ def train(
"width": nlp.vocab.vectors_length,
"vectors": len(nlp.vocab.vectors),
"keys": nlp.vocab.vectors.n_keys,
+ "name": nlp.vocab.vectors.name
}
meta.setdefault("name", "model%d" % i)
meta.setdefault("version", version)
| pretrained_vectors meta written inconsistently by CLI
## How to reproduce the behaviour
I would like to build a new model more or less from scratch, using word vectors trained elsewhere:
```
spacy init-model nb path/to/model-with-vectors --vectors-loc input-vectors.txt [...]
spacy train --vectors path/to/model-with-vectors [...]
```
This works well. But when I try to `spacy.load()` a trained model, it fails with:
`OSError: [E050] Can't find model 'nb_model.vectors'. It doesn't seem to be a shortcut link, a Python package or a valid path to a data directory. `
If I manually remove `"pretrained_vectors": "nb_model.vectors"` from `{ner,tagger,parser}/cfg`, the model appears to work as expected.
<details>
<summary>Full stack trace</summary>
```
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/__init__.py", line 21, in load
return util.load_model(name, **overrides)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/util.py", line 116, in load_model
return load_model_from_path(Path(name), **overrides)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/util.py", line 156, in load_model_from_path
return nlp.from_disk(model_path)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/language.py", line 647, in from_disk
util.from_disk(path, deserializers, exclude)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/util.py", line 511, in from_disk
reader(path / key)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/language.py", line 643, in <lambda>
deserializers[name] = lambda p, proc=proc: proc.from_disk(p, vocab=False)
File "pipeline.pyx", line 643, in spacy.pipeline.Tagger.from_disk
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/util.py", line 511, in from_disk
reader(path / key)
File "pipeline.pyx", line 625, in spacy.pipeline.Tagger.from_disk.load_model
File "pipeline.pyx", line 535, in spacy.pipeline.Tagger.Model
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/_ml.py", line 447, in build_tagger_model
pretrained_vectors=pretrained_vectors)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/_ml.py", line 278, in Tok2Vec
glove = StaticVectors(pretrained_vectors, width, column=cols.index(ID))
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/thinc/neural/_classes/static_vectors.py", line 41, in __init__
vectors = self.get_vectors()
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/thinc/neural/_classes/static_vectors.py", line 52, in get_vectors
return get_vectors(self.ops, self.lang)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/thinc/extra/load_nlp.py", line 19, in get_vectors
nlp = get_spacy(lang)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/thinc/extra/load_nlp.py", line 11, in get_spacy
SPACY_MODELS[lang] = spacy.load(lang, **kwargs)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/__init__.py", line 21, in load
return util.load_model(name, **overrides)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/util.py", line 119, in load_model
raise IOError(Errors.E050.format(name=name))
OSError: [E050] Can't find model 'nb_model.vectors'. It doesn't seem to be a shortcut link, a Python package or a valid path to a data directory.
```
</details>
## Your Environment
## Info about spaCy
* **spaCy version:** 2.1.0a4
* **Platform:** Darwin-18.2.0-x86_64-i386-64bit
* **Python version:** 3.6.1
* **Models:** de, fr
| * It appears to only be a problem with non-final models (including `model-best`)
* It can also be fixed by setting vectors.name to "nb_model.vectors" in the top-level meta.json (which is why it works in `model-final`).
@jarib Thanks for the report. I think we should be able to patch this by adjusting the meta in `spacy train`. Would you be able to make a pull request?
Looks like the same problem occurs in `spacy package`, see #3067:
> First reported [on the Prodigy forum](https://support.prodi.gy/t/issue-with-exported-spacy-models/999/3):
>
>> You were right, that was caused by wrong meta.json, which was generated after `spacy package ... --create-meta`, without vector names: `"vectors":{ "width":XYZ, "vectors":XYZ, "keys":XYZ}` . I compared that with meta.json automatically generated from `prodigy textcat.batch-train ... en_core_web_md ` where vector info is stored in following format: `"vectors":{... , "name":"en_core_web_md.vectors"} `
>>
>> I just copied automatically generated meta.json, changed model name to my unique and was able to load my custom model with vectors inside prodigy commands :)
>
> This caused the following error:
>
>> ```
>> OSError: [E050] Can’t find model 'en_core_web_lg.vectors'. It doesn’t seem to be a shortcut link, a Python package or a valid path to a data directory.
>> ```
I'll merge those issues, since this should ideally be part of the same fix! | 2018-12-27T17:09:31Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/__init__.py", line 21, in load
return util.load_model(name, **overrides)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/util.py", line 116, in load_model
return load_model_from_path(Path(name), **overrides)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/util.py", line 156, in load_model_from_path
return nlp.from_disk(model_path)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/language.py", line 647, in from_disk
util.from_disk(path, deserializers, exclude)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/util.py", line 511, in from_disk
reader(path / key)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/language.py", line 643, in <lambda>
deserializers[name] = lambda p, proc=proc: proc.from_disk(p, vocab=False)
File "pipeline.pyx", line 643, in spacy.pipeline.Tagger.from_disk
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/util.py", line 511, in from_disk
reader(path / key)
File "pipeline.pyx", line 625, in spacy.pipeline.Tagger.from_disk.load_model
File "pipeline.pyx", line 535, in spacy.pipeline.Tagger.Model
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/_ml.py", line 447, in build_tagger_model
pretrained_vectors=pretrained_vectors)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/_ml.py", line 278, in Tok2Vec
glove = StaticVectors(pretrained_vectors, width, column=cols.index(ID))
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/thinc/neural/_classes/static_vectors.py", line 41, in __init__
vectors = self.get_vectors()
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/thinc/neural/_classes/static_vectors.py", line 52, in get_vectors
return get_vectors(self.ops, self.lang)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/thinc/extra/load_nlp.py", line 19, in get_vectors
nlp = get_spacy(lang)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/thinc/extra/load_nlp.py", line 11, in get_spacy
SPACY_MODELS[lang] = spacy.load(lang, **kwargs)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/__init__.py", line 21, in load
return util.load_model(name, **overrides)
File "/home/ubuntu/src/spacy-nb/.venv/lib/python3.6/site-packages/spacy/util.py", line 119, in load_model
raise IOError(Errors.E050.format(name=name))
OSError: [E050] Can't find model 'nb_model.vectors'. It doesn't seem to be a shortcut link, a Python package or a valid path to a data directory.
| 5,238 |
|||
explosion/spaCy | explosion__spaCy-3583 | 531c0869b2535eacb47b07b649fe1710233ce001 | diff --git a/spacy/cli/convert.py b/spacy/cli/convert.py
--- a/spacy/cli/convert.py
+++ b/spacy/cli/convert.py
@@ -39,7 +39,7 @@
def convert(
input_file,
output_dir="-",
- file_type="jsonl",
+ file_type="json",
n_sents=1,
morphology=False,
converter="auto",
@@ -48,8 +48,8 @@ def convert(
"""
Convert files into JSON format for use with train command and other
experiment management functions. If no output_dir is specified, the data
- is written to stdout, so you can pipe them forward to a JSONL file:
- $ spacy convert some_file.conllu > some_file.jsonl
+ is written to stdout, so you can pipe them forward to a JSON file:
+ $ spacy convert some_file.conllu > some_file.json
"""
msg = Printer()
input_path = Path(input_file)
| Training a new model using cli throws error `KeyError`
I am trying to train a new spacy model based on the [Tweebank](https://github.com/Oneplus/Tweebank) annotated data.
For that I first tried using the training example given in the docs to familiarize myself with the procedure.
Both the example and training on the Tweebank throw the same error.
## How to reproduce the behaviour
Follow the example [here](https://spacy.io/usage/training#spacy-train-cli)
For the sake of completeness:
```
git clone https://github.com/UniversalDependencies/UD_Spanish-AnCora
mkdir ancora-json
python -m spacy convert UD_Spanish-AnCora/es_ancora-ud-train.conllu ancora-json
python -m spacy convert UD_Spanish-AnCora/es_ancora-ud-dev.conllu ancora-json
mkdir models
python -m spacy train es models ancora-json/es_ancora-ud-train.jsonl ancora-json/es_ancora-ud-dev.jsonl
```
## Your Environment
## Info about spaCy
* **spaCy version:** 2.1.3
* **Platform:** Linux-4.15.0-46-generic-x86_64-with-debian-buster-sid
* **Python version:** 3.6.7
* **Models:** en_core_web_md, en_core_web_sm
## The Error
```
>>> python -m spacy train es models es_ancora-ud-train.jsonl es_ancora-ud-dev.jsonl
Training pipeline: ['tagger', 'parser', 'ner']
Starting with blank model 'es'
Counting training words (limit=0)
Traceback (most recent call last):
File "/home/shikhar/.conda/envs/factmata/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/shikhar/.conda/envs/factmata/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/home/shikhar/.conda/envs/factmata/lib/python3.6/site-packages/spacy/__main__.py", line 35, in <module>
plac.call(commands[command], sys.argv[1:])
File "/home/shikhar/.conda/envs/factmata/lib/python3.6/site-packages/plac_core.py", line 328, in call
cmd, result = parser.consume(arglist)
File "/home/shikhar/.conda/envs/factmata/lib/python3.6/site-packages/plac_core.py", line 207, in consume
return cmd, self.func(*(args + varargs + extraopts), **kwargs)
File "/home/shikhar/.conda/envs/factmata/lib/python3.6/site-packages/spacy/cli/train.py", line 196, in train
corpus = GoldCorpus(train_path, dev_path, limit=n_examples)
File "gold.pyx", line 112, in spacy.gold.GoldCorpus.__init__
File "gold.pyx", line 125, in spacy.gold.GoldCorpus.write_msgpack
KeyError: 1
```
| Could you run the (experimental) `debug-data` command and see if it produces any hints? I suspect what might be happening is that the corpus includes labels or tags that aren't in the tag map and spaCy doesn't fail very gracefully here – or something similar to that.
`debug-data` currently isn't officially documented, but you can type `python -m spacy debug-data --help` for docs. Here's and example command:
```bash
python -m spacy debug-data en /path/to/train.json /path/to/dev.json --pipeline tagger,parser
```
You can also add the `--verbose` flag to make it show more details.
For the example in the docs:
```
>> python -m spacy debug-data es es_ancora-ud-train.jsonl es_ancora-ud-dev.jsonl --verbose
=========================== Data format validation ===========================
✔ Loaded es_ancora-ud-train.jsonl
✔ Loaded es_ancora-ud-dev.jsonl
✔ Training data JSON format is valid
✔ Development data JSON format is valid
✔ Corpus is loadable
=============================== Training stats ===============================
Training pipeline: tagger, parser, ner
Starting with blank model 'es'
14305 training docs
1654 evaluation docs
✔ No overlap between training and evaluation data
============================== Vocab & Vectors ==============================
ℹ 444617 total words in the data (37523 unique)
10 most common words: 'de' (26711), ',' (24417), 'la' (15223), '.' (14179),
'que' (13184), 'el' (11901), 'en' (10519), 'y' (9242), 'a' (7943), '"' (7385)
ℹ No word vectors present in the model
========================== Named Entity Recognition ==========================
ℹ 0 new labels, 0 existing labels
444617 missing values (tokens with '-' label)
✔ Good amount of examples for all labels
✔ Examples without occurences available for all labels
✔ No entities consisting of or starting/ending with whitespace
=========================== Part-of-speech Tagging ===========================
ℹ 17 labels in data (300 labels in tag map)
'NOUN' (81523), 'ADP' (71192), 'DET' (60656), 'PUNCT' (52915), 'VERB' (36311),
'PROPN' (34458), 'ADJ' (29439), 'PRON' (19947), 'ADV' (14496), 'AUX' (13779),
'CCONJ' (12225), 'SCONJ' (10129), 'NUM' (6929), 'SYM' (406), 'PART' (122),
'INTJ' (88), 'X' (2)
✘ Label 'VERB' not found in tag map for language 'es'
✘ Label 'PUNCT' not found in tag map for language 'es'
✘ Label 'NOUN' not found in tag map for language 'es'
✘ Label 'PROPN' not found in tag map for language 'es'
✘ Label 'ADV' not found in tag map for language 'es'
✘ Label 'ADJ' not found in tag map for language 'es'
✘ Label 'CCONJ' not found in tag map for language 'es'
✘ Label 'PRON' not found in tag map for language 'es'
✘ Label 'AUX' not found in tag map for language 'es'
✘ Label 'SCONJ' not found in tag map for language 'es'
✘ Label 'NUM' not found in tag map for language 'es'
✘ Label 'PART' not found in tag map for language 'es'
✘ Label 'SYM' not found in tag map for language 'es'
✘ Label 'INTJ' not found in tag map for language 'es'
✘ Label 'X' not found in tag map for language 'es'
============================= Dependency Parsing =============================
ℹ 139 labels in data
'case' (61501), 'det' (60246), 'punct' (51801), 'nmod' (31254), 'obj' (30273),
'nsubj' (23886), 'amod' (23811), 'obl' (19278), 'advmod' (16127), 'mark'
(15599), 'ROOT' (14305), 'conj' (12990), 'cc' (12190), 'flat' (11626), 'acl'
(8528), 'aux' (7857), 'advcl' (6951), 'fixed' (6417), 'appos' (6093), 'cop'
(4717), 'ccomp' (4671), 'nummod' (4519), 'xcomp' (2064), 'compound' (2010),
'iobj' (1437), 'csubj' (998), 'punct||conj' (993), 'parataxis' (534),
'expl:pass' (411), 'dep' (242), 'nsubj||conj' (120), 'mark||conj' (88),
'advmod||conj' (88), 'flat||det' (78), 'obj||conj' (62), 'obj||cc' (55),
'obl||conj' (51), 'punct||det' (44), 'cc||cc' (40), 'appos||det' (39),
'advcl||conj' (33), 'nsubj:pass' (30), 'aux||conj' (29), 'nmod||cc' (29),
'nsubj||cc' (29), 'nmod||det' (24), 'det||conj' (22), 'orphan' (22), 'obl||cc'
(20), 'ccomp||cc' (19), 'advcl||cc' (18), 'cc||conj' (17), 'mark||ccomp' (16),
'mark||advcl' (15), 'case||cc' (15), 'acl||det' (12), 'punct||xcomp' (10),
'xcomp||cc' (10), 'punct||ccomp' (10), 'punct||case' (9), 'acl||cc' (9),
'advmod||ccomp' (9), 'mark||acl' (9), 'punct||advcl' (8), 'obl||xcomp' (8),
'punct||acl' (8), 'amod||cc' (7), 'nsubj||advcl' (7), 'fixed||case' (7),
'obl||advcl' (7), 'obl||ccomp' (7), 'iobj||conj' (6), 'mark||xcomp' (6),
'advmod||cc' (6), 'nsubj||acl' (6), 'case||cop' (6), 'appos||cc' (6),
'compound||det' (6), 'csubj||cc' (5), 'nsubj||ccomp' (5), 'mark||cc' (5),
'mark||csubj' (4), 'cop||cc' (4), 'advcl||parataxis' (3), 'advmod||xcomp' (3),
'appos||obj' (3), 'det||cc' (3), 'case||amod' (3), 'advmod||acl' (3), 'obl||acl'
(3), 'advmod||advcl' (2), 'nsubj||xcomp' (2), 'flat||obj' (2), 'nummod||det'
(2), 'appos||conj' (2), 'obj||ccomp' (2), 'advcl||ccomp' (2), 'case||nmod' (2),
'obl||parataxis' (2), 'case||flat' (2), 'det||advcl' (2), 'advmod||parataxis'
(2), 'cop||conj' (2), 'det||obj' (1), 'nmod||obj' (1), 'acl||obj' (1),
'ccomp||conj' (1), 'obl||csubj' (1), 'advmod||csubj' (1), 'punct||csubj' (1),
'cop||ccomp' (1), 'case||appos' (1), 'fixed||mark' (1), 'aux||xcomp' (1),
'obj||xcomp' (1), 'aux||cc' (1), 'csubj:pass' (1), 'punct||flat' (1), 'obj||acl'
(1), 'flat||appos' (1), 'punct||parataxis' (1), 'amod||det' (1), 'punct||aux'
(1), 'appos||advmod' (1), 'advcl||advcl' (1), 'csubj||conj' (1), 'mark||cop'
(1), 'compound||aux' (1), 'det||ccomp' (1), 'nmod||conj' (1), 'advcl||acl' (1),
'amod||nummod' (1), 'amod||conj' (1), 'case||conj' (1), 'case||det' (1),
'obj||nmod' (1), 'nsubj||csubj' (1), 'obj||csubj' (1), 'obj||parataxis' (1)
================================== Summary ==================================
✔ 9 checks passed
✘ 15 errors
```
For the dataset that I am trying to work on:
```
>> python -m spacy debug-data en en-ud-tweet-train.jsonl en-ud-tweet-dev.jsonl
=========================== Data format validation ===========================
✔ Loaded en-ud-tweet-train.jsonl
✔ Loaded en-ud-tweet-dev.jsonl
✔ Training data JSON format is valid
✔ Development data JSON format is valid
✔ Corpus is loadable
=============================== Training stats ===============================
Training pipeline: tagger, parser, ner
Starting with blank model 'en'
1639 training docs
710 evaluation docs
✔ No overlap between training and evaluation data
⚠ Low number of examples to train from a blank model (1639)
============================== Vocab & Vectors ==============================
ℹ 24753 total words in the data (8564 unique)
ℹ No word vectors present in the model
========================== Named Entity Recognition ==========================
ℹ 0 new labels, 0 existing labels
24753 missing values (tokens with '-' label)
✔ Good amount of examples for all labels
✔ Examples without occurences available for all labels
✔ No entities consisting of or starting/ending with whitespace
=========================== Part-of-speech Tagging ===========================
ℹ 42 labels in data (57 labels in tag map)
✘ Label 'V_V' not found in tag map for language 'en'
✘ Label 'N_N' not found in tag map for language 'en'
✘ Label 'P_P' not found in tag map for language 'en'
✘ Label 'D_D' not found in tag map for language 'en'
✘ Label 'R_R' not found in tag map for language 'en'
✘ Label 'A_A' not found in tag map for language 'en'
✘ Label ',_,' not found in tag map for language 'en'
✘ Label 'X' not found in tag map for language 'en'
✘ Label 'PUNCT' not found in tag map for language 'en'
✘ Label 'NOUN' not found in tag map for language 'en'
✘ Label 'PRON' not found in tag map for language 'en'
✘ Label 'VERB' not found in tag map for language 'en'
✘ Label 'PART' not found in tag map for language 'en'
✘ Label 'ADP' not found in tag map for language 'en'
✘ Label 'CCONJ' not found in tag map for language 'en'
✘ Label 'ADJ' not found in tag map for language 'en'
✘ Label '~_~' not found in tag map for language 'en'
✘ Label '@_@' not found in tag map for language 'en'
✘ Label 'O_O' not found in tag map for language 'en'
✘ Label 'L_L' not found in tag map for language 'en'
✘ Label '&_&' not found in tag map for language 'en'
✘ Label '#_#' not found in tag map for language 'en'
✘ Label 'U_U' not found in tag map for language 'en'
✘ Label 'E_E' not found in tag map for language 'en'
✘ Label '!_!' not found in tag map for language 'en'
✘ Label 'PROPN' not found in tag map for language 'en'
✘ Label 'NUM' not found in tag map for language 'en'
✘ Label 'ADV' not found in tag map for language 'en'
✘ Label 'DET' not found in tag map for language 'en'
✘ Label 'AUX' not found in tag map for language 'en'
✘ Label 'INTJ' not found in tag map for language 'en'
✘ Label 'SCONJ' not found in tag map for language 'en'
✘ Label '^_^' not found in tag map for language 'en'
✘ Label '$_$' not found in tag map for language 'en'
✘ Label 'G_G' not found in tag map for language 'en'
✘ Label 'T_T' not found in tag map for language 'en'
✘ Label 'X_X' not found in tag map for language 'en'
✘ Label 'Z_Z' not found in tag map for language 'en'
✘ Label 'S_S' not found in tag map for language 'en'
✘ Label 'Y_Y' not found in tag map for language 'en'
✘ Label 'M_M' not found in tag map for language 'en'
============================= Dependency Parsing =============================
ℹ 57 labels in data
================================== Summary ==================================
✔ 9 checks passed
⚠ 1 warning
✘ 41 errors
(factmata) ➜ Tweebank git:(dev) ✗ python -m spacy debug-data en en-ud-tweet-train.jsonl en-ud-tweet-dev.jsonl --verbose
=========================== Data format validation ===========================
✔ Loaded en-ud-tweet-train.jsonl
✔ Loaded en-ud-tweet-dev.jsonl
✔ Training data JSON format is valid
✔ Development data JSON format is valid
✔ Corpus is loadable
=============================== Training stats ===============================
Training pipeline: tagger, parser, ner
Starting with blank model 'en'
1639 training docs
710 evaluation docs
✔ No overlap between training and evaluation data
⚠ Low number of examples to train from a blank model (1639)
It's recommended to use at least 2000 examples (minimum 100)
============================== Vocab & Vectors ==============================
ℹ 24753 total words in the data (8564 unique)
10 most common words: ':' (773), 'RT' (638), '.' (593), 'I' (398), 'the' (394),
'to' (367), ',' (345), 'a' (276), '!' (271), 'you' (269)
ℹ No word vectors present in the model
========================== Named Entity Recognition ==========================
ℹ 0 new labels, 0 existing labels
24753 missing values (tokens with '-' label)
✔ Good amount of examples for all labels
✔ Examples without occurences available for all labels
✔ No entities consisting of or starting/ending with whitespace
=========================== Part-of-speech Tagging ===========================
ℹ 42 labels in data (57 labels in tag map)
'NOUN' (2251), 'PUNCT' (2116), 'X' (1998), 'VERB' (1699), 'PRON' (1452), 'PROPN'
(1356), 'V_V' (1288), 'N_N' (1110), 'ADP' (1061), ',_,' (944), 'ADJ' (839),
'AUX' (789), 'DET' (728), 'ADV' (686), 'P_P' (671), 'O_O' (600), 'D_D' (484),
'^_^' (463), 'A_A' (439), '@_@' (412), 'R_R' (383), 'PART' (368), 'NUM' (285),
'~_~' (284), 'CCONJ' (272), 'SYM' (265), 'L_L' (239), '!_!' (225), 'INTJ' (160),
'SCONJ' (150), '$_$' (131), '&_&' (120), 'U_U' (118), 'E_E' (90), '#_#' (85),
'G_G' (69), 'T_T' (54), 'Z_Z' (43), 'X_X' (15), 'S_S' (8), 'M_M' (2), 'Y_Y' (1)
✘ Label 'V_V' not found in tag map for language 'en'
✘ Label 'R_R' not found in tag map for language 'en'
✘ Label 'A_A' not found in tag map for language 'en'
✘ Label 'P_P' not found in tag map for language 'en'
✘ Label 'D_D' not found in tag map for language 'en'
✘ Label ',_,' not found in tag map for language 'en'
✘ Label '#_#' not found in tag map for language 'en'
✘ Label 'ADJ' not found in tag map for language 'en'
✘ Label 'NOUN' not found in tag map for language 'en'
✘ Label 'PUNCT' not found in tag map for language 'en'
✘ Label 'X' not found in tag map for language 'en'
✘ Label 'PROPN' not found in tag map for language 'en'
✘ Label 'NUM' not found in tag map for language 'en'
✘ Label '@_@' not found in tag map for language 'en'
✘ Label '!_!' not found in tag map for language 'en'
✘ Label 'N_N' not found in tag map for language 'en'
✘ Label 'O_O' not found in tag map for language 'en'
✘ Label 'DET' not found in tag map for language 'en'
✘ Label 'PRON' not found in tag map for language 'en'
✘ Label 'AUX' not found in tag map for language 'en'
✘ Label 'CCONJ' not found in tag map for language 'en'
✘ Label 'ADP' not found in tag map for language 'en'
✘ Label '~_~' not found in tag map for language 'en'
✘ Label 'L_L' not found in tag map for language 'en'
✘ Label 'INTJ' not found in tag map for language 'en'
✘ Label 'VERB' not found in tag map for language 'en'
✘ Label 'SCONJ' not found in tag map for language 'en'
✘ Label '&_&' not found in tag map for language 'en'
✘ Label 'ADV' not found in tag map for language 'en'
✘ Label '$_$' not found in tag map for language 'en'
✘ Label 'G_G' not found in tag map for language 'en'
✘ Label 'E_E' not found in tag map for language 'en'
✘ Label '^_^' not found in tag map for language 'en'
✘ Label 'Z_Z' not found in tag map for language 'en'
✘ Label 'PART' not found in tag map for language 'en'
✘ Label 'U_U' not found in tag map for language 'en'
✘ Label 'T_T' not found in tag map for language 'en'
✘ Label 'X_X' not found in tag map for language 'en'
✘ Label 'M_M' not found in tag map for language 'en'
✘ Label 'S_S' not found in tag map for language 'en'
✘ Label 'Y_Y' not found in tag map for language 'en'
============================= Dependency Parsing =============================
ℹ 57 labels in data
'punct' (3250), 'ROOT' (2470), 'discourse' (2139), 'nsubj' (1940), 'case'
(1484), 'obj' (1188), 'advmod' (1132), 'det' (1046), 'obl' (855), 'amod' (796),
'compound' (785), 'list' (694), 'vocative' (615), 'cop' (582), 'mark' (577),
'aux' (557), 'parataxis' (522), 'nmod' (500), 'conj' (467), 'nmod:poss' (432),
'cc' (411), 'xcomp' (405), 'nummod' (298), 'advcl' (273), 'ccomp' (232), 'flat'
(190), 'compound:prt' (123), 'acl:relcl' (116), 'appos' (109), 'acl' (105),
'obl:tmod' (80), 'aux:pass' (48), 'nsubj:pass' (46), 'obl:npmod' (40), 'iobj'
(36), 'goeswith' (28), 'nmod:tmod' (27), 'det:predet' (25), 'csubj' (24),
'nmod:npmod' (23), 'expl' (23), 'flat:foreign' (18), 'fixed' (16), 'reparandum'
(8), 'appos||obj' (4), 'case||obl' (2), 'acl||obj' (2), 'cc:preconj' (1),
'case||nsubj:pass' (1), 'obl||amod' (1), 'conj||obl' (1), 'obj||xcomp' (1),
'dislocated||xcomp' (1), 'orphan' (1), 'dislocated' (1), 'cc||conj' (1),
'conj||aux' (1)
================================== Summary ==================================
✔ 9 checks passed
⚠ 1 warning
✘ 41 errors
```
How come the tags `VERB`, `PRON`, `PROPN` are labelled as not present in tag map for en?
I am getting the exact same error message. Working in Python 3.7.1. Mac OSX 10.14.4. Spacy 2.1.3
I think I have the same error trying to evaluate model (with spacy 2.1.3 and 2.1.1 also):
> python -m spacy convert /home/julm/work/data/ud/UD_English-EWT-master/en_ewt-ud-test.conllu ud_en_ewt -n 10 -l en
> python -m spacy evaluate en /home/julm/work/data/ud/ud_en_ewt/en_ewt-ud-test.jsonl
Traceback (most recent call last):
File "/home/julm/anaconda3/envs/python3_spacy/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/julm/anaconda3/envs/python3_spacy/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/home/julm/anaconda3/envs/python3_spacy/lib/python3.6/site-packages/spacy/__main__.py", line 35, in <module>
plac.call(commands[command], sys.argv[1:])
File "/home/julm/anaconda3/envs/python3_spacy/lib/python3.6/site-packages/plac_core.py", line 328, in call
cmd, result = parser.consume(arglist)
File "/home/julm/anaconda3/envs/python3_spacy/lib/python3.6/site-packages/plac_core.py", line 207, in consume
return cmd, self.func(*(args + varargs + extraopts), **kwargs)
File "/home/julm/anaconda3/envs/python3_spacy/lib/python3.6/site-packages/spacy/cli/evaluate.py", line 44, in evaluate
corpus = GoldCorpus(data_path, data_path)
File "gold.pyx", line 112, in spacy.gold.GoldCorpus.__init__
File "gold.pyx", line 125, in spacy.gold.GoldCorpus.write_msgpack
KeyError: 1
By the way, evaluate works with the UD Ukrainian json I've converted with spacy 2.0 (but with a code change in convert)
@ines are you able to reproduce this error?? Seems like we are all getting this error from running the code in the official documentation https://spacy.io/usage/training#spacy-train-cli
The repository UniversalDependencies/UD_Spanish-AnCora has not been updated during the last 5 months so maybe the error is related to a spacy version upgrade?!
> How come the tags VERB, PRON, PROPN are labelled as not present in tag map for en?
The tag map maps *fine-grained tags* from the treebank to *coarse-grained* universal tags – for example, [in English](https://github.com/explosion/spaCy/blob/master/spacy/lang/en/tag_map.py), it'd map something like `VBZ` to `VERB`. In your treebank, the universal tags plus a bunch of custom ones are used. So essentially, you need to provide spaCy with a custom tag map that matches the tag set in the data.
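The fine-grained → coarse-grained mapping described above can be illustrated with a small, hypothetical tag map (the tag names below mirror the example in the comment; this is not spaCy's actual `tag_map.py`), including the friendlier error message suggested in the first todo:

```python
# Hypothetical excerpt of a tag map: fine-grained treebank tags (keys)
# are mapped to coarse-grained Universal POS tags (values).
TAG_MAP = {
    "VBZ": {"pos": "VERB"},   # verb, 3rd person singular present
    "NNP": {"pos": "PROPN"},  # proper noun, singular
    "PRP": {"pos": "PRON"},   # personal pronoun
}

def coarse_tag(fine_tag):
    """Look up a fine-grained tag, failing with a readable message
    instead of a bare KeyError when the tag is missing."""
    try:
        return TAG_MAP[fine_tag]["pos"]
    except KeyError:
        raise ValueError(
            "Tag %r not found in tag map; add it to the tag map "
            "or provide a custom one matching your treebank." % fine_tag
        )
```

A treebank that uses universal tags directly (plus custom ones like `O_O`) would need every one of those tags added as keys for the lookup to succeed.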
There are a few todos on this topic for us:
* Make spaCy fail more gracefully here and not with a KeyError. It should raise a custom one and say hey, I came across a tag that's not in the tag map, see here for the tag map, here's what to do etc.
* Make it easy to provide your own JSON tag map in `spacy train`
I get the same error with UD_Norwegian-Bokmaal using spacy 2.1.3.
I have json-files generated with "spacy convert" from spacy 2.0.18 and I'm able to use these to train without error.
I think the issue is that `spacy convert` changed behaviour. We should change it back for the next release.
`spacy convert` is currently defaulting to jsonl-formatted output, which is causing problems. If you set `-t json` it should work I think.
@honnibal : Thanks for looking into this so quickly! Problem solved!
@honnibal Thank you for pointing it out. I would like to help out with this issue. Any references that can help me get started?
@xssChauhan I think it could be as straightforward as changing the default value from `"jsonl"` to `"json"` here:
https://github.com/explosion/spaCy/blob/4d198a7e92f813fb9df2ade72fbeaf847284a7a0/spacy/cli/convert.py#L42
If you'd like to test this and submit a PR, that'd be great! | 2019-04-12T06:16:14Z | [] | [] |
Traceback (most recent call last):
File "/home/shikhar/.conda/envs/factmata/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/shikhar/.conda/envs/factmata/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/home/shikhar/.conda/envs/factmata/lib/python3.6/site-packages/spacy/__main__.py", line 35, in <module>
plac.call(commands[command], sys.argv[1:])
File "/home/shikhar/.conda/envs/factmata/lib/python3.6/site-packages/plac_core.py", line 328, in call
cmd, result = parser.consume(arglist)
File "/home/shikhar/.conda/envs/factmata/lib/python3.6/site-packages/plac_core.py", line 207, in consume
return cmd, self.func(*(args + varargs + extraopts), **kwargs)
File "/home/shikhar/.conda/envs/factmata/lib/python3.6/site-packages/spacy/cli/train.py", line 196, in train
corpus = GoldCorpus(train_path, dev_path, limit=n_examples)
File "gold.pyx", line 112, in spacy.gold.GoldCorpus.__init__
File "gold.pyx", line 125, in spacy.gold.GoldCorpus.write_msgpack
KeyError: 1
| 5,253 |
|||
gitpython-developers/GitPython | gitpython-developers__GitPython-1340 | 146202cdcbed8239651ccc62d36a8e5af3ceff8c | diff --git a/git/cmd.py b/git/cmd.py
--- a/git/cmd.py
+++ b/git/cmd.py
@@ -79,7 +79,7 @@ def handle_process_output(process: 'Git.AutoInterrupt' | Popen,
finalizer: Union[None,
Callable[[Union[subprocess.Popen, 'Git.AutoInterrupt']], None]] = None,
decode_streams: bool = True,
- timeout: float = 10.0) -> None:
+ kill_after_timeout: Union[None, float] = None) -> None:
"""Registers for notifications to learn that process output is ready to read, and dispatches lines to
the respective line handlers.
This function returns once the finalizer returns
@@ -94,7 +94,10 @@ def handle_process_output(process: 'Git.AutoInterrupt' | Popen,
their contents to handlers.
Set it to False if `universal_newline == True` (then streams are in text-mode)
or if decoding must happen later (i.e. for Diffs).
- :param timeout: float, timeout to pass to t.join() in case it hangs. Default = 10.0 seconds
+ :param kill_after_timeout:
+ float or None, Default = None
+ To specify a timeout in seconds for the git command, after which the process
+ should be killed.
"""
# Use 2 "pump" threads and wait for both to finish.
def pump_stream(cmdline: List[str], name: str, stream: Union[BinaryIO, TextIO], is_decode: bool,
@@ -108,9 +111,12 @@ def pump_stream(cmdline: List[str], name: str, stream: Union[BinaryIO, TextIO],
handler(line_str)
else:
handler(line)
+
except Exception as ex:
log.error(f"Pumping {name!r} of cmd({remove_password_if_present(cmdline)}) failed due to: {ex!r}")
- raise CommandError([f'<{name}-pump>'] + remove_password_if_present(cmdline), ex) from ex
+ if "I/O operation on closed file" not in str(ex):
+ # Only reraise if the error was not due to the stream closing
+ raise CommandError([f'<{name}-pump>'] + remove_password_if_present(cmdline), ex) from ex
finally:
stream.close()
@@ -146,9 +152,24 @@ def pump_stream(cmdline: List[str], name: str, stream: Union[BinaryIO, TextIO],
## FIXME: Why Join?? Will block if `stdin` needs feeding...
#
for t in threads:
- t.join(timeout=timeout)
+ t.join(timeout=kill_after_timeout)
if t.is_alive():
- raise RuntimeError(f"Thread join() timed out in cmd.handle_process_output(). Timeout={timeout} seconds")
+ if isinstance(process, Git.AutoInterrupt):
+ process._terminate()
+ else: # Don't want to deal with the other case
+ raise RuntimeError("Thread join() timed out in cmd.handle_process_output()."
+ f" kill_after_timeout={kill_after_timeout} seconds")
+ if stderr_handler:
+ error_str: Union[str, bytes] = (
+ "error: process killed because it timed out."
+ f" kill_after_timeout={kill_after_timeout} seconds")
+ if not decode_streams and isinstance(p_stderr, BinaryIO):
+ # Assume stderr_handler needs binary input
+ error_str = cast(str, error_str)
+ error_str = error_str.encode()
+ # We ignore typing on the next line because mypy does not like
+ # the way we inferred that stderr takes str or bytes
+ stderr_handler(error_str) # type: ignore
if finalizer:
return finalizer(process)
@@ -386,13 +407,19 @@ class AutoInterrupt(object):
The wait method was overridden to perform automatic status code checking
and possibly raise."""
- __slots__ = ("proc", "args")
+ __slots__ = ("proc", "args", "status")
+
+ # If this is non-zero it will override any status code during
+ # _terminate, used to prevent race conditions in testing
+ _status_code_if_terminate: int = 0
def __init__(self, proc: Union[None, subprocess.Popen], args: Any) -> None:
self.proc = proc
self.args = args
+ self.status: Union[int, None] = None
- def __del__(self) -> None:
+ def _terminate(self) -> None:
+ """Terminate the underlying process"""
if self.proc is None:
return
@@ -404,10 +431,10 @@ def __del__(self) -> None:
proc.stdout.close()
if proc.stderr:
proc.stderr.close()
-
# did the process finish already so we have a return code ?
try:
if proc.poll() is not None:
+ self.status = self._status_code_if_terminate or proc.poll()
return None
except OSError as ex:
log.info("Ignored error after process had died: %r", ex)
@@ -419,7 +446,9 @@ def __del__(self) -> None:
# try to kill it
try:
proc.terminate()
- proc.wait() # ensure process goes away
+ status = proc.wait() # ensure process goes away
+
+ self.status = self._status_code_if_terminate or status
except OSError as ex:
log.info("Ignored error after process had died: %r", ex)
except AttributeError:
@@ -431,6 +460,9 @@ def __del__(self) -> None:
call(("TASKKILL /F /T /PID %s 2>nul 1>nul" % str(proc.pid)), shell=True)
# END exception handling
+ def __del__(self) -> None:
+ self._terminate()
+
def __getattr__(self, attr: str) -> Any:
return getattr(self.proc, attr)
@@ -444,24 +476,29 @@ def wait(self, stderr: Union[None, str, bytes] = b'') -> int:
if stderr is None:
stderr_b = b''
stderr_b = force_bytes(data=stderr, encoding='utf-8')
-
+ status: Union[int, None]
if self.proc is not None:
status = self.proc.wait()
+ p_stderr = self.proc.stderr
+ else: # Assume the underlying proc was killed earlier or never existed
+ status = self.status
+ p_stderr = None
- def read_all_from_possibly_closed_stream(stream: Union[IO[bytes], None]) -> bytes:
- if stream:
- try:
- return stderr_b + force_bytes(stream.read())
- except ValueError:
- return stderr_b or b''
- else:
+ def read_all_from_possibly_closed_stream(stream: Union[IO[bytes], None]) -> bytes:
+ if stream:
+ try:
+ return stderr_b + force_bytes(stream.read())
+ except ValueError:
return stderr_b or b''
+ else:
+ return stderr_b or b''
- if status != 0:
- errstr = read_all_from_possibly_closed_stream(self.proc.stderr)
- log.debug('AutoInterrupt wait stderr: %r' % (errstr,))
- raise GitCommandError(remove_password_if_present(self.args), status, errstr)
# END status handling
+
+ if status != 0:
+ errstr = read_all_from_possibly_closed_stream(p_stderr)
+ log.debug('AutoInterrupt wait stderr: %r' % (errstr,))
+ raise GitCommandError(remove_password_if_present(self.args), status, errstr)
return status
# END auto interrupt
@@ -694,7 +731,7 @@ def execute(self,
as_process: bool = False,
output_stream: Union[None, BinaryIO] = None,
stdout_as_string: bool = True,
- kill_after_timeout: Union[None, int] = None,
+ kill_after_timeout: Union[None, float] = None,
with_stdout: bool = True,
universal_newlines: bool = False,
shell: Union[None, bool] = None,
@@ -817,7 +854,7 @@ def execute(self,
if is_win:
cmd_not_found_exception = OSError
- if kill_after_timeout:
+ if kill_after_timeout is not None:
raise GitCommandError(redacted_command, '"kill_after_timeout" feature is not supported on Windows.')
else:
cmd_not_found_exception = FileNotFoundError # NOQA # exists, flake8 unknown @UndefinedVariable
@@ -884,7 +921,7 @@ def _kill_process(pid: int) -> None:
return
# end
- if kill_after_timeout:
+ if kill_after_timeout is not None:
kill_check = threading.Event()
watchdog = threading.Timer(kill_after_timeout, _kill_process, args=(proc.pid,))
@@ -895,10 +932,10 @@ def _kill_process(pid: int) -> None:
newline = "\n" if universal_newlines else b"\n"
try:
if output_stream is None:
- if kill_after_timeout:
+ if kill_after_timeout is not None:
watchdog.start()
stdout_value, stderr_value = proc.communicate()
- if kill_after_timeout:
+ if kill_after_timeout is not None:
watchdog.cancel()
if kill_check.is_set():
stderr_value = ('Timeout: the command "%s" did not complete in %d '
diff --git a/git/remote.py b/git/remote.py
--- a/git/remote.py
+++ b/git/remote.py
@@ -707,7 +707,8 @@ def update(self, **kwargs: Any) -> 'Remote':
return self
def _get_fetch_info_from_stderr(self, proc: 'Git.AutoInterrupt',
- progress: Union[Callable[..., Any], RemoteProgress, None]
+ progress: Union[Callable[..., Any], RemoteProgress, None],
+ kill_after_timeout: Union[None, float] = None,
) -> IterableList['FetchInfo']:
progress = to_progress_instance(progress)
@@ -724,7 +725,8 @@ def _get_fetch_info_from_stderr(self, proc: 'Git.AutoInterrupt',
cmds = set(FetchInfo._flag_map.keys())
progress_handler = progress.new_message_handler()
- handle_process_output(proc, None, progress_handler, finalizer=None, decode_streams=False)
+ handle_process_output(proc, None, progress_handler, finalizer=None, decode_streams=False,
+ kill_after_timeout=kill_after_timeout)
stderr_text = progress.error_lines and '\n'.join(progress.error_lines) or ''
proc.wait(stderr=stderr_text)
@@ -769,7 +771,8 @@ def _get_fetch_info_from_stderr(self, proc: 'Git.AutoInterrupt',
return output
def _get_push_info(self, proc: 'Git.AutoInterrupt',
- progress: Union[Callable[..., Any], RemoteProgress, None]) -> IterableList[PushInfo]:
+ progress: Union[Callable[..., Any], RemoteProgress, None],
+ kill_after_timeout: Union[None, float] = None) -> IterableList[PushInfo]:
progress = to_progress_instance(progress)
# read progress information from stderr
@@ -786,11 +789,14 @@ def stdout_handler(line: str) -> None:
# If an error happens, additional info is given which we parse below.
pass
- handle_process_output(proc, stdout_handler, progress_handler, finalizer=None, decode_streams=False)
+ handle_process_output(proc, stdout_handler, progress_handler, finalizer=None, decode_streams=False,
+ kill_after_timeout=kill_after_timeout)
stderr_text = progress.error_lines and '\n'.join(progress.error_lines) or ''
try:
proc.wait(stderr=stderr_text)
except Exception:
+ # This is different than fetch (which fails if there is any std_err
+ # even if there is an output)
if not output:
raise
elif stderr_text:
@@ -813,7 +819,9 @@ def _assert_refspec(self) -> None:
def fetch(self, refspec: Union[str, List[str], None] = None,
progress: Union[RemoteProgress, None, 'UpdateProgress'] = None,
- verbose: bool = True, **kwargs: Any) -> IterableList[FetchInfo]:
+ verbose: bool = True,
+ kill_after_timeout: Union[None, float] = None,
+ **kwargs: Any) -> IterableList[FetchInfo]:
"""Fetch the latest changes for this remote
:param refspec:
@@ -833,6 +841,9 @@ def fetch(self, refspec: Union[str, List[str], None] = None,
for 'refspec' will make use of this facility.
:param progress: See 'push' method
:param verbose: Boolean for verbose output
+ :param kill_after_timeout:
+ To specify a timeout in seconds for the git command, after which the process
+ should be killed. It is set to None by default.
:param kwargs: Additional arguments to be passed to git-fetch
:return:
IterableList(FetchInfo, ...) list of FetchInfo instances providing detailed
@@ -853,19 +864,22 @@ def fetch(self, refspec: Union[str, List[str], None] = None,
proc = self.repo.git.fetch(self, *args, as_process=True, with_stdout=False,
universal_newlines=True, v=verbose, **kwargs)
- res = self._get_fetch_info_from_stderr(proc, progress)
+ res = self._get_fetch_info_from_stderr(proc, progress,
+ kill_after_timeout=kill_after_timeout)
if hasattr(self.repo.odb, 'update_cache'):
self.repo.odb.update_cache()
return res
def pull(self, refspec: Union[str, List[str], None] = None,
progress: Union[RemoteProgress, 'UpdateProgress', None] = None,
+ kill_after_timeout: Union[None, float] = None,
**kwargs: Any) -> IterableList[FetchInfo]:
"""Pull changes from the given branch, being the same as a fetch followed
by a merge of branch with your local branch.
:param refspec: see 'fetch' method
:param progress: see 'push' method
+ :param kill_after_timeout: see 'fetch' method
:param kwargs: Additional arguments to be passed to git-pull
:return: Please see 'fetch' method """
if refspec is None:
@@ -874,13 +888,15 @@ def pull(self, refspec: Union[str, List[str], None] = None,
kwargs = add_progress(kwargs, self.repo.git, progress)
proc = self.repo.git.pull(self, refspec, with_stdout=False, as_process=True,
universal_newlines=True, v=True, **kwargs)
- res = self._get_fetch_info_from_stderr(proc, progress)
+ res = self._get_fetch_info_from_stderr(proc, progress,
+ kill_after_timeout=kill_after_timeout)
if hasattr(self.repo.odb, 'update_cache'):
self.repo.odb.update_cache()
return res
def push(self, refspec: Union[str, List[str], None] = None,
progress: Union[RemoteProgress, 'UpdateProgress', Callable[..., RemoteProgress], None] = None,
+ kill_after_timeout: Union[None, float] = None,
**kwargs: Any) -> IterableList[PushInfo]:
"""Push changes from source branch in refspec to target branch in refspec.
@@ -897,6 +913,9 @@ def push(self, refspec: Union[str, List[str], None] = None,
overrides the ``update()`` function.
:note: No further progress information is returned after push returns.
+ :param kill_after_timeout:
+ To specify a timeout in seconds for the git command, after which the process
+ should be killed. It is set to None by default.
:param kwargs: Additional arguments to be passed to git-push
:return:
list(PushInfo, ...) list of PushInfo instances, each
@@ -908,8 +927,11 @@ def push(self, refspec: Union[str, List[str], None] = None,
be 0."""
kwargs = add_progress(kwargs, self.repo.git, progress)
proc = self.repo.git.push(self, refspec, porcelain=True, as_process=True,
- universal_newlines=True, **kwargs)
- return self._get_push_info(proc, progress)
+ universal_newlines=True,
+ kill_after_timeout=kill_after_timeout,
+ **kwargs)
+ return self._get_push_info(proc, progress,
+ kill_after_timeout=kill_after_timeout)
@ property
def config_reader(self) -> SectionConstraint[GitConfigParser]:
| default thread timeout too short for CI
Since this morning [our CI](https://github.com/openpathsampling/openpathsampling/runs/3563720388?) started failing with:
```
Run python devtools/autorelease_check.py
Traceback (most recent call last):
File "devtools/autorelease_check.py", line 24, in <module>
repo_path='.'
File "/usr/share/miniconda/envs/test/lib/python3.7/site-packages/autorelease/check_runners.py", line 52, in __init__
repo_path=repo_path
File "/usr/share/miniconda/envs/test/lib/python3.7/site-packages/autorelease/git_repo_checks.py", line 28, in __init__
self.repo.remotes.origin.fetch()
File "/usr/share/miniconda/envs/test/lib/python3.7/site-packages/git/remote.py", line 856, in fetch
res = self._get_fetch_info_from_stderr(proc, progress)
File "/usr/share/miniconda/envs/test/lib/python3.7/site-packages/git/remote.py", line 727, in _get_fetch_info_from_stderr
handle_process_output(proc, None, progress_handler, finalizer=None, decode_streams=False)
File "/usr/share/miniconda/envs/test/lib/python3.7/site-packages/git/cmd.py", line 151, in handle_process_output
raise RuntimeError(f"Thread join() timed out in cmd.handle_process_output(). Timeout={timeout} seconds")
RuntimeError: Thread join() timed out in cmd.handle_process_output(). Timeout=10.0 seconds
Fatal Python error: could not acquire lock for <_io.BufferedReader name=4> at interpreter shutdown, possibly due to daemon threads
Thread 0x00007fbb2f2d7700 (most recent call first):
File "/usr/share/miniconda/envs/test/lib/python3.7/site-packages/git/cmd.py", line 103 in pump_stream
File "/usr/share/miniconda/envs/test/lib/python3.7/threading.py", line 870 in run
File "/usr/share/miniconda/envs/test/lib/python3.7/threading.py", line 926 in _bootstrap_inner
File "/usr/share/miniconda/envs/test/lib/python3.7/threading.py", line 890 in _bootstrap
Current thread 0x00007fbb3332d180 (most recent call first):
File "/usr/share/miniconda/envs/test/lib/python3.7/site-packages/git/cmd.py", line 406 in __del__
/home/runner/work/_temp/ff824c06-ff56-4ee4-9e2a-e65258688cf4.sh: line 1: 1922 Aborted (core dumped) python devtools/autorelease_check.py
Error: Process completed with exit code 134.
```
This seems to be a side effect of the timeout introduced by #1318
Could this time-out default to `60 s` instead (or be taken and propagated by the `fetch` function)?
I could make a PR if needed, with either of the two solutions
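The failure mode in the traceback above — `t.join(timeout=...)` returning while the pump thread is still running, after which `is_alive()` triggers the `RuntimeError` — can be reproduced in isolation with plain `threading` (no git involved):

```python
import threading
import time

def slow_worker():
    time.sleep(0.5)  # stands in for a long-running fetch

t = threading.Thread(target=slow_worker)
t.start()

# A too-short timeout returns before the worker is done; join() gives
# no return value, so the only way to detect the timeout is is_alive().
t.join(timeout=0.05)
assert t.is_alive()  # this is the condition that raises in cmd.py

# Waiting without a timeout (or with a generous one) completes normally.
t.join()
assert not t.is_alive()
```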
| cc: @dwhswenson
Thanks for reporting and sorry for the hassle. Does that timeout not also mean that clones or fetches can't take longer than that before assuming a hang? CC @Yobmod
@sroet A PR would definitely be welcome, even though it would be good to understand it fully first.
Hi, same on my CI system, where fetches take a long time and a 10-second timeout is definitely too short. A default value of 60 seconds for the timeout would also be fine for me. | 2021-09-10T11:59:04Z | [] | [] |
Traceback (most recent call last):
File "devtools/autorelease_check.py", line 24, in <module>
repo_path='.'
File "/usr/share/miniconda/envs/test/lib/python3.7/site-packages/autorelease/check_runners.py", line 52, in __init__
repo_path=repo_path
File "/usr/share/miniconda/envs/test/lib/python3.7/site-packages/autorelease/git_repo_checks.py", line 28, in __init__
self.repo.remotes.origin.fetch()
File "/usr/share/miniconda/envs/test/lib/python3.7/site-packages/git/remote.py", line 856, in fetch
res = self._get_fetch_info_from_stderr(proc, progress)
File "/usr/share/miniconda/envs/test/lib/python3.7/site-packages/git/remote.py", line 727, in _get_fetch_info_from_stderr
handle_process_output(proc, None, progress_handler, finalizer=None, decode_streams=False)
File "/usr/share/miniconda/envs/test/lib/python3.7/site-packages/git/cmd.py", line 151, in handle_process_output
raise RuntimeError(f"Thread join() timed out in cmd.handle_process_output(). Timeout={timeout} seconds")
RuntimeError: Thread join() timed out in cmd.handle_process_output(). Timeout=10.0 seconds
| 5,273 |
|||
gitpython-developers/GitPython | gitpython-developers__GitPython-208 | 2af98929bd185cf1b4316391078240f337877f66 | diff --git a/git/cmd.py b/git/cmd.py
--- a/git/cmd.py
+++ b/git/cmd.py
@@ -341,8 +341,10 @@ def execute(self, command,
cwd = self._working_dir
# Start the process
+ env = os.environ.copy()
+ env["LC_MESSAGES"] = "C"
proc = Popen(command,
- env={"LC_MESSAGES": "C"},
+ env=env,
cwd=cwd,
stdin=istream,
stderr=PIPE,
| in 0.3.2 executing git commands using environment variables breaks
Recent release 0.3.2 contains a change to set LC_MESSAGES=C for all git commands, to ensure the error messages are consistent across systems. This breaks commands that use the environment to allow users to control behaviour.
To reproduce, clone the GitPython repo, check out the current 0.3 branch, and within it execute the following command
``` shell
PYTHONPATH=. python -c 'import os
import git
test = git.Git(".")
os.environ["GIT_EDITOR"] = "cat"
test.var("GIT_EDITOR")'
```
Output will be:
```
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "git/cmd.py", line 240, in <lambda>
return lambda *args, **kwargs: self._call_process(name, *args, **kwargs)
File "git/cmd.py", line 536, in _call_process
return self.execute(make_call(), **_kwargs)
File "git/cmd.py", line 399, in execute
raise GitCommandError(command, status, stderr_value)
git.exc.GitCommandError: 'git var GIT_EDITOR' returned exit status 128: fatal: Terminal is dumb, but EDITOR unset
```
This breaks https://github.com/stackforge/git-upstream usage of GitPython.
| 2014-11-17T16:50:50Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "git/cmd.py", line 240, in <lambda>
return lambda *args, **kwargs: self._call_process(name, *args, **kwargs)
File "git/cmd.py", line 536, in _call_process
return self.execute(make_call(), **_kwargs)
File "git/cmd.py", line 399, in execute
raise GitCommandError(command, status, stderr_value)
git.exc.GitCommandError: 'git var GIT_EDITOR' returned exit status 128: fatal: Terminal is dumb, but EDITOR unset
| 5,284 |
||||
gitpython-developers/GitPython | gitpython-developers__GitPython-226 | 3936084cdd336ce7db7d693950e345eeceab93a5 | diff --git a/git/refs/log.py b/git/refs/log.py
--- a/git/refs/log.py
+++ b/git/refs/log.py
@@ -80,11 +80,15 @@ def from_line(cls, line):
""":return: New RefLogEntry instance from the given revlog line.
:param line: line without trailing newline
:raise ValueError: If line could not be parsed"""
- try:
- info, msg = line.split('\t', 2)
- except ValueError:
- raise ValueError("line is missing tab separator")
- # END handle first plit
+ fields = line.split('\t', 1)
+ if len(fields) == 1:
+ info, msg = fields[0], None
+ elif len(fields) == 2:
+ info, msg = fields
+ else:
+ raise ValueError("Line must have up to two TAB-separated fields."
+ " Got %s" % repr(line))
+ # END handle first split
oldhexsha = info[:40]
newhexsha = info[41:81]
for hexsha in (oldhexsha, newhexsha):
| line is missing tab separator
As I have whined about in #216, I have a few hits into
```
======================================================================
ERROR: test_base (git.test.test_reflog.TestRefLog)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/yoh/deb/gits/python-git/git/test/test_reflog.py", line 48, in test_base
reflog = RefLog.from_file(rlp_master_ro)
File "/home/yoh/deb/gits/python-git/git/refs/log.py", line 154, in from_file
return cls(filepath)
File "/home/yoh/deb/gits/python-git/git/refs/log.py", line 128, in __init__
self._read_from_file()
File "/home/yoh/deb/gits/python-git/git/refs/log.py", line 140, in _read_from_file
self._deserialize(fmap)
File "/home/yoh/deb/gits/python-git/git/refs/log.py", line 289, in _deserialize
self.extend(self.iter_entries(stream))
File "/home/yoh/deb/gits/python-git/git/refs/log.py", line 180, in iter_entries
yield new_entry(line.strip())
File "/home/yoh/deb/gits/python-git/git/refs/log.py", line 86, in from_line
raise ValueError("line is missing tab separator")
ValueError: line is missing tab separator
```
```
(Pdb) print line
0000000000000000000000000000000000000000 609237d8b5a47d5e68babea116712e5d41e8a949 Yaroslav Halchenko <debian@onerussian.com> 1418863256 +0000
```
in my case it looks like the fields are separated by spaces. Where should the tab have been? ;)
I am also a bit confused by the 2 in `info, msg = line.split('\t', 2)`, since it would result in up to 3 elements in the list and thus might overfill the info, msg pair
| Indeed, the `.split` call should be limited to two tokens, so the 2 literal should be a 1 instead. Usually these lines have a tab character right after the timestamp, separating information from the comment.
Apparently, lines without a comment are valid too unless you tell me you have truncated the line accidentally.
In any case, it's probably a good idea to make the parser a little less error-prone.
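The behaviour discussed above is easy to verify directly with Python's `str.split`: `maxsplit=2` allows up to three fields, `maxsplit=1` at most two, and a tab-less line (a reflog entry with no message) yields a single field — exactly the cases the patched parser distinguishes:

```python
# A reflog line is "<info>\t<message>"; the info part itself contains
# no tabs, but the message may.
line = "oldsha newsha Some Author <a@b.c> 1418863256 +0000\tcommit:\tinitial"

# maxsplit=2 can yield three fields, so `info, msg = ...` may over-unpack:
assert len(line.split("\t", 2)) == 3

# maxsplit=1 yields at most two fields, as the fix uses:
fields = line.split("\t", 1)
assert fields == ["oldsha newsha Some Author <a@b.c> 1418863256 +0000",
                  "commit:\tinitial"]

# A line without any tab splits into a single field, which the patched
# code treats as "no message" instead of raising ValueError.
no_msg = "oldsha newsha Some Author <a@b.c> 1418863256 +0000"
assert no_msg.split("\t", 1) == [no_msg]
```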
| 2015-01-05T13:48:14Z | [] | [] |
Traceback (most recent call last):
File "/home/yoh/deb/gits/python-git/git/test/test_reflog.py", line 48, in test_base
reflog = RefLog.from_file(rlp_master_ro)
File "/home/yoh/deb/gits/python-git/git/refs/log.py", line 154, in from_file
return cls(filepath)
File "/home/yoh/deb/gits/python-git/git/refs/log.py", line 128, in __init__
self._read_from_file()
File "/home/yoh/deb/gits/python-git/git/refs/log.py", line 140, in _read_from_file
self._deserialize(fmap)
File "/home/yoh/deb/gits/python-git/git/refs/log.py", line 289, in _deserialize
self.extend(self.iter_entries(stream))
File "/home/yoh/deb/gits/python-git/git/refs/log.py", line 180, in iter_entries
yield new_entry(line.strip())
File "/home/yoh/deb/gits/python-git/git/refs/log.py", line 86, in from_line
raise ValueError("line is missing tab separator")
ValueError: line is missing tab separator
| 5,286 |
|||
gitpython-developers/GitPython | gitpython-developers__GitPython-475 | 105a8c0fb3fe61b77956c8ebd3216738c78a3dff | diff --git a/git/cmd.py b/git/cmd.py
--- a/git/cmd.py
+++ b/git/cmd.py
@@ -39,7 +39,8 @@
PY3,
bchr,
# just to satisfy flake8 on py3
- unicode
+ unicode,
+ safe_decode,
)
execute_kwargs = ('istream', 'with_keep_cwd', 'with_extended_output',
@@ -693,12 +694,12 @@ def _kill_process(pid):
cmdstr = " ".join(command)
def as_text(stdout_value):
- return not output_stream and stdout_value.decode(defenc) or '<OUTPUT_STREAM>'
+ return not output_stream and safe_decode(stdout_value) or '<OUTPUT_STREAM>'
# end
if stderr_value:
log.info("%s -> %d; stdout: '%s'; stderr: '%s'",
- cmdstr, status, as_text(stdout_value), stderr_value.decode(defenc))
+ cmdstr, status, as_text(stdout_value), safe_decode(stderr_value))
elif stdout_value:
log.info("%s -> %d; stdout: '%s'", cmdstr, status, as_text(stdout_value))
else:
@@ -712,11 +713,11 @@ def as_text(stdout_value):
raise GitCommandError(command, status, stderr_value)
if isinstance(stdout_value, bytes) and stdout_as_string: # could also be output_stream
- stdout_value = stdout_value.decode(defenc)
+ stdout_value = safe_decode(stdout_value)
# Allow access to the command's status code
if with_extended_output:
- return (status, stdout_value, stderr_value.decode(defenc))
+ return (status, stdout_value, safe_decode(stderr_value))
else:
return stdout_value
| CatFileContentStream.execute() should probably safe_decode() stdout and stderr
FTR, using Python 3.5 here.
In a Debian project, we want to essentially `git show <ref>:debian/changelog` but the changelog has some bogus non-utf-8 characters in it. Here's an excerpt (not sure if this will come through in the GH issue):
```
sbuild (0.24) unstable; urgency=low
* remove -qq from apt-get call in the updatechroot script
* fix upgradechroot output and add -u to -y
* added oldstable to distribution options
* fix for dependency calculation for --arch-all builds from
Martin K<F6>gler (Closes: #180859)
* libpng-dev => libpng12-0-dev in sbuild.conf
* add dpkg-dev to package dependencies - thanks Michael Banck
(Closes: #182234)
* chroot building fix and waldi's patch still to come
-- Rick Younie <younie@debian.org> Sat, 19 Apr 2003 14:41:03 -0700
```
However, the command tracebacks (notice the weird `<F6>` in the changelog entry).
```
Traceback (most recent call last):
File "/home/barry/projects/ubuntu/uddgit/usd-importer/usd-import", line 144, in get_changelog_versions_from_treeish
ref, self._local_repo.git.show('%s:debian/changelog' % ref)))
File "/usr/lib/python3/dist-packages/git/cmd.py", line 459, in <lambda>
return lambda *args, **kwargs: self._call_process(name, *args, **kwargs)
File "/usr/lib/python3/dist-packages/git/cmd.py", line 920, in _call_process
return self.execute(make_call(), **_kwargs)
File "/usr/lib/python3/dist-packages/git/cmd.py", line 708, in execute
stdout_value = stdout_value.decode(defenc)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xf6 in position 1341: invalid start byte
```
where `defenc` is utf-8. Since git/compat.py already has a `safe_decode()` method, that should probably be used instead on `stdout_value` and `stderr_value` to ensure you don't get an exception on bogus data.
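A minimal sketch of what such a tolerant decode looks like — illustrative only, not necessarily the exact behaviour of `git.compat.safe_decode()`:

```python
raw = b"Martin K\xf6gler"  # latin-1 encoded byte, as in the changelog above

# Strict UTF-8 decoding raises, which is the failure in the traceback:
try:
    raw.decode("utf-8")
except UnicodeDecodeError:
    pass
else:
    raise AssertionError("expected a UnicodeDecodeError")

def tolerant_decode(data, encoding="utf-8"):
    """Decode bytes, substituting U+FFFD for undecodable sequences."""
    return data.decode(encoding, errors="replace")

assert tolerant_decode(raw) == "Martin K\ufffdgler"
```

Using an error handler like `"replace"` means bogus bytes degrade to replacement characters instead of aborting the whole `git show` call.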
| 2016-06-15T07:13:25Z | [] | [] |
Traceback (most recent call last):
File "/home/barry/projects/ubuntu/uddgit/usd-importer/usd-import", line 144, in get_changelog_versions_from_treeish
ref, self._local_repo.git.show('%s:debian/changelog' % ref)))
File "/usr/lib/python3/dist-packages/git/cmd.py", line 459, in <lambda>
return lambda *args, **kwargs: self._call_process(name, *args, **kwargs)
File "/usr/lib/python3/dist-packages/git/cmd.py", line 920, in _call_process
return self.execute(make_call(), **_kwargs)
File "/usr/lib/python3/dist-packages/git/cmd.py", line 708, in execute
stdout_value = stdout_value.decode(defenc)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xf6 in position 1341: invalid start byte
| 5,291 |
||||
gitpython-developers/GitPython | gitpython-developers__GitPython-658 | cf8dc259fcc9c1397ea67cec3a6a4cb5816e3e68 | diff --git a/git/__init__.py b/git/__init__.py
--- a/git/__init__.py
+++ b/git/__init__.py
@@ -35,23 +35,26 @@ def _init_externals():
#{ Imports
-from git.config import GitConfigParser # @NoMove @IgnorePep8
-from git.objects import * # @NoMove @IgnorePep8
-from git.refs import * # @NoMove @IgnorePep8
-from git.diff import * # @NoMove @IgnorePep8
-from git.exc import * # @NoMove @IgnorePep8
-from git.db import * # @NoMove @IgnorePep8
-from git.cmd import Git # @NoMove @IgnorePep8
-from git.repo import Repo # @NoMove @IgnorePep8
-from git.remote import * # @NoMove @IgnorePep8
-from git.index import * # @NoMove @IgnorePep8
-from git.util import ( # @NoMove @IgnorePep8
- LockFile,
- BlockingLockFile,
- Stats,
- Actor,
- rmtree,
-)
+from git.exc import * # @NoMove @IgnorePep8
+try:
+ from git.config import GitConfigParser # @NoMove @IgnorePep8
+ from git.objects import * # @NoMove @IgnorePep8
+ from git.refs import * # @NoMove @IgnorePep8
+ from git.diff import * # @NoMove @IgnorePep8
+ from git.db import * # @NoMove @IgnorePep8
+ from git.cmd import Git # @NoMove @IgnorePep8
+ from git.repo import Repo # @NoMove @IgnorePep8
+ from git.remote import * # @NoMove @IgnorePep8
+ from git.index import * # @NoMove @IgnorePep8
+ from git.util import ( # @NoMove @IgnorePep8
+ LockFile,
+ BlockingLockFile,
+ Stats,
+ Actor,
+ rmtree,
+ )
+except GitError as exc:
+ raise ImportError('%s: %s' % (exc.__class__.__name__, exc))
#} END imports
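With the guard above, a missing `git` executable surfaces as a plain `ImportError` at import time, so consumers no longer need `git.exc` before they can catch the failure. A hypothetical helper sketching the resulting pattern:

```python
def optional_import(module_name):
    """Import a module, returning None instead of raising if it is unavailable."""
    try:
        return __import__(module_name)
    except ImportError as exc:
        print("%s unavailable: %s" % (module_name, exc))
        return None


# e.g. git_mod = optional_import("git") now degrades gracefully on a box
# without a git binary, instead of raising GitCommandNotFound at import time
assert optional_import("os") is not None
assert optional_import("definitely_missing_module") is None
```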
| [GitPython >= 2.1.4] Can't import without git CLI
## Description of Problem
Since GitPython 2.1.4, the initial `import git` fails because it invokes the git CLI, raising a `git.exc.GitCommandNotFound` error. However, due to the chicken-and-egg nature of this import, it is impossible for one to catch a `git.exc.GitCommandNotFound` error.
## Steps to Reproduce
Install GitPython >= 2.1.4 on a box without git installed on it. An easy way to do this is to launch a Cent 7 docker container (which does not come with git installed off the base image), and then install GitPython (first installing EPEL repo and pip for ease of installation).
```
% docker run --rm -it centos:7 bash
[root@c1c079f693dc /]# yum --quiet -y install epel-release
warning: /var/cache/yum/x86_64/7/extras/packages/epel-release-7-9.noarch.rpm: Header V3 RSA/SHA256 Signature, key ID f4a80eb5: NOKEY
Public key for epel-release-7-9.noarch.rpm is not installed
Importing GPG key 0xF4A80EB5:
Userid : "CentOS-7 Key (CentOS 7 Official Signing Key) <security@centos.org>"
Fingerprint: 6341 ab27 53d7 8a78 a7c2 7bb1 24c6 a8a7 f4a8 0eb5
Package : centos-release-7-3.1611.el7.centos.x86_64 (@CentOS)
From : /etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-7
[root@c1c079f693dc /]# yum --quiet -y install python-pip
warning: /var/cache/yum/x86_64/7/epel/packages/python2-pip-8.1.2-5.el7.noarch.rpm: Header V3 RSA/SHA256 Signature, key ID 352c64e5: NOKEY
Public key for python2-pip-8.1.2-5.el7.noarch.rpm is not installed
Importing GPG key 0x352C64E5:
Userid : "Fedora EPEL (7) <epel@fedoraproject.org>"
Fingerprint: 91e9 7d7c 4a5e 96f1 7f3e 888f 6a2f aea2 352c 64e5
Package : epel-release-7-9.noarch (@extras)
From : /etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-7
[root@c1c079f693dc /]# command -v git && echo git is installed || echo git is not installed
git is not installed
root@c1c079f693dc /]# pip install GitPython
Collecting GitPython
Downloading GitPython-2.1.5-py2.py3-none-any.whl (443kB)
100% |################################| 450kB 1.0MB/s
Collecting gitdb2>=2.0.0 (from GitPython)
Downloading gitdb2-2.0.2-py2.py3-none-any.whl (63kB)
100% |################################| 71kB 2.1MB/s
Collecting smmap2>=2.0.0 (from gitdb2>=2.0.0->GitPython)
Downloading smmap2-2.0.3-py2.py3-none-any.whl
Installing collected packages: smmap2, gitdb2, GitPython
Successfully installed GitPython-2.1.5 gitdb2-2.0.2 smmap2-2.0.3
You are using pip version 8.1.2, however version 9.0.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
```
Now attempt to import:
```
[root@c1c079f693dc /]# python -c 'import git'
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/usr/lib/python2.7/site-packages/git/__init__.py", line 45, in <module>
from git.repo import Repo # @NoMove @IgnorePep8
File "/usr/lib/python2.7/site-packages/git/repo/__init__.py", line 4, in <module>
from .base import *
File "/usr/lib/python2.7/site-packages/git/repo/base.py", line 31, in <module>
from git.remote import Remote, add_progress, to_progress_instance
File "/usr/lib/python2.7/site-packages/git/remote.py", line 190, in <module>
class FetchInfo(object):
File "/usr/lib/python2.7/site-packages/git/remote.py", line 219, in FetchInfo
v = Git().version_info[:2]
File "/usr/lib/python2.7/site-packages/git/cmd.py", line 461, in version_info
return self._version_info
File "/usr/lib/python2.7/site-packages/git/cmd.py", line 424, in __getattr__
return LazyMixin.__getattr__(self, name)
File "/usr/lib/python2.7/site-packages/gitdb/util.py", line 256, in __getattr__
self._set_cache_(attr)
File "/usr/lib/python2.7/site-packages/git/cmd.py", line 444, in _set_cache_
version_numbers = self._call_process('version').split(' ')[2]
File "/usr/lib/python2.7/site-packages/git/cmd.py", line 877, in _call_process
return self.execute(call, **exec_kwargs)
File "/usr/lib/python2.7/site-packages/git/cmd.py", line 602, in execute
raise GitCommandNotFound(command, err)
git.exc.GitCommandNotFound: Cmd('git') not found due to: OSError('[Errno 2] No such file or directory')
cmdline: git version
```
If you then downgrade to before version 2.1.4, the import works fine since the initial import no longer seems to be trying to invoke the git CLI:
```
[root@c1c079f693dc /]# pip install 'GitPython<2.1.4'
Collecting GitPython<2.1.4
Downloading GitPython-2.1.3-py2.py3-none-any.whl (442kB)
100% |################################| 450kB 993kB/s
Requirement already satisfied (use --upgrade to upgrade): gitdb2>=2.0.0 in /usr/lib/python2.7/site-packages (from GitPython<2.1.4)
Requirement already satisfied (use --upgrade to upgrade): smmap2>=2.0.0 in /usr/lib/python2.7/site-packages (from gitdb2>=2.0.0->GitPython<2.1.4)
Installing collected packages: GitPython
Found existing installation: GitPython 2.1.5
Uninstalling GitPython-2.1.5:
Successfully uninstalled GitPython-2.1.5
Successfully installed GitPython-2.1.3
You are using pip version 8.1.2, however version 9.0.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
[root@c1c079f693dc /]# python -c 'import git'
[root@c1c079f693dc /]#
```
| 2017-08-10T21:14:48Z | [] | [] |
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/usr/lib/python2.7/site-packages/git/__init__.py", line 45, in <module>
from git.repo import Repo # @NoMove @IgnorePep8
File "/usr/lib/python2.7/site-packages/git/repo/__init__.py", line 4, in <module>
from .base import *
File "/usr/lib/python2.7/site-packages/git/repo/base.py", line 31, in <module>
from git.remote import Remote, add_progress, to_progress_instance
File "/usr/lib/python2.7/site-packages/git/remote.py", line 190, in <module>
class FetchInfo(object):
File "/usr/lib/python2.7/site-packages/git/remote.py", line 219, in FetchInfo
v = Git().version_info[:2]
File "/usr/lib/python2.7/site-packages/git/cmd.py", line 461, in version_info
return self._version_info
File "/usr/lib/python2.7/site-packages/git/cmd.py", line 424, in __getattr__
return LazyMixin.__getattr__(self, name)
File "/usr/lib/python2.7/site-packages/gitdb/util.py", line 256, in __getattr__
self._set_cache_(attr)
File "/usr/lib/python2.7/site-packages/git/cmd.py", line 444, in _set_cache_
version_numbers = self._call_process('version').split(' ')[2]
File "/usr/lib/python2.7/site-packages/git/cmd.py", line 877, in _call_process
return self.execute(call, **exec_kwargs)
File "/usr/lib/python2.7/site-packages/git/cmd.py", line 602, in execute
raise GitCommandNotFound(command, err)
git.exc.GitCommandNotFound: Cmd('git') not found due to: OSError('[Errno 2] No such file or directory')
| 5,295 |
||||
gitpython-developers/GitPython | gitpython-developers__GitPython-667 | cf8dc259fcc9c1397ea67cec3a6a4cb5816e3e68 | diff --git a/git/objects/submodule/base.py b/git/objects/submodule/base.py
--- a/git/objects/submodule/base.py
+++ b/git/objects/submodule/base.py
@@ -123,12 +123,12 @@ def _set_cache_(self, attr):
reader = self.config_reader()
# default submodule values
try:
- self.path = reader.get_value('path')
+ self.path = reader.get('path')
except cp.NoSectionError:
raise ValueError("This submodule instance does not exist anymore in '%s' file"
% osp.join(self.repo.working_tree_dir, '.gitmodules'))
# end
- self._url = reader.get_value('url')
+ self._url = reader.get('url')
# git-python extension values - optional
self._branch_path = reader.get_value(self.k_head_option, git.Head.to_full_path(self.k_head_default))
elif attr == '_name':
@@ -1168,11 +1168,11 @@ def iter_items(cls, repo, parent_commit='HEAD'):
for sms in parser.sections():
n = sm_name(sms)
- p = parser.get_value(sms, 'path')
- u = parser.get_value(sms, 'url')
+ p = parser.get(sms, 'path')
+ u = parser.get(sms, 'url')
b = cls.k_head_default
if parser.has_option(sms, cls.k_head_option):
- b = str(parser.get_value(sms, cls.k_head_option))
+ b = str(parser.get(sms, cls.k_head_option))
# END handle optional information
# get the binsha
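The swap from `get_value` to `get` matters because `get_value` coerces config strings into ints and bools. The same distinction exists in the stdlib `configparser` that `GitConfigParser` builds on — a sketch using the failing submodule names from this issue:

```python
import configparser

cfg = configparser.ConfigParser()
cfg.read_string(
    '[submodule "123"]\n'
    'path = 123\n'
    '[submodule "false"]\n'
    'path = false\n'
)

# .get() always hands back the raw string, so a tree lookup like rt[p] works:
assert cfg.get('submodule "123"', 'path') == '123'
assert cfg.get('submodule "false"', 'path') == 'false'

# A coercing reader turns the same values into int and bool -- the types
# that made rt[p] and sm._name blow up in the tracebacks above:
assert cfg.getint('submodule "123"', 'path') == 123
assert cfg.getboolean('submodule "false"', 'path') is False
```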
| .submodules leads to IndexError: list index out of range on info = self._cache[item]
not sure if related to still open #279
I have bunch of freshly added, and then also modified submodules. That leads to following kaboom:
```shell
$> pip install --upgrade GitPython
Collecting GitPython
Downloading GitPython-2.1.3-py2.py3-none-any.whl (442kB)
Requirement already up-to-date: gitdb2>=2.0.0 in /home/yoh/proj/datalad/datalad/venv-tests/lib/python2.7/site-packages (from GitPython)
Requirement already up-to-date: smmap2>=2.0.0 in /home/yoh/proj/datalad/datalad/venv-tests/lib/python2.7/site-packages (from gitdb2>=2.0.0->GitPython)
Installing collected packages: GitPython
Found existing installation: GitPython 2.1.0
Uninstalling GitPython-2.1.0:
Successfully uninstalled GitPython-2.1.0
Successfully installed GitPython-2.1.3
You are using pip version 7.1.2, however version 9.0.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
$> python -c 'from git import Repo; print Repo(".").submodules'
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/home/yoh/proj/datalad/datalad/venv-tests/local/lib/python2.7/site-packages/git/repo/base.py", line 290, in submodules
return Submodule.list_items(self)
File "/home/yoh/proj/datalad/datalad/venv-tests/local/lib/python2.7/site-packages/git/util.py", line 932, in list_items
out_list.extend(cls.iter_items(repo, *args, **kwargs))
File "/home/yoh/proj/datalad/datalad/venv-tests/local/lib/python2.7/site-packages/git/objects/submodule/base.py", line 1181, in iter_items
sm = rt[p]
File "/home/yoh/proj/datalad/datalad/venv-tests/local/lib/python2.7/site-packages/git/objects/tree.py", line 293, in __getitem__
info = self._cache[item]
IndexError: list index out of range
$> git submodule
+60d265e09441a6316f7c3336c0eba4370afed2ac 2009 (heads/master)
+5cdff4402dec87be62ae0ba197c63ad1da7ba2c2 2010 (heads/master)
+e5638100b6e4514a6e6e7379a581fb2801b7a59b 2011 (heads/master)
+62bf8ad207a8d70706a3b1288e2aaa6b1efe20ed 2013 (heads/master)
+214a854c94d9d8c5283643a59fb633f46563fbd0 2014 (heads/master)
+a6770f5aa5c8552b3cd803b138e5ae71c281188b 2015 (heads/master)
+4790ba30c6e666c86ec26d78042eb2a82cb043f8 2016 (heads/master)
+bf191a0120bce11be84f3e9a260eb830939fecbc 2017 (heads/master)
$> git diff
diff --git a/2009 b/2009
index b547198..60d265e 160000
--- a/2009
+++ b/2009
@@ -1 +1 @@
-Subproject commit b547198f5c417a88fa3e683c7a73b2ba014b2331
+Subproject commit 60d265e09441a6316f7c3336c0eba4370afed2ac
...
$> head .gitmodules
[submodule "2009"]
path = 2009
url = ./2009
[submodule "2010"]
path = 2010
url = ./2010
[submodule "2011"]
path = 2011
url = ./2011
```
Failed to replicate with a simplistic
```shell
$> rm -rf /tmp/test && mkdir /tmp/test && cd /tmp/test; git init; echo smth > smth; git add smth; git commit -m smth; mkdir sub; cd sub; git init; echo 123 > 123; git add 123; git commit -m added ; cd ../; git submodule add ./sub sub; cd sub; echo 124 >| 123; git add 123; git commit -m changed; cd ..; git submodule; python -c 'from git import Repo; print Repo(".").submodules'
Initialized empty Git repository in /tmp/test/.git/
[master (root-commit) bbfc6e1] smth
1 file changed, 1 insertion(+)
create mode 100644 smth
Initialized empty Git repository in /tmp/test/sub/.git/
[master (root-commit) 6063d92] added
1 file changed, 1 insertion(+)
create mode 100644 123
smth sub/
Adding existing repo at 'sub' to the index
123
[master 4a3e397] changed
1 file changed, 1 insertion(+), 1 deletion(-)
smth sub/
+4a3e397542e87ff842555f8f6c6fdfafffa0c375 sub (heads/master)
[git.Submodule(name=sub, path=sub, url=./sub, branch_path=refs/heads/master)]
changes on filesystem:
sub | 2 +-
cached/staged changes:
.gitmodules | 3 +++
sub | 1 +
```
so I guess I have missed some aspect
| Thanks for letting me know. I assume that the original repository that is showing the issue is not public, so we are left with an issue that is not reproducible.
What do you suggest we should do?
eh -- took me a while to find that repo where I originally ran into it (again blaming myself for not quoting enough info for myself)... found it -- now you can get the entirety of it in its dirty state from http://www.onerussian.com/tmp/phsyionet-challenge.tar.gz
@yarikoptic Thanks for taking the time to look - I pulled the file and was able to easily reproduce the issue. Just to be on the safe side, I put it into [a gist](https://gist.githubusercontent.com/Byron/17073cdb4c3019675d34a4705339e962/raw/802129991aa0bd2c23ee0f600b2ec4cef88ffbad/challenge.base64.zip).
This is how to reproduce the issue, assuming you have gitpython in your PYTHON_PATH:
```bash
set -e
curl https://gist.githubusercontent.com/Byron/17073cdb4c3019675d34a4705339e962/raw/802129991aa0bd2c23ee0f600b2ec4cef88ffbad/challenge.base64.zip | base64 -D > challenge.zip
unzip challenge.zip
(cd challenge && python -c 'from git import Repo; print Repo(".").submodules')
```
In https://github.com/datalad/datalad/pull/1801 we ran into our datalad crashing because we used GitPython's parser for .gitmodules and used `get_value` which automagically casts values into int and bool whenever it could. We fixed it on our end, but then decided to test more... Now I have managed to reproduce both crashes -- the one we have in that datalad issue (when I use bool value for the name) and this one (when I use numeric), so I guess the reason could be the same "misuse" of `get_value` (casts) whenever `get` should have been used (no casting):
clean run:
```shell
$> s=s123; builtin cd /tmp/; rm -rf /tmp/gitsub; mkdir /tmp/gitsub; cd /tmp/gitsub; git init; mkdir $s; cd $s; git init; touch 1; git add 1; git commit -m msg 1; cd ..; git submodule add ./$s $s; git add .gitmodules; git commit -m "added subm $s"
Initialized empty Git repository in /tmp/gitsub/.git/
Initialized empty Git repository in /tmp/gitsub/s123/.git/
[master (root-commit) 2b57b47] msg
1 file changed, 0 insertions(+), 0 deletions(-)
create mode 100644 1
s123/
Adding existing repo at 's123' to the index
[master (root-commit) 8990a73] added subm s123
2 files changed, 4 insertions(+)
create mode 100644 .gitmodules
create mode 160000 s123
2 10278.....................................:Thu 07 Sep 2017 12:11:07 PM EDT:.
(git)hopa:/tmp/gitsub[master]git
$> python -c 'import git; r=git.Repo("."); print(r.submodules)'
[git.Submodule(name=s123, path=s123, url=./s123, branch_path=refs/heads/master)]
```
int:
```
2 10279.....................................:Thu 07 Sep 2017 12:11:10 PM EDT:.
(git)hopa:/tmp/gitsub[master]git
$> s=123; builtin cd /tmp/; rm -rf /tmp/gitsub; mkdir /tmp/gitsub; cd /tmp/gitsub; git init; mkdir $s; cd $s; git init; touch 1; git add 1; git commit -m msg 1; cd ..; git submodule add ./$s $s; git add .gitmodules; git commit -m "added subm $s"
Initialized empty Git repository in /tmp/gitsub/.git/
Initialized empty Git repository in /tmp/gitsub/123/.git/
[master (root-commit) 8297bb7] msg
1 file changed, 0 insertions(+), 0 deletions(-)
create mode 100644 1
123/
Adding existing repo at '123' to the index
[master (root-commit) 3417128] added subm 123
2 files changed, 4 insertions(+)
create mode 100644 .gitmodules
create mode 160000 123
2 10280.....................................:Thu 07 Sep 2017 12:11:13 PM EDT:.
(git)hopa:/tmp/gitsub[master]git
$> python -c 'import git; r=git.Repo("."); print(r.submodules)'
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/usr/lib/python2.7/dist-packages/git/repo/base.py", line 293, in submodules
return Submodule.list_items(self)
File "/usr/lib/python2.7/dist-packages/git/util.py", line 932, in list_items
out_list.extend(cls.iter_items(repo, *args, **kwargs))
File "/usr/lib/python2.7/dist-packages/git/objects/submodule/base.py", line 1181, in iter_items
sm = rt[p]
File "/usr/lib/python2.7/dist-packages/git/objects/tree.py", line 293, in __getitem__
info = self._cache[item]
IndexError: list index out of range
```
bool:
```
$> s=false; builtin cd /tmp/; rm -rf /tmp/gitsub; mkdir /tmp/gitsub; cd /tmp/gitsub; git init; mkdir $s; cd $s; git init; touch 1; git add 1; git commit -m msg 1; cd ..; git submodule add ./$s $s; git add .gitmodules; git commit -m "added subm $s"
Initialized empty Git repository in /tmp/gitsub/.git/
Initialized empty Git repository in /tmp/gitsub/false/.git/
[master (root-commit) 987d30b] msg
1 file changed, 0 insertions(+), 0 deletions(-)
create mode 100644 1
false/
Adding existing repo at 'false' to the index
[master (root-commit) 12afd38] added subm false
2 files changed, 4 insertions(+)
create mode 100644 .gitmodules
create mode 160000 false
2 10282.....................................:Thu 07 Sep 2017 12:14:50 PM EDT:.
(git)hopa:/tmp/gitsub[master]git
$> python -c 'import git; r=git.Repo("."); print(r.submodules)'
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/usr/lib/python2.7/dist-packages/git/repo/base.py", line 293, in submodules
return Submodule.list_items(self)
File "/usr/lib/python2.7/dist-packages/git/util.py", line 932, in list_items
out_list.extend(cls.iter_items(repo, *args, **kwargs))
File "/usr/lib/python2.7/dist-packages/git/objects/submodule/base.py", line 1194, in iter_items
sm._name = n
AttributeError: 'Blob' object has no attribute '_name'
``` | 2017-09-08T03:24:48Z | [] | [] |
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/home/yoh/proj/datalad/datalad/venv-tests/local/lib/python2.7/site-packages/git/repo/base.py", line 290, in submodules
return Submodule.list_items(self)
File "/home/yoh/proj/datalad/datalad/venv-tests/local/lib/python2.7/site-packages/git/util.py", line 932, in list_items
out_list.extend(cls.iter_items(repo, *args, **kwargs))
File "/home/yoh/proj/datalad/datalad/venv-tests/local/lib/python2.7/site-packages/git/objects/submodule/base.py", line 1181, in iter_items
sm = rt[p]
File "/home/yoh/proj/datalad/datalad/venv-tests/local/lib/python2.7/site-packages/git/objects/tree.py", line 293, in __getitem__
info = self._cache[item]
IndexError: list index out of range
| 5,296 |
|||
gitpython-developers/GitPython | gitpython-developers__GitPython-685 | f237620189a55d491b64cac4b5dc01b832cb3cbe | diff --git a/git/repo/base.py b/git/repo/base.py
--- a/git/repo/base.py
+++ b/git/repo/base.py
@@ -905,6 +905,10 @@ def _clone(cls, git, url, path, odb_default_type, progress, **kwargs):
odbt = kwargs.pop('odbt', odb_default_type)
+ # when pathlib.Path or other classbased path is passed
+ if not isinstance(path, str):
+ path = str(path)
+
## A bug win cygwin's Git, when `--bare` or `--separate-git-dir`
# it prepends the cwd or(?) the `url` into the `path, so::
# git clone --bare /cygwin/d/foo.git C:\\Work
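The added `str()` coercion is what makes `pathlib.Path` arguments work again; in isolation the idea looks like this (hypothetical helper mirroring the patch, not GitPython API):

```python
from pathlib import Path


def coerce_path(path):
    """Accept str or a path-like object (e.g. pathlib.Path) and return str."""
    if not isinstance(path, str):
        path = str(path)
    return path


p = coerce_path(Path("repos") / "async")
assert isinstance(p, str)
# membership tests that failed on PosixPath now work:
assert ("%" in p) is False
```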
| Cloning to Path from pathlib no longer work
Since version 2.1.7, using `Path` from `pathlib` to pass the path to `Repo.clone_from` doesn't work anymore - it was working fine on 2.1.6 and before.
How to reproduce:
```
pip install gitpython==2.1.7
```
Run
```
from pathlib import Path
from git import Repo
path = Path("test1").resolve()
repo = Repo.clone_from("https://github.com/gitpython-developers/async.git", path)
```
On 2.1.7 it raises `TypeError`
```
Traceback (most recent call last):
File "script.py", line 6, in <module>
repo = Repo.clone_from("https://github.com/gitpython-developers/async.git", path)
File "/home/miki/.pyenv/versions/test_gitpython/lib/python3.6/site-packages/git/repo/base.py", line 972, in clone_from
return cls._clone(git, url, to_path, GitCmdObjectDB, progress, **kwargs)
File "/home/miki/.pyenv/versions/test_gitpython/lib/python3.6/site-packages/git/repo/base.py", line 939, in _clone
repo = cls(path, odbt=odbt)
File "/home/miki/.pyenv/versions/test_gitpython/lib/python3.6/site-packages/git/repo/base.py", line 125, in __init__
if expand_vars and ("%" in epath or "$" in epath):
TypeError: argument of type 'PosixPath' is not iterable
```
I am running this on python 3.6.2.
| Thanks a lot for taking the time to flesh out the issue in such great detail!
Apparently the function assumes `path` to be a string and fails otherwise. Probably this could be fixed by converting it to a string in the respective portion of the code, which could be an easy contribution to make.
Sure, I'll open a PR today. | 2017-10-07T10:25:53Z | [] | [] |
Traceback (most recent call last):
File "script.py", line 6, in <module>
repo = Repo.clone_from("https://github.com/gitpython-developers/async.git", path)
File "/home/miki/.pyenv/versions/test_gitpython/lib/python3.6/site-packages/git/repo/base.py", line 972, in clone_from
return cls._clone(git, url, to_path, GitCmdObjectDB, progress, **kwargs)
File "/home/miki/.pyenv/versions/test_gitpython/lib/python3.6/site-packages/git/repo/base.py", line 939, in _clone
repo = cls(path, odbt=odbt)
File "/home/miki/.pyenv/versions/test_gitpython/lib/python3.6/site-packages/git/repo/base.py", line 125, in __init__
if expand_vars and ("%" in epath or "$" in epath):
TypeError: argument of type 'PosixPath' is not iterable
| 5,299 |
|||
gitpython-developers/GitPython | gitpython-developers__GitPython-746 | 0857d33852b6b2f4d7bc470b4c97502c7f978180 | diff --git a/git/objects/util.py b/git/objects/util.py
--- a/git/objects/util.py
+++ b/git/objects/util.py
@@ -121,8 +121,11 @@ def dst(self, dt):
def from_timestamp(timestamp, tz_offset):
"""Converts a timestamp + tz_offset into an aware datetime instance."""
utc_dt = datetime.fromtimestamp(timestamp, utc)
- local_dt = utc_dt.astimezone(tzoffset(tz_offset))
- return local_dt
+ try:
+ local_dt = utc_dt.astimezone(tzoffset(tz_offset))
+ return local_dt
+ except ValueError:
+ return utc_dt
def parse_date(string_date):
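The fallback can be reproduced with just the standard library (a sketch; GitPython uses its own `tzoffset` class, but `datetime.timezone` enforces the same strict ±24 h limit):

```python
from datetime import datetime, timedelta, timezone


def from_timestamp(timestamp, tz_offset):
    """Aware datetime from commit data; fall back to UTC on a bogus offset."""
    utc_dt = datetime.fromtimestamp(timestamp, timezone.utc)
    try:
        # tz_offset is seconds west of UTC, hence the negation
        return utc_dt.astimezone(timezone(timedelta(seconds=-tz_offset)))
    except ValueError:  # offset not strictly within +/- 24 hours
        return utc_dt


ok = from_timestamp(1312735823, 7 * 3600)    # sane UTC-0700 offset
assert ok.utcoffset() == timedelta(hours=-7)

bad = from_timestamp(1312735823, -1864800)   # ~21 days -> falls back to UTC
assert bad.utcoffset() == timedelta(0)
```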
| ValueError exception when asking for authored_datetime
I am receiving a `ValueError` exception when getting the author date of a commit in Rails project. You can replicate it with the following code:
```
> from git import Repo
> repo = Repo('rails')
> commit = repo.commit('4cf94979c9f4d6683c9338d694d5eb3106a4e734')
> commit.authored_datetime
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/dspadini/Documents/pydriller/.venv/lib/python3.6/site-packages/git/objects/commit.py", line 151, in authored_datetime
return from_timestamp(self.authored_date, self.author_tz_offset)
File "/Users/dspadini/Documents/pydriller/.venv/lib/python3.6/site-packages/git/objects/util.py", line 124, in from_timestamp
> local_dt = utc_dt.astimezone(tzoffset(tz_offset))
ValueError: offset must be a timedelta strictly between -timedelta(hours=24) and timedelta(hours=24).
```
Tracing back the error, I got that the timestamp of the commit is 1312735823, while the tz_offset is -1864800. The problem is when creating the tzoffset: I indeed printed the timedelta
```
local_dt = utc_dt.astimezone(tzoffset(tz_offset))
class tzoffset(tzinfo):
def __init__(self, secs_west_of_utc, name=None):
self._offset = timedelta(seconds=-secs_west_of_utc)
print('Timedelta is {}'.format(timedelta(seconds=-secs_west_of_utc)))
```
and it is not in the range of [-24,24] hours, but it is actually 21 days.
*EDIT*: looking at the information of the commit, the offset is completely wrong:
`Date: Mon Aug 29 06:50:23 2011 +51800`
So the problem is not of gitpython.
In a way, it’s a denial of service attack against `GitPython`, as it should certainly not choke on it the way it does.
Maybe it’s something worth fixing for that reason.
I was looking at how jGit solved it, and they simply return the UTC timestamp if they fail to parse the commit's timezone offset. If you also agree with this solution, I can submit a PR tomorrow (I am UTC+2 ;) )
That would be great :)! Thank you 🙏.
| 2018-04-04T08:08:20Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/dspadini/Documents/pydriller/.venv/lib/python3.6/site-packages/git/objects/commit.py", line 151, in authored_datetime
return from_timestamp(self.authored_date, self.author_tz_offset)
File "/Users/dspadini/Documents/pydriller/.venv/lib/python3.6/site-packages/git/objects/util.py", line 124, in from_timestamp
> local_dt = utc_dt.astimezone(tzoffset(tz_offset))
ValueError: offset must be a timedelta strictly between -timedelta(hours=24) and timedelta(hours=24).
| 5,305 |
|||
gitpython-developers/GitPython | gitpython-developers__GitPython-780 | ce5dfe7f95ac35263e41017c8a3c3c40c4333de3 | diff --git a/git/repo/base.py b/git/repo/base.py
--- a/git/repo/base.py
+++ b/git/repo/base.py
@@ -36,6 +36,11 @@
import gc
import gitdb
+try:
+ import pathlib
+except ImportError:
+ pathlib = None
+
log = logging.getLogger(__name__)
@@ -116,6 +121,8 @@ def __init__(self, path=None, odbt=GitCmdObjectDB, search_parent_directories=Fal
epath = decygpath(epath)
epath = epath or path or os.getcwd()
+ if not isinstance(epath, str):
+ epath = str(epath)
if expand_vars and ("%" in epath or "$" in epath):
warnings.warn("The use of environment variables in paths is deprecated" +
"\nfor security reasons and may be removed in the future!!")
| Exception when constructing a Repo() from a pathlib.Path
``` python
Traceback (most recent call last):
File "venv/bin/gridtied", line 11, in <module>
load_entry_point('gridtied', 'console_scripts', 'gridtied')()
File "/home/epc/g/20/grid-tied/venv/lib/python3.6/site-packages/click/core.py", line 722, in __call__
return self.main(*args, **kwargs)
File "/home/epc/g/20/grid-tied/venv/lib/python3.6/site-packages/click/core.py", line 697, in main
rv = self.invoke(ctx)
File "/home/epc/g/20/grid-tied/venv/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/home/epc/g/20/grid-tied/venv/lib/python3.6/site-packages/click/core.py", line 895, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/home/epc/g/20/grid-tied/venv/lib/python3.6/site-packages/click/core.py", line 535, in invoke
return callback(*args, **kwargs)
File "/home/epc/g/20/grid-tied/python/gridtied/package.py", line 59, in cli
commit(),
File "/home/epc/g/20/grid-tied/python/gridtied/package.py", line 35, in commit
repo = git.Repo(gridtied.project_root)
File "/home/epc/g/20/grid-tied/venv/lib/python3.6/site-packages/git/repo/base.py", line 119, in __init__
if expand_vars and ("%" in epath or "$" in epath):
TypeError: argument of type 'PosixPath' is not iterable
```
https://github.com/gitpython-developers/GitPython/blob/master/git/repo/base.py#L119
| 2018-07-12T11:06:50Z | [] | [] |
Traceback (most recent call last):
File "venv/bin/gridtied", line 11, in <module>
load_entry_point('gridtied', 'console_scripts', 'gridtied')()
File "/home/epc/g/20/grid-tied/venv/lib/python3.6/site-packages/click/core.py", line 722, in __call__
return self.main(*args, **kwargs)
File "/home/epc/g/20/grid-tied/venv/lib/python3.6/site-packages/click/core.py", line 697, in main
rv = self.invoke(ctx)
File "/home/epc/g/20/grid-tied/venv/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/home/epc/g/20/grid-tied/venv/lib/python3.6/site-packages/click/core.py", line 895, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/home/epc/g/20/grid-tied/venv/lib/python3.6/site-packages/click/core.py", line 535, in invoke
return callback(*args, **kwargs)
File "/home/epc/g/20/grid-tied/python/gridtied/package.py", line 59, in cli
commit(),
File "/home/epc/g/20/grid-tied/python/gridtied/package.py", line 35, in commit
repo = git.Repo(gridtied.project_root)
File "/home/epc/g/20/grid-tied/venv/lib/python3.6/site-packages/git/repo/base.py", line 119, in __init__
if expand_vars and ("%" in epath or "$" in epath):
TypeError: argument of type 'PosixPath' is not iterable
| 5,308 |
||||
gitpython-developers/GitPython | gitpython-developers__GitPython-922 | a2d8a5144bd8c0940d9f2593a21aec8bebf7c035 | diff --git a/git/objects/commit.py b/git/objects/commit.py
--- a/git/objects/commit.py
+++ b/git/objects/commit.py
@@ -484,7 +484,8 @@ def _deserialize(self, stream):
buf = enc.strip()
while buf:
if buf[0:10] == b"encoding ":
- self.encoding = buf[buf.find(' ') + 1:].decode('ascii')
+ self.encoding = buf[buf.find(' ') + 1:].decode(
+ self.encoding, 'ignore')
elif buf[0:7] == b"gpgsig ":
sig = buf[buf.find(b' ') + 1:] + b"\n"
is_next_header = False
@@ -498,7 +499,7 @@ def _deserialize(self, stream):
break
sig += sigbuf[1:]
# end read all signature
- self.gpgsig = sig.rstrip(b"\n").decode('ascii')
+ self.gpgsig = sig.rstrip(b"\n").decode(self.encoding, 'ignore')
if is_next_header:
continue
buf = readline().strip()
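The effect of decoding with the commit's encoding plus `errors='ignore'` can be seen on signature bytes like the ones from the gentoo commit discussed in this issue (a sketch; the default commit encoding is UTF-8):

```python
sig = (b"-----BEGIN PGP SIGNATURE-----\n"
       b"Comment: Signed-off-by: J\xc3\xb6rg Bornkessel <hd_brummy@gentoo.org>\n"
       b"-----END PGP SIGNATURE-----\n")

try:
    sig.decode("ascii")          # the old behaviour: raises on \xc3\xb6
except UnicodeDecodeError as exc:
    print("ascii decode failed:", exc)

# the patched behaviour: decode with the commit's encoding, ignoring errors
text = sig.rstrip(b"\n").decode("utf-8", "ignore")
assert "J\xf6rg" in text         # the o-umlaut survives as proper text
```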
| UnicodeDecodeError in gpgsig
Hi @Byron !
So, someone found this bug using my tool, and it propagated to GitPython. The problem is in the decoding of a non-ascii character.
repo: https://github.com/gentoo/gentoo
commit: `13e644bb36a0b1f3ef0c2091ab648978d18f369d`
code:
```
from git import Repo, Commit
gr = Repo('/tmp/gentoo')
c = gr.commit('13e644bb36a0b1f3ef0c2091ab648978d18f369d')
print(c.authored_date)
```
This returns:
```
Traceback (most recent call last):
File "/Users/dspadini/Documents/pydriller/tmp.py", line 341, in <module>
print(c.authored_date)
File "/Users/dspadini/Documents/pydriller/venv/lib/python3.7/site-packages/gitdb/util.py", line 253, in __getattr__
self._set_cache_(attr)
File "/Users/dspadini/Documents/pydriller/venv/lib/python3.7/site-packages/git/objects/commit.py", line 144, in _set_cache_
self._deserialize(BytesIO(stream.read()))
File "/Users/dspadini/Documents/pydriller/venv/lib/python3.7/site-packages/git/objects/commit.py", line 502, in _deserialize
self.gpgsig = sig.rstrip(b"\n").decode('ascii')
UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 75: ordinal not in range(128)
```
The problem is in [this line](https://github.com/gitpython-developers/GitPython/blob/master/git/objects/commit.py#L501). The line to be decoded is the following:
```
b'-----BEGIN PGP SIGNATURE-----\nVersion: GnuPG v2.1\nComment: Signed-off-by: J\xc3\xb6rg Bornkessel <hd_brummy@gentoo.org>\n\n.........\n-----END PGP SIGNATURE-----\n'
```
As you can see, at the beginning we have `J\xc3\xb6rg`. This fails the decoding.
So, I tried to change `.decode('ascii')` to `.decode('UTF-8')` and **it works**.
Also, changing `.decode('ascii')` to `.decode('ascii', 'ignore')` **works**.
However, I am not sure whether I should do it. Why is `ascii` used in the first place (instead of UTF-8)?
Are we going to break tests with this change?
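To make the trade-off concrete, here is a standalone illustration (plain Python, not GitPython code) of how each codec/error-handler combination behaves on the signature bytes from the issue:

```python
# Signature bytes from the issue; 'ö' is the UTF-8 sequence b'\xc3\xb6'.
sig = b'Comment: Signed-off-by: J\xc3\xb6rg Bornkessel <hd_brummy@gentoo.org>'

# 'ascii' fails on the first non-ASCII byte -- this is the reported crash.
try:
    sig.decode('ascii')
except UnicodeDecodeError as exc:
    print(exc.reason)  # 'ordinal not in range(128)'

# UTF-8 decodes the name correctly.
print(sig.decode('utf-8'))

# 'ignore' suppresses the error but silently drops the umlaut ('Jrg').
print(sig.decode('ascii', 'ignore'))
```

So both suggested fixes avoid the crash, but `'ignore'` loses data where UTF-8 would have decoded it.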
| Thanks a lot for posting! In GitPython, all handling of string encodings is somewhat botched, as it was written in a time when things were silently assumed to be ascii only.
I would be happy about a PR, it looks like changing this to `.decode('UTF-8', 'ignore')` would do the job in most cases, while never being able to fail, and without breaking backwards compatibility.
If there was a rewrite, one would have to stop assuming any encoding, and work with bytes instead, to leave the decoding to the consumer. | 2019-09-16T08:21:30Z | [] | [] |
| 5,315 |
|||
gitpython-developers/GitPython | gitpython-developers__GitPython-953 | dfa0eac1578bff14a8f7fa00bfc3c57aba24f877 | diff --git a/git/remote.py b/git/remote.py
--- a/git/remote.py
+++ b/git/remote.py
@@ -717,7 +717,7 @@ def _get_push_info(self, proc, progress):
# read the lines manually as it will use carriage returns between the messages
# to override the previous one. This is why we read the bytes manually
progress_handler = progress.new_message_handler()
- output = IterableList('name')
+ output = []
def stdout_handler(line):
try:
@@ -833,7 +833,7 @@ def push(self, refspec=None, progress=None, **kwargs):
:note: No further progress information is returned after push returns.
:param kwargs: Additional arguments to be passed to git-push
:return:
- IterableList(PushInfo, ...) iterable list of PushInfo instances, each
+ list(PushInfo, ...) list of PushInfo instances, each
one informing about an individual head which had been updated on the remote
side.
If the push contains rejected heads, these will have the PushInfo.ERROR bit set
| AttributeError: 'PushInfo' object has no attribute 'name'
A simple call to:
```python
result = self._client.remotes.origin.push(**args)
print(result.summary)
```
results in:
```
Traceback (most recent call last):
  File "C:\Users\jsmith\Envs\halpy\lib\site-packages\halpy\cms.py", line 623, in checkin_files
    print(result.summary)
  File "C:\Users\jsmith\Envs\halpy\lib\site-packages\git\util.py", line 883, in __getattr__
    if getattr(item, self._id_attr) == attr:
AttributeError: 'PushInfo' object has no attribute 'name'
```
| Which version of GitPython are you using? Could you also provide a more self-contained example on how to reproduce the issue? Maybe `**args` makes all the difference.
Thank you
I'm using 2.1.11 and the following will reproduce the error.
```python
import git
git_client = git.Repo.clone_from(some_repo, 'test_clone')
result = git_client.remotes.origin.push()
print(result.summary)
```
Thanks a lot! I could reproduce the issue. | 2019-10-24T02:40:16Z | [] | [] |
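A minimal sketch of why the lookup fails (these are simplified stand-ins, not GitPython's real classes): `IterableList.__getattr__` resolves unknown attribute names by comparing each element's id attribute (`name` here) against the requested name, and `PushInfo` objects carry no `name` attribute — which is why the patch switches `push()` to return a plain list:

```python
class IterableListSketch(list):
    """Simplified stand-in for git.util.IterableList."""
    def __init__(self, id_attr):
        self._id_attr = id_attr

    def __getattr__(self, attr):
        # Resolve unknown attributes by item id -- mirrors the failing lookup.
        for item in self:
            if getattr(item, self._id_attr) == attr:
                return item
        return super().__getattribute__(attr)

class PushInfoSketch:
    """Simplified stand-in for git.remote.PushInfo: it has no 'name'."""
    def __init__(self, summary):
        self.summary = summary

result = IterableListSketch('name')
result.append(PushInfoSketch('deadbeef..cafebabe  master -> master'))

try:
    result.summary  # triggers __getattr__, which asks each item for .name
except AttributeError as exc:
    print(exc)  # 'PushInfoSketch' object has no attribute 'name'

# With a plain list (the behaviour after the patch), callers index instead:
print(result[0].summary)
```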
| 5,321 |
|||
google/jax | google__jax-1096 | 24d4aaf3e20d58dd66a3df8ffe0d0ea9e976c307 | diff --git a/build/build.py b/build/build.py
--- a/build/build.py
+++ b/build/build.py
@@ -187,6 +187,9 @@ def check_bazel_version(bazel_path, min_version, max_version):
build:cuda --crosstool_top=@local_config_cuda//crosstool:toolchain
build:cuda --define=using_cuda=true --define=using_cuda_nvcc=true
+
+build --spawn_strategy=standalone
+build --strategy=Genrule=standalone
"""
| jaxlib build w/ cuda: File not found during compilation
I'm compiling `jaxlib` with CUDA 10.0 on Ubuntu 18.04. The build fails with the following error:
```
$ python3 build/build.py --enable_cuda --cuda_path /usr/local/cuda-10.0/ --cudnn_path /usr/local/cuda-10.0/ --enable_march_native
[...]
ERROR: /home/clem/.cache/bazel/_bazel_clem/ffaac3f7c6ad1cb26f04f1933452eef6/external/nccl_archive/BUILD.bazel:53:1: error while parsing .d file: /h
ome/clem/.cache/bazel/_bazel_clem/ffaac3f7c6ad1cb26f04f1933452eef6/execroot/__main__/bazel-out/k8-opt/bin/external/nccl_archive/_objs/device_lib/pr
od_i32_reduce_scatter.cu.d (No such file or directory)
nvcc fatal : Could not open input file /tmp/tmpxft_00000004_00000000-6_prod_i32_reduce_scatter.cu.compute_35.cpp1.ii
Target //build:install_xla_in_source_tree failed to build
INFO: Elapsed time: 278.116s, Critical Path: 69.60s
INFO: 1281 processes: 1281 linux-sandbox.
FAILED: Build did NOT complete successfully
FAILED: Build did NOT complete successfully
Traceback (most recent call last):
File "build/build.py", line 331, in <module>
main()
File "build/build.py", line 326, in main
[":install_xla_in_source_tree", os.getcwd()])
File "build/build.py", line 50, in shell
output = subprocess.check_output(cmd)
File "/usr/lib/python3.6/subprocess.py", line 356, in check_output
**kwargs).stdout
File "/usr/lib/python3.6/subprocess.py", line 438, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['./bazel-0.24.1-linux-x86_64', 'run', '--verbose_failures=true', '--config=opt', '--config=mkl_open_source
_only', '--config=cuda', ':install_xla_in_source_tree', '/home/clem/git/jax/build']' returned non-zero exit status 1.
```
Above this error message there are only compiler warnings, no errors that would explain a file not being created. Am I missing something? Or might there be a file-name bug? Thanks a lot for your help!
---
I'm on a fresh Ubuntu 18.04.2 install with CUDA 10.0, cudnn and driver version 410.48.
[Full log](http://paste.ubuntu.com/p/tvXBHbr5gw/)
| I saw this too. It seems to be nondeterministic and related to nvcc, but I didn't have time to track down the problem. Try running the build again, and it should make more progress.
Thanks for the advice. I had to restart the compilation ~10 times and finally it finished.
However, after installing `jaxlib` and `jax` the xla backend does not find my GPU and falls back to CPU. Could this be related?
No, I think the two are unrelated. One is a build problem, the other is a run time problem.
Are you sure it's using the right `jaxlib` (i.e., the one you just built?) You can install it locally with `pip install -e jax/build`.
(You might also try a prebuilt `jaxlib` wheel; there are links to CUDA 10 wheels on the JAX github README.md).
Thank you again, I'll look into it.
I have tried the pre-built `jaxlib` wheels without success¹. The readme states the GPU support of those is experimental, that's why I tried building myself.
I've had this bug for a few months too, but never got around to reporting it. If I resume compilation, it usually progresses until another failure on a file in that directory. (But as reported above, it does eventually work.) My guess is that it's a race condition on a lot of requirements in `nccl`. My build machine has 20 threads (bazel uses all of them), and it happens most of the time.
I don't have any issues detecting my GPU with CUDA 10.0 and cuDNN 7.5 on Ubuntu 18.04.
Thank you @kroq-gar78 for the information. I posted a separate issue on that as it apparently does not correlate to the compilation error.
Edit: It indeed does not, see #993. | 2019-08-02T01:45:09Z | [] | [] |
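For reference, the fix that eventually landed (the `build.py` patch at the top of this record) appends two lines to the generated Bazel config, disabling sandboxing so nvcc's intermediate files in `/tmp` are not lost between actions. The equivalent lines in a hand-written `.bazelrc` would be:

```
build --spawn_strategy=standalone
build --strategy=Genrule=standalone
```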
|||
google/jax | google__jax-1718 | 6cf2e4b8bf7de1f561e1fac166ca70d15816df6b | diff --git a/jax/core.py b/jax/core.py
--- a/jax/core.py
+++ b/jax/core.py
@@ -300,6 +300,7 @@ def aval(self):
assert False
def __neg__(self): return self.aval._neg(self)
+ def __pos__(self): return self.aval._pos(self)
def __eq__(self, other): return self.aval._eq(self, other)
def __ne__(self, other): return self.aval._ne(self, other)
def __lt__(self, other): return self.aval._lt(self, other)
diff --git a/jax/numpy/lax_numpy.py b/jax/numpy/lax_numpy.py
--- a/jax/numpy/lax_numpy.py
+++ b/jax/numpy/lax_numpy.py
@@ -3194,6 +3194,7 @@ def _unimplemented_setitem(self, i, x):
"getitem": _rewriting_take,
"setitem": _unimplemented_setitem,
"neg": negative,
+ "pos": positive,
"eq": equal,
"ne": not_equal,
"lt": less,
| unary + does not work in jax
```
>>> import jax
>>> +jax.numpy.asarray([1,2,3])
/Users/lukasheinrich/Code/excursiondev/excvenv/lib/python3.6/site-packages/jax/lib/xla_bridge.py:119: UserWarning: No GPU/TPU found, falling back to CPU.
warnings.warn('No GPU/TPU found, falling back to CPU.')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: bad operand type for unary +: 'DeviceArray'
```
while in numpy it works
```
>>> import numpy
>>> +numpy.asarray([1,2,3])
array([1, 2, 3])
```
a similar issue appears for the `-` operator
| I see the issue for unary `+`; that's just a case we forgot to implement (the `+` operator isn't that useful!) Do you have an example of unary `-` failing? It works for me.
sorry, I misremembered: unary minus works for me. | 2019-11-19T02:17:58Z | [] | [] |
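For background (plain Python, independent of JAX internals): unary `+` dispatches to `__pos__`, which the tracer class simply did not define, while unary `-` already had `__neg__` — the patch above adds the missing method. A toy class showing the dispatch:

```python
class Tracer:
    """Toy wrapper mimicking the operator hooks; not JAX's real Tracer."""
    def __init__(self, val):
        self.val = val

    def __neg__(self):  # handles -x (already present before the patch)
        return Tracer(-self.val)

    def __pos__(self):  # handles +x (the method the patch adds)
        return Tracer(+self.val)

t = Tracer(3)
print((-t).val)  # -3
print((+t).val)  # 3
```

Without `__pos__`, `+t` raises `TypeError: bad operand type for unary +`, matching the report.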
| 5,399 |
|||
google/jax | google__jax-1931 | 82dbf9131105a0b3a22c191930bc42b15d420794 | diff --git a/jax/util.py b/jax/util.py
--- a/jax/util.py
+++ b/jax/util.py
@@ -214,6 +214,10 @@ def get_module_functions(module):
"""
module_fns = set()
for key in dir(module):
+ # Omitting module level __getattr__, __dir__ which was added in Python 3.7
+ # https://www.python.org/dev/peps/pep-0562/
+ if key in ('__getattr__', '__dir__'):
+ continue
attr = getattr(module, key)
if isinstance(
attr, (types.BuiltinFunctionType, types.FunctionType, onp.ufunc)):
| not compatible with numpy 1.18 and python 3.7
With numpy 1.18 and Python 3.7, `import jax` raises the following error:
```python
>>> import jax
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/fehiepsi/miniconda3/envs/py37/lib/python3.7/site-packages/jax/__init__.py", line 20, in <module>
from jax import nn
File "/home/fehiepsi/miniconda3/envs/py37/lib/python3.7/site-packages/jax/nn/__init__.py", line 17, in <module>
from . import initializers
File "/home/fehiepsi/miniconda3/envs/py37/lib/python3.7/site-packages/jax/nn/initializers.py", line 28, in <module>
from jax import random
File "/home/fehiepsi/miniconda3/envs/py37/lib/python3.7/site-packages/jax/random.py", line 33, in <module>
from . import numpy as np
File "/home/fehiepsi/miniconda3/envs/py37/lib/python3.7/site-packages/jax/numpy/__init__.py", line 16, in <module>
from .lax_numpy import *
File "<frozen importlib._bootstrap>", line 1019, in _handle_fromlist
File "/home/fehiepsi/miniconda3/envs/py37/lib/python3.7/site-packages/jax/numpy/lax_numpy.py", line 3211, in wrapped
raise NotImplementedError(msg.format(fun))
NotImplementedError: Numpy function <function __getattr__ at 0x7f029e9dd560> not yet implemented
```
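A standalone sketch of the failure mode, mirroring the `get_module_functions` logic patched above: under PEP 562 (Python 3.7+), a module-level `__getattr__` shows up in `dir(module)` as an ordinary function — the traceback shows numpy 1.18 exposes one — so code that enumerates "all module functions", as `jax.numpy` does for numpy's namespace, sweeps it up unless it is skipped explicitly. The `fake_numpy` module here is invented for illustration:

```python
import types

# A fake module carrying a PEP 562-style module-level __getattr__.
mod = types.ModuleType('fake_numpy')
mod.__getattr__ = lambda name: name
mod.sum = lambda x: x  # an ordinary public function

def get_module_functions(module):
    fns = set()
    for key in dir(module):
        if key in ('__getattr__', '__dir__'):  # the fix: skip PEP 562 hooks
            continue
        attr = getattr(module, key)
        if isinstance(attr, types.FunctionType):
            fns.add(attr)
    return fns

fns = get_module_functions(mod)
print(mod.sum in fns)          # True
print(mod.__getattr__ in fns)  # False
```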
| 2019-12-31T20:21:46Z | [] | [] |
| 5,419 |
||||
google/jax | google__jax-2183 | 86984b37dd34f40dc71f399318151257251555ec | diff --git a/jax/interpreters/pxla.py b/jax/interpreters/pxla.py
--- a/jax/interpreters/pxla.py
+++ b/jax/interpreters/pxla.py
@@ -494,7 +494,7 @@ def dynamic_fun(dummy, *args):
jaxpr_replicas = xla.jaxpr_replicas(jaxpr)
num_local_replicas = axis_size * jaxpr_replicas
num_global_replicas = global_axis_size * jaxpr_replicas
- axis_env = xla.AxisEnv(num_global_replicas, [axis_name], [global_axis_size], devices)
+ axis_env = xla.AxisEnv(num_global_replicas, (axis_name,), (global_axis_size,), devices)
tuple_args = len(avals) > 100 # pass long arg lists as tuple for TPU
diff --git a/jax/interpreters/xla.py b/jax/interpreters/xla.py
--- a/jax/interpreters/xla.py
+++ b/jax/interpreters/xla.py
@@ -170,14 +170,23 @@ def xla_primitive_callable(prim, *arg_specs, **params):
handlers = tuple(map(partial(aval_to_result_handler, device), aval_out))
handle_result = lambda xs: tuple(h(x) for h, x in zip(handlers, xs.destructure()))
tuple_args = len(avals) > 100
- built_c = primitive_computation(prim, backend, tuple_args, *avals, **params)
+ if prim in initial_style_translations:
+ nreps = initial_style_primitive_replicas(params)
+ else:
+ nreps = 1
+ built_c = primitive_computation(prim, AxisEnv(nreps), backend, tuple_args,
+ *avals, **params)
options = xb.get_compile_options(
num_replicas=1,
num_partitions=1,
device_assignment=device and (device.id,))
compiled = built_c.Compile(compile_options=options, backend=backend)
- return partial(_execute_compiled_primitive, prim, compiled, backend,
- tuple_args, handle_result)
+ if nreps == 1:
+ return partial(_execute_compiled_primitive, prim, compiled, backend,
+ tuple_args, handle_result)
+ else:
+ return partial(_execute_replicated_primitive, prim, compiled, backend,
+ tuple_args, handle_result)
def _device_from_arg_devices(devices):
"""Given devices of inputs, determine where to perform a computation.
@@ -198,7 +207,7 @@ def _device_from_arg_devices(devices):
raise ValueError(msg.format(", ".join(names)))
@cache()
-def primitive_computation(prim, backend, tuple_args, *avals, **params):
+def primitive_computation(prim, axis_env, backend, tuple_args, *avals, **params):
c = xb.make_computation_builder("primitive_computation_{}".format(prim.name))
c.SetOpMetadata(xc.OpMetadata(
op_type=prim.name,
@@ -208,13 +217,13 @@ def primitive_computation(prim, backend, tuple_args, *avals, **params):
# return val always set as a side-effect on c
if prim in backend_specific_translations[platform]:
rule = backend_specific_translations[platform][prim]
- rule(c, *xla_args, **params)
+ rule(c, *xla_args, **params)
elif prim in translations:
rule = translations[prim]
rule(c, *xla_args, **params)
elif prim in initial_style_translations:
rule = initial_style_translations[prim]
- rule(c, AxisEnv(), extend_name_stack(prim.name), *xla_args, backend=backend, **params)
+ rule(c, axis_env, extend_name_stack(prim.name), *xla_args, backend=backend, **params)
else:
raise NotImplementedError("XLA translation rule for {} not found".format(prim))
c.ClearOpMetadata()
@@ -227,7 +236,7 @@ def primitive_computation(prim, backend, tuple_args, *avals, **params):
raise RuntimeError(msg)
def primitive_subcomputation(prim, *avals, **params):
- return primitive_computation(prim, None, False, *avals, **params)
+ return primitive_computation(prim, AxisEnv(1), None, False, *avals, **params)
def _execute_compiled_primitive(prim, compiled, backend, tuple_args,
result_handler, *args):
@@ -240,6 +249,17 @@ def _execute_compiled_primitive(prim, compiled, backend, tuple_args,
check_nans(prim, out_buf.destructure() if prim.multiple_results else out_buf)
return result_handler(out_buf)
+def _execute_replicated_primitive(prim, compiled, backend, tuple_args,
+ result_handler, *args):
+ input_bufs = [
+ [device_put(x, device) for x in args if x is not token]
+ for device in compiled.local_devices()]
+ if tuple_args:
+ input_bufs = [[make_tuple(bufs, device, backend)] for bufs, device in
+ zip(input_bufs, compiled.local_devices())]
+ out_buf = compiled.ExecutePerReplica(input_bufs)[0]
+ return result_handler(out_buf)
+
def check_nans(prim, bufs):
if prim.multiple_results:
for buf in bufs:
@@ -350,14 +370,16 @@ def check_backend_params(params, outer_backend):
class AxisEnv(object):
- def __init__(self, nreps=1, names=None, sizes=None, devices=None):
+ def __init__(self, nreps, names=(), sizes=(), devices=None):
+ assert isinstance(names, tuple)
+ assert isinstance(sizes, tuple)
self.nreps = nreps
- self.names = names if names else []
- self.sizes = sizes if sizes else []
+ self.names = names
+ self.sizes = sizes
self.devices = devices
def extend_axis_env(env, name, size):
- return AxisEnv(env.nreps, env.names + [name], env.sizes + [size], env.devices)
+ return AxisEnv(env.nreps, env.names + (name,), env.sizes + (size,), env.devices)
def axis_read(axis_env, axis_name):
return max(i for i, name in enumerate(axis_env.names) if name == axis_name)
@@ -382,17 +404,22 @@ def _axis_groups(nrep, mesh_spec, mesh_axes):
def jaxpr_replicas(jaxpr):
return max(it.chain([1], (eqn_replicas(eqn) for eqn in jaxpr.eqns)))
+# TODO(mattjj): this function assumes that only pmap has a parameter named
+# axis_size, and that it corresponds to cross-replica mapping
def eqn_replicas(eqn):
if eqn.bound_subjaxpr:
return eqn.params.get('axis_size', 1) * jaxpr_replicas(eqn.bound_subjaxpr)
elif eqn.primitive in initial_style_translations:
- nums = (jaxpr_replicas(param if type(param) is core.Jaxpr else param.jaxpr)
- for param in eqn.params.values()
- if type(param) in (core.Jaxpr, core.TypedJaxpr))
- return max(it.chain([1], nums))
+ return initial_style_primitive_replicas(eqn.params)
else:
return 1
+def initial_style_primitive_replicas(params):
+ nums = (jaxpr_replicas(param if type(param) is core.Jaxpr else param.jaxpr)
+ for param in params.values()
+ if type(param) in (core.Jaxpr, core.TypedJaxpr))
+ return max(it.chain([1], nums))
+
# TODO(mattjj,skyewm): the functions here are utilities for checking if
# not-yet-supported features are used with multi-host programming
@@ -485,7 +512,7 @@ def _xla_callable(fun, device, backend, name, *arg_specs):
xla_consts = _map(c.Constant, consts)
xla_args = _xla_callable_args(c, abstract_args, tuple_args)
out_nodes = jaxpr_subcomp(
- c, jaxpr, backend, AxisEnv(nreps, [], []), xla_consts,
+ c, jaxpr, backend, AxisEnv(nreps, (), ()), xla_consts,
extend_name_stack(wrap_name(name, 'jit')), *xla_args)
built = c.Build(c.Tuple(*out_nodes))
@@ -635,10 +662,11 @@ def lower_fun(fun, instantiate=False, initial_style=False):
"""Build a translation rule for a traceable function."""
def f(c, *args, **params):
backend = params.pop('backend', None)
+ # TODO(mattjj): revise this 'calling convention'
if initial_style:
axis_env, name_stack, xla_args = args[0], args[1], args[2:]
else:
- axis_env, name_stack, xla_args = AxisEnv(), '', args
+ axis_env, name_stack, xla_args = AxisEnv(1), '', args
xla_shapes = tuple(map(c.GetShape, xla_args))
avals = map(_aval_from_xla_shape, xla_shapes)
pvals = [pe.PartialVal((a, core.unit)) for a in avals]
diff --git a/jax/lax/lax.py b/jax/lax/lax.py
--- a/jax/lax/lax.py
+++ b/jax/lax/lax.py
@@ -3425,7 +3425,7 @@ def _reduce_batch_rule(batched_args, batch_dims, computation, jaxpr, consts, dim
def _reduction_computation(c, jaxpr, consts, init_value):
shape = c.GetShape(init_value)
- axis_env = xla.AxisEnv() # no parallel primitives inside reductions
+ axis_env = xla.AxisEnv(1) # no parallel primitives inside reductions
subc = xla_bridge.make_computation_builder("reduction_computation")
consts = [subc.ParameterWithShape(const) for const in consts]
args = [subc.ParameterWithShape(shape), subc.ParameterWithShape(shape)]
diff --git a/jax/lax/lax_control_flow.py b/jax/lax/lax_control_flow.py
--- a/jax/lax/lax_control_flow.py
+++ b/jax/lax/lax_control_flow.py
@@ -803,10 +803,7 @@ def scan(f, init, xs, length=None):
linear=(False,) * (len(consts) + len(in_flat)))
return tree_unflatten(out_tree, out)
-def _scan_impl(*args, **kwargs):
- forward, length, num_consts, num_carry, jaxpr, linear = split_dict(
- kwargs, ["forward", "length", "num_consts", "num_carry", "jaxpr", "linear"])
-
+def _scan_impl(*args, forward, length, num_consts, num_carry, jaxpr, linear):
consts, init, xs = split_list(args, [num_consts, num_carry])
_, _, x_avals = split_list(jaxpr.in_avals, [num_consts, num_carry])
_, y_avals = split_list(jaxpr.out_avals, [num_carry])
@@ -841,6 +838,13 @@ def _update_array(i, aval, xs, x):
else:
return lax.dynamic_update_index_in_dim(xs, x, i, 0)
+# TODO(mattjj): make scan a primitive
+# def _scan_abstract_eval(*args, forward, length, num_consts, num_carry, jaxpr, linear):
+# carry_avals, y_avals = split_list(jaxpr.out_avals, [num_carry])
+# ys_avals = [ShapedArray((length,) + aval.shape, aval.dtype)
+# if aval is not core.abstract_unit else aval for aval in y_avals]
+# return carry_avals + y_avals
+
def _scan_jvp(primals, tangents, forward, length, jaxpr, num_consts, num_carry,
linear):
num_xs = len(jaxpr.in_avals) - num_carry - num_consts
@@ -879,9 +883,9 @@ def _scan_jvp(primals, tangents, forward, length, jaxpr, num_consts, num_carry,
[num_carry, num_ys], [len(init_dot), sum(nonzeros_out) - len(init_dot)])
consts_linear, init_linear, xs_linear = split_list(linear, [num_consts, num_carry])
- jaxpr_jvp_linear = (consts_linear + [True] * len(consts_dot)
- + init_linear + [True] * len(init_dot)
- + xs_linear + [True] * len(xs_dot))
+ jaxpr_jvp_linear = tuple(consts_linear + [True] * len(consts_dot)
+ + init_linear + [True] * len(init_dot)
+ + xs_linear + [True] * len(xs_dot))
out_flat = scan_p.bind(
*(consts + consts_dot + init + init_dot + xs + xs_dot),
@@ -899,9 +903,8 @@ def _scan_jvp(primals, tangents, forward, length, jaxpr, num_consts, num_carry,
def _prune_zeros(ts):
return [t for t in ts if t is not ad_util.zero]
-def _scan_partial_eval(trace, *tracers, **kwargs):
- forward, length, num_consts, num_carry, jaxpr, linear = split_dict(
- kwargs, ["forward", "length", "num_consts", "num_carry", "jaxpr", "linear"])
+def _scan_partial_eval(trace, *tracers, forward, length, num_consts, num_carry,
+ jaxpr, linear):
num_xs = len(jaxpr.in_avals) - num_carry - num_consts
num_ys = len(jaxpr.out_avals) - num_carry
@@ -955,10 +958,11 @@ def _scan_partial_eval(trace, *tracers, **kwargs):
[core.unit if uk else t.pval[1]
for uk, t in zip(unknowns[num_consts:], tracers[num_consts:])])
linear_1 = ([False] * len(consts_1) + [True] * num_consts +
- [lin or uk for uk, lin in zip(unknowns[num_consts:], linear[num_consts:])])
+ [lin or uk for uk, lin
+ in zip(unknowns[num_consts:], linear[num_consts:])])
out_flat = scan_p.bind(
*in_consts, forward=forward, length=length, jaxpr=jaxpr_1_opt,
- num_consts=num_consts_1, num_carry=num_carry, linear=linear_1)
+ num_consts=num_consts_1, num_carry=num_carry, linear=tuple(linear_1))
out_carry, ys, res_and_units = split_list(out_flat, [num_carry, num_ys])
extensive_residuals = [r for r, (pv, _) in zip(res_and_units, res_pvals) if pv is not None]
@@ -981,7 +985,7 @@ def _scan_partial_eval(trace, *tracers, **kwargs):
out_tracers, scan_p,
dict(forward=forward, length=length, jaxpr=jaxpr_2_opt,
num_consts=num_consts_2,
- num_carry=num_carry, linear=linear_2))
+ num_carry=num_carry, linear=tuple(linear_2)))
for t in out_tracers: t.recipe = eqn
return out_tracers
@@ -991,10 +995,7 @@ def _promote_aval_rank(sz, aval):
else:
return ShapedArray((sz,) + aval.shape, aval.dtype)
-def _scan_transpose(cts, *args, **kwargs):
- forward, length, num_consts, num_carry, jaxpr, linear = split_dict(
- kwargs, ["forward", "length", "num_consts", "num_carry", "jaxpr", "linear"])
-
+def _scan_transpose(cts, *args, forward, length, num_consts, num_carry, jaxpr, linear):
# we've only implemented transposing scans with specific lin/nonlin patterns
consts_lin, init_lin, xs_lin = split_list(linear, [num_consts, num_carry])
num_ires = len(consts_lin) - sum(consts_lin)
@@ -1030,7 +1031,7 @@ def _scan_transpose(cts, *args, **kwargs):
outs = scan_p.bind(
*(ires + ct_consts + ct_carry + ct_ys + eres), forward=not forward,
length=length, jaxpr=jaxpr_trans, num_consts=num_ires,
- num_carry=num_consts-num_ires+num_carry, linear=linear_trans)
+ num_carry=num_consts-num_ires+num_carry, linear=tuple(linear_trans))
ct_consts, ct_init, ct_xs = split_list(outs, [num_consts - num_ires, num_carry])
return [None] * num_ires + ct_consts + ct_init + ct_xs + [None] * num_eres
| pmap inside scan triggers ragged assertion.
```python
import os
os.environ['XLA_FLAGS'] = '--xla_force_host_platform_device_count=8'
import jax
from jax import lax
import jax.numpy as np
@jax.soft_pmap
def fn1(x):
return x
def fn2(c, _):
return fn1(c), ()
print('N devices', jax.local_device_count())
print(jax.__version__)
lax.scan(fn2, np.zeros([2]), xs=np.zeros([5]))
```
Yields:
```
N devices 8
0.1.57
Traceback (most recent call last):
File "test.py", line 18, in <module>
lax.scan(fn2, np.zeros([2]), xs=np.zeros([5]))
File "/home/siege/.local/lib/python3.7/site-packages/jax/lax/lax_control_flow.py", line 553, in scan
linear=(False,) * (len(consts) + len(in_flat)))
File "/home/siege/.local/lib/python3.7/site-packages/jax/lax/lax_control_flow.py", line 919, in scan_bind
num_carry=num_carry, linear=linear)
File "/home/siege/.local/lib/python3.7/site-packages/jax/core.py", line 156, in bind
return self.impl(*args, **kwargs)
File "/home/siege/.local/lib/python3.7/site-packages/jax/lax/lax_control_flow.py", line 574, in _scan_impl
return fori_loop(lax._const(length, 0), length, body_fun, init + ys_init)
File "/home/siege/.local/lib/python3.7/site-packages/jax/lax/lax_control_flow.py", line 143, in fori_loop
(lower, upper, init_val))
File "/home/siege/.local/lib/python3.7/site-packages/jax/lax/lax_control_flow.py", line 210, in while_loop
body_nconsts=len(body_consts), body_jaxpr=body_jaxpr)
File "/home/siege/.local/lib/python3.7/site-packages/jax/core.py", line 156, in bind
return self.impl(*args, **kwargs)
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/xla.py", line 160, in apply_primitive
compiled_fun = xla_primitive_callable(prim, *map(arg_spec, args), **params)
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/xla.py", line 175, in xla_primitive_callable
built_c = primitive_computation(prim, backend, tuple_args, *avals, **params)
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/xla.py", line 217, in primitive_computation
rule(c, AxisEnv(), *xla_args, backend=backend, **params) # side-effect on c
File "/home/siege/.local/lib/python3.7/site-packages/jax/lax/lax_control_flow.py", line 248, in _while_loop_translation_rule
_map(body_c.Constant, body_jaxpr.literals), (), *(y + z))
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/xla.py", line 326, in jaxpr_subcomp
backend=backend, **new_params)
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/pxla.py", line 639, in _pmap_translation_rule
outs = [_xla_unshard(c, new_env, shard) for shard in sharded_outs]
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/pxla.py", line 639, in <listcomp>
outs = [_xla_unshard(c, new_env, shard) for shard in sharded_outs]
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/pxla.py", line 670, in _xla_unshard
return c.CrossReplicaSum(padded, xla.axis_groups(axis_env, axis_env.names[-1]))
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/xla.py", line 371, in axis_groups
return _axis_groups(axis_env.nreps, axis_env.sizes, mesh_axes)
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/xla.py", line 375, in _axis_groups
assert not ragged
```
Works fine with `vmap`.
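The assertion itself is a divisibility check: `_axis_groups` splits `nrep` replicas across the mapped axis sizes, and the primitive-compilation path reached in the traceback built a default `AxisEnv()` with `nreps=1`, which cannot be split into one axis of size 8. A standalone sketch of that check (plain Python, mirroring the shape of `jax.interpreters.xla._axis_groups`, not the real function):

```python
import math

def check_axis_groups(nrep, axis_sizes):
    # Mirrors the divmod/assert at the top of xla._axis_groups.
    trailing_size, ragged = divmod(nrep, math.prod(axis_sizes))
    assert not ragged, f"{nrep} replicas vs axis sizes {axis_sizes}"
    return trailing_size

# Default AxisEnv() meant nreps=1, but soft_pmap adds an axis of size 8:
try:
    check_axis_groups(1, (8,))
except AssertionError as exc:
    print('ragged:', exc)

# Threading the jaxpr's replica count through, as the patch does, divides evenly:
print(check_axis_groups(8, (8,)))  # 1
```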
| 2020-02-06T18:39:52Z | [] | [] |
Traceback (most recent call last):
File "test.py", line 18, in <module>
lax.scan(fn2, np.zeros([2]), xs=np.zeros([5]))
File "/home/siege/.local/lib/python3.7/site-packages/jax/lax/lax_control_flow.py", line 553, in scan
linear=(False,) * (len(consts) + len(in_flat)))
File "/home/siege/.local/lib/python3.7/site-packages/jax/lax/lax_control_flow.py", line 919, in scan_bind
num_carry=num_carry, linear=linear)
File "/home/siege/.local/lib/python3.7/site-packages/jax/core.py", line 156, in bind
return self.impl(*args, **kwargs)
File "/home/siege/.local/lib/python3.7/site-packages/jax/lax/lax_control_flow.py", line 574, in _scan_impl
return fori_loop(lax._const(length, 0), length, body_fun, init + ys_init)
File "/home/siege/.local/lib/python3.7/site-packages/jax/lax/lax_control_flow.py", line 143, in fori_loop
(lower, upper, init_val))
File "/home/siege/.local/lib/python3.7/site-packages/jax/lax/lax_control_flow.py", line 210, in while_loop
body_nconsts=len(body_consts), body_jaxpr=body_jaxpr)
File "/home/siege/.local/lib/python3.7/site-packages/jax/core.py", line 156, in bind
return self.impl(*args, **kwargs)
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/xla.py", line 160, in apply_primitive
compiled_fun = xla_primitive_callable(prim, *map(arg_spec, args), **params)
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/xla.py", line 175, in xla_primitive_callable
built_c = primitive_computation(prim, backend, tuple_args, *avals, **params)
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/xla.py", line 217, in primitive_computation
rule(c, AxisEnv(), *xla_args, backend=backend, **params) # side-effect on c
File "/home/siege/.local/lib/python3.7/site-packages/jax/lax/lax_control_flow.py", line 248, in _while_loop_translation_rule
_map(body_c.Constant, body_jaxpr.literals), (), *(y + z))
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/xla.py", line 326, in jaxpr_subcomp
backend=backend, **new_params)
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/pxla.py", line 639, in _pmap_translation_rule
outs = [_xla_unshard(c, new_env, shard) for shard in sharded_outs]
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/pxla.py", line 639, in <listcomp>
outs = [_xla_unshard(c, new_env, shard) for shard in sharded_outs]
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/pxla.py", line 670, in _xla_unshard
return c.CrossReplicaSum(padded, xla.axis_groups(axis_env, axis_env.names[-1]))
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/xla.py", line 371, in axis_groups
return _axis_groups(axis_env.nreps, axis_env.sizes, mesh_axes)
File "/home/siege/.local/lib/python3.7/site-packages/jax/interpreters/xla.py", line 375, in _axis_groups
assert not ragged
```
Works fine with `vmap`.
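For context, the `assert not ragged` that fires at the bottom of this trace comes from partitioning replica ids into cross-replica-sum groups. Below is a simplified pure-Python sketch of that partitioning — names and layout here are illustrative assumptions, not JAX's actual `_axis_groups` implementation:

```python
import math

def axis_groups(nrep, mesh_spec, mesh_axes):
    # View each replica id in [0, nrep) as mixed-radix coordinates over
    # the mesh shape (plus a trailing dim for leftover replicas). Two
    # replicas share a group iff their coordinates agree on every axis
    # *not* being reduced over.
    trailing, ragged = divmod(nrep, math.prod(mesh_spec))
    assert not ragged  # the assertion that fires in the traceback above
    spec = list(mesh_spec) + [trailing]

    def coords(rid):
        out = []
        for size in reversed(spec):
            rid, digit = divmod(rid, size)
            out.append(digit)
        return tuple(reversed(out))

    groups = {}
    for rid in range(nrep):
        key = tuple(d for ax, d in enumerate(coords(rid))
                    if ax not in mesh_axes)
        groups.setdefault(key, []).append(rid)
    return [tuple(g) for g in groups.values()]
```

With 4 replicas and a size-2 mapped axis this yields `[(0, 2), (1, 3)]`; with 3 replicas the split is ragged and the assertion trips, which is the kind of mismatch the nested `pmap`-under-`scan` trace above runs into.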
| 5,441 |
||||
google/jax | google__jax-2800 | 56f6294e377cb4e03c8e1a6fe82dade0c965d617 | diff --git a/jax/api.py b/jax/api.py
--- a/jax/api.py
+++ b/jax/api.py
@@ -53,6 +53,7 @@
from .lib.xla_bridge import (device_count, local_device_count, devices, local_devices,
host_id, host_ids, host_count)
from .abstract_arrays import ConcreteArray, ShapedArray, raise_to_shaped
+from .interpreters.masking import eval_polymorphic_shape, Poly, Mon
from .interpreters import partial_eval as pe
from .interpreters import xla
from .interpreters import pxla
@@ -60,7 +61,6 @@
from .interpreters import batching
from .interpreters import parallel
from .interpreters import masking
-from .interpreters.masking import ensure_poly
from .custom_derivatives import custom_jvp, custom_vjp
from .config import flags, config, bool_env
@@ -1162,24 +1162,23 @@ def wrapped_fun(args, logical_env):
out_shapes = map(masking.finalize_spec, out_specs, map(onp.shape, outs))
if not out_shapes == list(out_shapes_):
raise masking.ShapeError
- if not all(onp.shape(out) == masking.eval_shape_expr(padded_env, expr)
- for out, expr in zip(outs, out_shapes)):
+ if not all(onp.shape(out) == eval_polymorphic_shape(shape, padded_env)
+ for out, shape in zip(outs, out_shapes)):
raise masking.ShapeError
return tree_unflatten(out_tree(), outs)
return wrapped_fun
def _remap_ids(names, shape_spec):
- ShapeSpec, Poly, Mon = masking.ShapeSpec, masking.Poly, masking.Mon
- mdim = masking.monomorphic_dim
- return ShapeSpec(Poly({Mon({names[id] : deg for id, deg in mon.items()})
+ return masking.ShapeSpec(Poly({Mon({names[id] : deg for id, deg in mon.items()})
: coeff for mon, coeff in poly.items()})
- if poly is not mdim else mdim for poly in shape_spec)
+ if poly is not masking._monomorphic_dim else
+ masking._monomorphic_dim for poly in shape_spec)
def _bind_shapes(shape_exprs, shapes):
env = {}
for shape_expr, shape in zip(shape_exprs, shapes):
for poly, d in zip(shape_expr, shape):
- if ensure_poly(poly).is_constant:
+ if type(poly) is not Poly or poly.is_constant:
continue
else:
(binder,), = poly # TODO generalize to handle striding
@@ -1188,23 +1187,20 @@ def _bind_shapes(shape_exprs, shapes):
@curry
-def shapecheck(in_shapes, out_shape, fun):
+def shapecheck(in_shapes, out_shape, fun: Callable):
_check_callable(fun)
in_shapes, in_tree = tree_flatten(in_shapes)
in_shapes = map(masking.parse_spec, in_shapes)
out_shapes, out_tree = tree_flatten(out_shape)
out_shapes = map(masking.parse_spec, out_shapes)
flat_fun, out_tree_ = flatten_fun_nokwargs(lu.wrap_init(fun), in_tree)
- out_shapes_ = masking.shapecheck(flat_fun, in_shapes)
+ avals = map(partial(ShapedArray, dtype=onp.float32), in_shapes)
+ out_shapes_ = [o.shape for o in pe.abstract_eval_fun(flat_fun.call_wrapped, *avals)]
if out_tree != out_tree_(): raise TypeError("pytree mismatch")
- if not all(map(_shape_spec_consistent, out_shapes, out_shapes_)):
+ if not all(map(masking._shape_spec_consistent, out_shapes, out_shapes_)):
raise masking.ShapeError
return fun
-def _shape_spec_consistent(spec, expr):
- return all(a == b for a, b in zip(spec, expr) if a is not masking.monomorphic_dim)
-
-
def jvp(fun: Callable, primals, tangents) -> Tuple[Any, Any]:
"""Computes a (forward-mode) Jacobian-vector product of ``fun``.
diff --git a/jax/interpreters/masking.py b/jax/interpreters/masking.py
--- a/jax/interpreters/masking.py
+++ b/jax/interpreters/masking.py
@@ -12,12 +12,10 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-
from contextlib import contextmanager
from collections import Counter, namedtuple
-import functools
from functools import partial
-import itertools as it
+from itertools import chain, product
import operator as op
import string
from typing import Callable, Dict
@@ -27,19 +25,30 @@
from .. import abstract_arrays
from .. import core
from ..core import Trace, Tracer
-from ..util import unzip2, safe_map, safe_zip
+from ..util import safe_map, safe_zip, unzip2, prod
from ..abstract_arrays import ShapedArray
from .. import linear_util as lu
map = safe_map
zip = safe_zip
-def prod(xs):
- xs = list(xs)
- return functools.reduce(op.mul, xs) if xs else 1
+shape_parameterized_primitive_rules: Dict[core.Primitive, Callable] = {}
+masking_rules: Dict[core.Primitive, Callable] = {}
+
+def defvectorized(prim):
+ masking_rules[prim] = partial(vectorized_masking_rule, prim)
+
+def defnaryop(prim):
+ masking_rules[prim] = partial(naryop_masking_rule, prim)
+def vectorized_masking_rule(prim, padded_vals, logical_shapes, **params):
+ del logical_shapes # Unused.
+ padded_val, = padded_vals
+ return prim.bind(padded_val, **params)
-### main transformation functions
+def naryop_masking_rule(prim, padded_vals, logical_shapes):
+ del logical_shapes # Unused.
+ return prim.bind(*padded_vals)
ShapeEnvs = namedtuple("ShapeEnvs", ["logical", "padded"])
shape_envs = ShapeEnvs({}, {}) # TODO(mattjj): make this a stack for efficiency
@@ -47,29 +56,17 @@ def prod(xs):
@contextmanager
def extend_shape_envs(logical_env, padded_env):
global shape_envs
- new_logical = dict(it.chain(shape_envs.logical.items(), logical_env.items()))
- new_padded = dict(it.chain(shape_envs.padded.items(), padded_env.items()))
+ new_logical = dict(chain(shape_envs.logical.items(), logical_env.items()))
+ new_padded = dict(chain(shape_envs.padded.items(), padded_env.items()))
shape_envs, prev = ShapeEnvs(new_logical, new_padded), shape_envs
yield
shape_envs = prev
-def is_polymorphic(shape):
- return any(map(lambda d: isinstance(d, Poly), shape))
-
-def shape_as_value(expr):
- if type(expr) is tuple and is_polymorphic(expr):
- return tuple(eval_dim_expr(shape_envs.logical, d) if type(d) is Poly else d
- for d in expr)
- else:
- return expr
-
-def padded_shape_as_value(expr):
- if type(expr) is tuple and is_polymorphic(expr):
- return tuple(eval_dim_expr(shape_envs.padded, d) if type(d) is Poly else d
- for d in expr)
- else:
- return expr
+def shape_as_value(shape):
+ return eval_polymorphic_shape(shape, shape_envs.logical)
+def padded_shape_as_value(shape):
+ return eval_polymorphic_shape(shape, shape_envs.padded)
def mask_fun(fun, logical_env, padded_env, in_vals, shape_exprs):
with core.new_master(MaskTrace) as master:
@@ -89,31 +86,29 @@ def mask_subtrace(master, in_vals, shape_exprs):
out_vals, out_shapes = unzip2((t.val, t.shape_expr) for t in out_tracers)
yield out_vals, out_shapes
-def ensure_poly(p):
- if isinstance(p, Poly):
+def eval_polymorphic_shape(shape, values_dict):
+ return tuple(dim.evaluate(values_dict) if type(dim) is Poly else dim
+ for dim in shape)
+
+def _ensure_poly(p):
+ if type(p) is Poly:
return p
- return constant_poly(int(p))
+ return Poly({Mon(): p})
-class Poly(Counter):
- """Polynomial with integer coefficients,
- usable as element in a polymorphic shape.
+class Poly(dict):
+ """Polynomial with nonnegative integer coefficients for polymorphic shapes."""
- type Poly = Map Mon Int -- monomials to coeffs
- type Mon = Map Str Int
- """
def __init__(self, coeffs):
- # Makes sure Polynomials are always in canonical form to simplify operators:
- coeffs = {mon: coeff for mon, coeff in coeffs.items() if coeff != 0}
- coeffs = {Mon(): 0} if len(coeffs) == 0 else coeffs
+ # Makes sure Polynomials are always in canonical form
+ coeffs = {mon: op.index(coeff) for mon, coeff in coeffs.items() if coeff != 0}
+ coeffs = coeffs or {Mon(): 0}
super().__init__(coeffs)
def __add__(self, other):
coeffs = self.copy()
-
- for mon, coeff in ensure_poly(other).items():
+ for mon, coeff in _ensure_poly(other).items():
coeffs[mon] = coeffs.get(mon, 0) + coeff
-
return Poly(coeffs)
def __sub__(self, other):
@@ -123,23 +118,21 @@ def __neg__(self):
return Poly({mon: -coeff for mon, coeff in self.items()})
def __mul__(self, other):
- coeffs = dict()
- for (mon1, coeff1), (mon2, coeff2) \
- in it.product(self.items(), ensure_poly(other).items()):
- mon = Mon(mon1 + mon2) # add monomials' id degrees
- coeff = coeff1 * coeff2 # multiply integer coeffs
- coeffs[mon] = coeffs.get(mon, 0) + coeff # accumulate coeffs
-
+ other = _ensure_poly(other)
+ coeffs = {}
+ for (mon1, coeff1), (mon2, coeff2) in product(self.items(), other.items()):
+ mon = mon1 * mon2
+ coeffs[mon] = coeffs.get(mon, 0) + coeff1 * coeff2
return Poly(coeffs)
def __rmul__(self, other):
- return self * other
+ return self * other # multiplication commutes
def __radd__(self, other):
- return self + other
+ return self + other # addition commutes
def __rsub__(self, other):
- return self + -other
+ return _ensure_poly(other) - self
def __floordiv__(self, divisor):
q, _ = divmod(self, divisor) # pytype: disable=wrong-arg-types
@@ -151,66 +144,65 @@ def __mod__(self, divisor):
def __divmod__(self, divisor):
if self.is_constant:
- q, r = divmod(int(self), divisor)
-
- return constant_poly(q), r
-
- def divided(count):
- q, r = divmod(count, divisor)
- if r != 0:
- raise ValueError('shapecheck currently only supports strides '
- 'that exactly divide the strided axis length.')
- return q
+ return divmod(int(self), divisor)
+ else:
+ def divided(count):
+ q, r = divmod(count, divisor)
+ if r != 0:
+ raise ValueError('shapecheck currently only supports strides '
+ 'that exactly divide the strided axis length.')
+ return q
- return Poly(
- {k: coeff // divisor if k.degree == 0 else divided(coeff)
- for k, coeff in self.items()}), self[Mon()] % divisor
+ return Poly(
+ {k: coeff // divisor if k.degree == 0 else divided(coeff)
+ for k, coeff in self.items()}), self.get(Mon(), 0) % divisor
def __hash__(self):
- return hash(super())
+ return hash(tuple(sorted(self.items())))
def __eq__(self, other):
- return super().__eq__(ensure_poly(other))
+ return super().__eq__(_ensure_poly(other))
def __ne__(self, other):
return not self == other
def __ge__(self, other):
- other = ensure_poly(other)
+ other = _ensure_poly(other)
if other.is_constant and self.is_constant:
return int(self) >= int(other)
-
- if other.is_constant and int(other) <= 1:
- # Assume polynomials > 0, allowing to use shape rules of binops, conv:
- return True
-
- if self.is_constant and int(self) <= 0:
- return False # See above.
-
- if self == other:
+ elif other.is_constant and int(other) <= 1:
+ # Assume nonzero polynomials are positive, allows use in shape rules
+ return True
+ elif self.is_constant and int(self) <= 0:
+ return False # See above.
+ elif self == other:
return True
raise ValueError('Polynomials comparison "{} >= {}" is inconclusive.'
.format(self, other))
def __le__(self, other):
- return ensure_poly(other) >= self
+ return _ensure_poly(other) >= self
def __lt__(self, other):
return not (self >= other)
def __gt__(self, other):
- return not (ensure_poly(other) >= self)
+ return not (_ensure_poly(other) >= self)
def __str__(self):
- return ' + '.join('{} {}'.format(v, k) if (v != 1 or k.degree == 0) else str(k)
+ return ' + '.join('{} {}'.format(v, k)
+ if (v != 1 or k.degree == 0) else str(k)
for k, v in sorted(self.items())).strip()
def __int__(self):
assert self.is_constant
+ return op.index(next(iter(self.values())))
- return int(next(iter(self.values())))
+ def evaluate(self, values_dict):
+ return sum(coeff * prod([values_dict[id] ** deg for id, deg in mon.items()])
+ for mon, coeff in self.items())
@property
def is_constant(self):
@@ -219,7 +211,7 @@ def is_constant(self):
abstract_arrays._DIMENSION_TYPES.add(Poly)
-class Mon(Counter): # type Mon = Map Id Int -- ids to degrees
+class Mon(dict):
def __hash__(self):
return hash(tuple(self.items()))
@@ -233,34 +225,13 @@ def __lt__(self, other):
other_key = other.degree, tuple(sorted(other))
return self_key < other_key
+ def __mul__(self, other):
+ return Mon(Counter(self) + Counter(other))
+
@property
def degree(self):
return sum(self.values())
-def eval_shape_expr(env, expr):
- return tuple(eval_dim_expr(env, poly) for poly in expr)
-
-def eval_dim_expr(env, poly):
- terms = [mul(coeff, prod([pow(env[id], deg) for id, deg in mon.items()]))
- for mon, coeff in poly.items()]
- return sum(terms) if len(terms) > 1 else terms[0]
-
-def pow(x, deg):
- try:
- deg = int(deg)
- except:
- return x ** deg
- else:
- return 1 if deg == 0 else x if deg == 1 else x ** deg
-
-def mul(coeff, mon):
- try:
- coeff = int(coeff)
- except:
- return coeff * mon
- else:
- return 0 if coeff == 0 else mon if coeff == 1 else coeff * mon
-
class ShapeError(Exception): pass
class ShapeSyntaxError(Exception): pass
@@ -280,15 +251,15 @@ class ShapeSyntaxError(Exception): pass
# dims ::= dim ',' dims | ''
# dim ::= str | int | dim '*' dim | dim '+' dim | '_'
#
-# ShapeSpecs encode ShapeExprs but can have some monomorphic dims inside them,
-# which must be replaced with concrete shapes when known.
+# ShapeSpecs can have some monomorphic dims inside them, which must be replaced
+# with concrete shapes when known.
class ShapeSpec(tuple):
def __str__(self):
return 'ShapeSpec({})'.format(', '.join(map(str, self)))
def finalize_spec(spec, shape):
- return tuple(parse_lit(d) if e is monomorphic_dim else e
+ return tuple(_parse_lit(d) if e is _monomorphic_dim else e
for e, d in zip(spec, shape))
def parse_spec(spec=''):
@@ -297,35 +268,33 @@ def parse_spec(spec=''):
if spec[0] == '(':
if spec[-1] != ')': raise ShapeSyntaxError(spec)
spec = spec[1:-1]
- dims = map(parse_dim, spec.replace(' ', '').strip(',').split(','))
+ dims = map(_parse_dim, spec.replace(' ', '').strip(',').split(','))
return ShapeSpec(dims)
-def parse_dim(spec):
+def _parse_dim(spec):
if '+' in spec:
- terms = map(parse_dim, spec.split('+'))
- return functools.reduce(op.add, terms)
+ return onp.sum(map(_parse_dim, spec.split('+')))
elif '*' in spec:
- terms = map(parse_dim, spec.split('*'))
- return functools.reduce(op.mul, terms)
+ return prod(map(_parse_dim, spec.split('*')))
elif spec.isdigit() or spec.startswith('-') and spec[1:].isdigit():
- return parse_lit(spec)
- elif spec in identifiers:
- return parse_id(spec)
+ return _parse_lit(spec)
+ elif spec in _identifiers:
+ return _parse_id(spec)
elif spec == '_':
- return monomorphic_dim
+ return _monomorphic_dim
else:
raise ShapeSyntaxError(spec)
-digits = frozenset(string.digits)
-identifiers = frozenset(string.ascii_lowercase)
-def parse_id(name): return Poly({Mon({name: 1}): 1})
-def parse_lit(val_str): return constant_poly(int(val_str))
-def constant_poly(val): return Poly({Mon(): val})
+_identifiers = frozenset(string.ascii_lowercase)
+
+def _parse_id(name): return Poly({Mon({name: 1}): 1})
+
+def _parse_lit(val_str): return Poly({Mon(): int(val_str)})
class MonomorphicDim(object):
def __str__(self): return '_'
-monomorphic_dim = MonomorphicDim()
+_monomorphic_dim = MonomorphicDim()
# Two convenient ways to provide shape annotations:
# 1. '(m, n)'
@@ -333,14 +302,13 @@ def __str__(self): return '_'
class S_(object):
def __getitem__(self, idx):
- if type(idx) is tuple:
- return parse_spec('(' + ','.join(map(str, idx)) + ')')
- else:
- return parse_spec(str(idx))
-s_ = S_()
+ return parse_spec(('(' + ','.join(map(str, idx)) + ')')
+ if type(idx) is tuple else str(idx))
+s_ = S_()
-### automasking tracer machinery
+def _shape_spec_consistent(spec, expr):
+ return all(a == b for a, b in zip(spec, expr) if a is not _monomorphic_dim)
class MaskTracer(Tracer):
__slots__ = ["val", "shape_expr"]
@@ -355,7 +323,7 @@ def aval(self):
return ShapedArray(self.shape_expr, self.val.dtype)
def is_pure(self):
- return all(ensure_poly(poly).is_constant for poly in self.shape_expr)
+ return all(type(poly) is not Poly or poly.is_constant for poly in self.shape_expr)
def full_lower(self):
if self.is_pure():
@@ -363,6 +331,7 @@ def full_lower(self):
else:
return self
+
class MaskTrace(Trace):
def pure(self, val):
return MaskTracer(self, val, onp.shape(val))
@@ -379,8 +348,10 @@ def process_primitive(self, primitive, tracers, params):
rule = shape_parameterized_primitive_rules[primitive]
out, out_shape = rule(shape_envs, vals, shape_exprs, **params)
else:
- out_shape = shape_rules[primitive](*(t.aval for t in tracers), **params)
- logical_shapes = map(partial(eval_shape_expr, shape_envs.logical), shape_exprs)
+ avals = [t.aval for t in tracers]
+ out = primitive.abstract_eval(*avals, **params)
+ out_shape = [o.shape for o in out] if primitive.multiple_results else out.shape
+ logical_shapes = map(partial(eval_polymorphic_shape, values_dict=shape_envs.logical), shape_exprs)
out = masking_rules[primitive](vals, logical_shapes, **params)
if not primitive.multiple_results:
return MaskTracer(self, out, out_shape)
@@ -389,79 +360,3 @@ def process_primitive(self, primitive, tracers, params):
def process_call(self, call_primitive, f: lu.WrappedFun, tracers, params):
raise NotImplementedError # TODO mask-of-jit
-
-shape_parameterized_primitive_rules: Dict[core.Primitive, Callable] = {}
-masking_rules: Dict[core.Primitive, Callable] = {}
-shape_rules: Dict[core.Primitive, Callable] = {}
-
-def defvectorized(prim):
- masking_rules[prim] = partial(vectorized_masking_rule, prim)
-
-def vectorized_masking_rule(prim, padded_vals, logical_shapes, **params):
- del logical_shapes # Unused.
- padded_val, = padded_vals
- return prim.bind(padded_val, **params)
-
-
-def defnaryop(prim):
- masking_rules[prim] = partial(naryop_masking_rule, prim)
-
-def naryop_masking_rule(prim, padded_vals, logical_shapes):
- del logical_shapes # Unused.
- return prim.bind(*padded_vals)
-
-
-### definition-time (import-time) shape checker tracer machinery
-
-def shapecheck(fun: lu.WrappedFun, in_shapes):
- with core.new_master(ShapeCheckTrace) as master:
- out_shapes = check_subtrace(fun, master).call_wrapped(in_shapes)
- del master
- return out_shapes
-
-@lu.transformation
-def check_subtrace(master, in_shapes):
- trace = ShapeCheckTrace(master, core.cur_sublevel())
- in_tracers = map(partial(ShapeCheckTracer, trace), in_shapes)
- outs = yield in_tracers, {}
- out_tracers = map(trace.full_raise, outs)
- yield [t.shape_expr for t in out_tracers]
-
-
-# TODO(mattjj): add dtypes?
-class ShapeCheckTracer(Tracer):
- __slots__ = ["shape_expr"]
-
- def __init__(self, trace, shape_expr):
- self._trace = trace
- self.shape_expr = shape_expr
-
- @property
- def aval(self):
- return ShapedArray(self.shape_expr, None)
-
- def full_lower(self):
- return self
-
-class ShapeCheckTrace(Trace):
- def pure(self, val):
- return ShapeCheckTracer(self, onp.shape(val))
-
- def lift(self, val):
- return ShapeCheckTracer(self, onp.shape(val))
-
- def sublift(self, val):
- return ShapeCheckTracer(self, val.shape_expr)
-
- def process_primitive(self, primitive, tracers, params):
- avals = [t.aval for t in tracers]
- shape_rule = shape_rules.get(primitive)
- if shape_rule is None:
- raise NotImplementedError('Shape rule for {} not implemented yet.'.format(primitive))
- out_shape = shape_rule(*avals, **params)
- return ShapeCheckTracer(self, out_shape)
-
- def process_call(self, call_primitive, f: lu.WrappedFun, tracers, params):
- # TODO apply proper subtrace:
- return map(self.full_raise, f.call_wrapped(*tracers))
-
diff --git a/jax/interpreters/xla.py b/jax/interpreters/xla.py
--- a/jax/interpreters/xla.py
+++ b/jax/interpreters/xla.py
@@ -997,7 +997,6 @@ def _device_put_impl(x, device=None):
device_put_p.def_impl(_device_put_impl)
pe.custom_partial_eval_rules[device_put_p] = lambda trace, x, **params: x
ad.deflinear(device_put_p, lambda cotangent, **kwargs: [cotangent])
-masking.shape_rules[device_put_p] = lambda x, **_: x.shape
masking.defvectorized(device_put_p)
diff --git a/jax/lax/lax.py b/jax/lax/lax.py
--- a/jax/lax/lax.py
+++ b/jax/lax/lax.py
@@ -36,7 +36,7 @@
from .. import lazy
from .. import lib
from ..config import flags
-from ..core import _canonicalize_dimension, Primitive
+from ..core import Primitive
from ..abstract_arrays import (UnshapedArray, ShapedArray, ConcreteArray,
AbstractToken, array_types, make_shaped_array,
raise_to_shaped, abstract_token, canonicalize_shape)
@@ -79,7 +79,7 @@ def broadcast_shapes(*shapes):
if not onp.all((shapes == result_shape) | (shapes == 1)):
raise ValueError("Incompatible shapes for broadcasting: {}"
.format(tuple(map(tuple, shapes))))
- return tuple(map(_canonicalize_dimension, result_shape))
+ return canonicalize_shape(result_shape)
def _identity(x): return x
@@ -1234,10 +1234,11 @@ def iota(dtype: DType, size: int) -> Array:
<https://www.tensorflow.org/xla/operation_semantics#iota>`_
operator.
"""
- size = int(size)
+ size = size if type(size) is masking.Poly else int(size)
+ shape = canonicalize_shape((size,))
dtype = dtypes.canonicalize_dtype(dtype)
- lazy_expr = lazy.iota(dtype, size)
- aval = ShapedArray((size,), dtype)
+ lazy_expr = lazy.iota(dtype, shape[0])
+ aval = ShapedArray(shape, dtype)
return xla.DeviceArray(aval, None, lazy_expr, xla.DeviceConstant())
def broadcasted_iota(dtype: DType, shape: Shape, dimension: int) -> Array:
@@ -1696,7 +1697,6 @@ def standard_primitive(shape_rule, dtype_rule, name, translation_rule=None):
prim.def_impl(partial(xla.apply_primitive, prim))
prim.def_abstract_eval(partial(standard_abstract_eval, prim, shape_rule, dtype_rule))
xla.translations[prim] = translation_rule or partial(standard_translate, name)
- masking.shape_rules[prim] = shape_rule
return prim
@@ -4689,7 +4689,6 @@ def _tie_in_batch_rule(batched_args, batch_dims):
xla.translations[tie_in_p] = lambda c, x, y: y
ad.deflinear(tie_in_p, _tie_in_transpose_rule)
batching.primitive_batchers[tie_in_p] = _tie_in_batch_rule
-masking.shape_rules[tie_in_p] = lambda x, y: y.shape
masking.masking_rules[tie_in_p] = lambda vals, logical_shapes: vals[1]
@@ -4964,8 +4963,6 @@ def conv_transpose_shape_tuple(lhs_shape, rhs_shape, window_strides, padding,
def _check_shapelike(fun_name, arg_name, obj):
"""Check that `obj` is a shape-like value (e.g. tuple of nonnegative ints)."""
- if (type(obj) is tuple and masking.is_polymorphic(obj)):
- return obj
if not isinstance(obj, (tuple, list, onp.ndarray)):
msg = "{} {} must be of type tuple/list/ndarray, got {}."
raise TypeError(msg.format(fun_name, arg_name, type(obj)))
@@ -4976,7 +4973,9 @@ def _check_shapelike(fun_name, arg_name, obj):
if obj_arr.ndim != 1:
msg = "{} {} must be rank 1, got {}."
raise TypeError(msg.format(obj_arr.ndim))
- if not dtypes.issubdtype(obj_arr.dtype, onp.integer):
+ try:
+ canonicalize_shape(obj_arr)
+ except TypeError:
msg = "{} {} must have every element be an integer type, got {}."
raise TypeError(msg.format(fun_name, arg_name, tuple(map(type, obj))))
if not (obj_arr >= 0).all():
diff --git a/jax/lax/lax_control_flow.py b/jax/lax/lax_control_flow.py
--- a/jax/lax/lax_control_flow.py
+++ b/jax/lax/lax_control_flow.py
@@ -1285,7 +1285,7 @@ def _scan_masking_rule(shape_envs, padded_vals, shape_exprs, forward, length,
jaxpr, num_consts, num_carry, linear):
out_shape = _scan_shape_rule(shape_exprs, forward, length, jaxpr,
num_consts, num_carry, linear)
- dynamic_length = masking.eval_dim_expr(shape_envs.logical, length)
+ dynamic_length = length.evaluate(shape_envs.logical)
masked_jaxpr = _masked_scan_jaxpr(jaxpr, num_consts, num_carry)
consts, init, xs = split_list(padded_vals, [num_consts, num_carry])
max_length, = {x.shape[0] for x in xs}
diff --git a/jax/numpy/lax_numpy.py b/jax/numpy/lax_numpy.py
--- a/jax/numpy/lax_numpy.py
+++ b/jax/numpy/lax_numpy.py
@@ -33,7 +33,7 @@
import re
import string
import types
-from typing import Tuple
+from typing import Tuple, Union
import warnings
import numpy as onp
@@ -42,9 +42,10 @@
from jax import jit, device_put, custom_jvp
from .. import core
from .. import dtypes
-from ..abstract_arrays import UnshapedArray, ShapedArray, ConcreteArray
+from ..abstract_arrays import UnshapedArray, ShapedArray, ConcreteArray, canonicalize_shape
from ..config import flags
from ..interpreters.xla import DeviceArray
+from ..interpreters.masking import Poly
from .. import lax
from .. import ops
from ..util import partial, get_module_functions, unzip2, prod as _prod, subvals
@@ -1295,7 +1296,7 @@ def broadcast_arrays(*args):
def broadcast_to(arr, shape):
"""Like Numpy's broadcast_to but doesn't necessarily return views."""
arr = arr if isinstance(arr, ndarray) else array(arr)
- shape = tuple(map(int, shape)) # check that shape is concrete
+ shape = canonicalize_shape(shape) # check that shape is concrete
arr_shape = _shape(arr)
if arr_shape == shape:
return arr
@@ -2140,7 +2141,7 @@ def arange(start, stop=None, step=None, dtype=None):
lax._check_user_dtype_supported(dtype, "arange")
if stop is None and step is None:
dtype = dtype or _dtype(start)
- return lax.iota(dtype, start) # avoids materializing
+ return lax.iota(dtype, start) # avoids materializing
else:
return array(onp.arange(start, stop=stop, step=step, dtype=dtype))
@@ -3033,6 +3034,9 @@ def take(a, indices, axis=None, out=None, mode=None):
def _normalize_index(index, axis_size):
"""Normalizes an index value in the range [-N, N) to the range [0, N)."""
+ if type(axis_size) is Poly:
+ return index + axis_size if index < 0 else index
+
return lax.select(
lax.lt(index, _constant_like(index, 0)),
lax.add(index, _constant_like(index, axis_size)),
@@ -3316,7 +3320,8 @@ def _index_to_gather(x_shape, idx):
collapsed_slice_dims = []
start_index_map = []
- index_dtype = int64 if _max(x_shape, default=0) >= (1 << 31) else int32
+ use_64bit_index = _any([type(d) is Poly or d >= (1 << 31) for d in x_shape])
+ index_dtype = int64 if use_64bit_index else int32
gather_indices = onp.zeros((0,), dtype=index_dtype) # use onp to save a compilation
# We perform three transformations to y before the scatter op, in order:
@@ -3375,6 +3380,10 @@ def _index_to_gather(x_shape, idx):
# XLA gives error when indexing into an axis of size 0
raise IndexError(f"index is out of bounds for axis {x_axis} with size 0")
i = _normalize_index(i, x_shape[x_axis])
+ if type(i) is Poly:
+ # dummy index if i is polynomial, doesn't matter for shape inference
+ # TODO(mattjj,j-towns,juliuskunze): revise this logic
+ i = 0
i = lax.convert_element_type(i, index_dtype)
i = broadcast_to(i, tuple(gather_indices.shape[:-1]) + (1,))
gather_indices = concatenate((gather_indices, i), -1)
@@ -3397,7 +3406,8 @@ def _index_to_gather(x_shape, idx):
x_axis += 1
# Handle slice index (only static, otherwise an error is raised)
elif isinstance(i, slice):
- if not _all(elt is None or type(core.get_aval(elt)) is ConcreteArray
+ if not _all(elt is None or type(elt) is Poly
+ or type(core.get_aval(elt)) is ConcreteArray
for elt in (i.start, i.stop, i.step)):
msg = ("Array slice indices must have static start/stop/step to be used "
"with Numpy indexing syntax. Try lax.dynamic_slice/"
@@ -3547,13 +3557,42 @@ def _canonicalize_tuple_index(arr_ndim, idx):
idx = tuple(idx) + colons
return idx
+def _polymorphic_slice_indices(idx: slice, size: Union[int, Poly]):
+ # like idx.indices(size), but allows for polymorphic indices and size
+ # see https://github.com/python/cpython/blob/6d6508765514c7c10719478a0430f5e47c9a96ac/Objects/sliceobject.c#L372
+ assert isinstance(idx, slice)
+
+ step = 1 if idx.step is None else idx.step
+ step_is_negative = step < 0
+ lower = -1 if step_is_negative else 0
+ upper = size + lower
+
+ def sanitize(index, default):
+ if index is None:
+ return default
+ elif type(index) is Poly:
+ return index
+ elif index < 0:
+ return _max(index + size, lower)
+ else:
+ return _min(index, upper)
+
+ start = sanitize(idx.start, default=upper if step_is_negative else lower)
+ stop = sanitize(idx.stop, default=lower if step_is_negative else upper)
+ return start, stop, step
-def _static_idx(idx, size):
+def _static_idx(idx: slice, size: Union[int, Poly]):
"""Helper function to compute the static slice start/limit/stride values."""
- assert isinstance(idx, slice)
- start, stop, step = idx.indices(size)
- if (step < 0 and stop >= start) or (step > 0 and start >= stop):
- return 0, 0, 1, False # sliced to size zero
+ if _any(type(s) is Poly for s in (idx.start, idx.stop, idx.step, size)):
+ start, stop, step = _polymorphic_slice_indices(idx, size)
+ elif isinstance(size, int):
+ start, stop, step = idx.indices(size)
+ else:
+ raise TypeError(size)
+
+ if type(start) is not Poly and type(stop) is not Poly:
+ if (step < 0 and stop >= start) or (step > 0 and start >= stop):
+ return 0, 0, 1, False # sliced to size zero
if step > 0:
return start, stop, step, False
diff --git a/jax/random.py b/jax/random.py
--- a/jax/random.py
+++ b/jax/random.py
@@ -289,11 +289,8 @@ def _random_bits(key, bit_width, shape):
def _check_shape(name, shape, *param_shapes):
- try:
- shape = tuple(map(int, shape))
- except TypeError as err:
- msg = "{} requires a concrete tuple of integers as shape argument, got {}."
- raise ValueError(msg.format(name, shape)) from err
+ shape = abstract_arrays.canonicalize_shape(shape)
+
if param_shapes:
shape_ = lax.broadcast_shapes(shape, *param_shapes)
if shape != shape_:
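For context on the `Poly`/`Mon` classes this patch reworks: a polymorphic shape dimension is a polynomial with integer coefficients over named size variables, stored as nested dicts. A minimal standalone sketch of that representation (illustrative only; the real classes in `jax/interpreters/masking.py` carry many more operators):

```python
import math

class Mon(dict):
    """Monomial: maps a variable name to its degree, e.g. {'n': 2} is n^2."""
    def __hash__(self):
        return hash(tuple(sorted(self.items())))
    def __mul__(self, other):
        out = dict(self)
        for var, deg in other.items():
            out[var] = out.get(var, 0) + deg  # n * n^2 == n^3
        return Mon(out)

class Poly(dict):
    """Polynomial: maps Mon -> integer coefficient, e.g. {n^2: 1, n: 2}."""
    def __add__(self, other):
        out = dict(self)
        for mon, coeff in other.items():
            out[mon] = out.get(mon, 0) + coeff
        return Poly({m: c for m, c in out.items() if c != 0} or {Mon(): 0})

    def __mul__(self, other):
        out = {}
        for m1, c1 in self.items():
            for m2, c2 in other.items():
                mon = m1 * m2
                out[mon] = out.get(mon, 0) + c1 * c2
        return Poly(out)

    def evaluate(self, env):
        # Substitute concrete sizes, mirroring Poly.evaluate in the patch.
        return sum(c * math.prod(env[v] ** d for v, d in m.items())
                   for m, c in self.items())

n = Poly({Mon({'n': 1}): 1})
two = Poly({Mon(): 2})
q = n * n + two * n           # n^2 + 2n
print(q.evaluate({'n': 5}))   # -> 35
```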
| Recent shape checking changes result in new XLA errors
It turns out that PR #2017 breaks the following code:
```
array = np.ones(5)
print(array[:10])
```
This should print `[1. 1. 1. 1. 1.]`, but instead it throws an error:
```
Traceback (most recent call last):
File "/Users/necula/Source/jax/jax/interpreters/xla.py", line 230, in primitive_computation
return c.Build()
File "/Users/necula/Source/jax/jax/lib/xla_bridge.py", line 281, in Build
*args, **kwargs)
File "/Users/necula/Source/jax/build/jaxlib/xla_client.py", line 734, in Build
return Computation(self._builder.Build(), backend=backend)
RuntimeError: Invalid argument: Slice size at index 0 in gather op is out of range, must be within [0, 6), got 10.:
```
In some sense this error is reasonable, but it is not what JAX did before.
Previously, JAX sanitized out-of-range indices and produced the following JAXPR for the `gather`:
```
{ lambda b ; a.
let c = gather[ dimension_numbers=GatherDimensionNumbers(offset_dims=(0,), collapsed_slice_dims=(), start_index_map=(0,))
operand_shape=(5,)
slice_sizes=(5,) ] a b
in c }
```
After this change, the JAXPR is:
```
{ lambda b ; a.
let c = gather[ dimension_numbers=GatherDimensionNumbers(offset_dims=(0,), collapsed_slice_dims=(), start_index_map=(0,))
operand_shape=(5,)
slice_sizes=(10,) ] a b
in c }
```
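The sanitizing that the old behavior relied on is essentially what Python's built-in `slice.indices` does: it clamps an out-of-range static slice to the axis size, which is why the old jaxpr carried `slice_sizes=(5,)` rather than `(10,)`. The patch's `_static_idx` restores this clamping (via `_polymorphic_slice_indices` when sizes are polymorphic). A tiny illustration:

```python
def static_slice_bounds(idx, size):
    # Clamp a static slice to the axis size, as NumPy indexing does.
    start, stop, step = idx.indices(size)
    return start, stop, step

# slicing a length-5 axis with [:10] must clamp to 5, not keep 10
print(static_slice_bounds(slice(None, 10), 5))   # -> (0, 5, 1)
print(static_slice_bounds(slice(-3, None), 5))   # -> (2, 5, 1)
```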
| Thanks for catching this and managing it! I can follow up. | 2020-04-22T18:45:18Z | [] | [] |
Traceback (most recent call last):
File "/Users/necula/Source/jax/jax/interpreters/xla.py", line 230, in primitive_computation
return c.Build()
File "/Users/necula/Source/jax/jax/lib/xla_bridge.py", line 281, in Build
*args, **kwargs)
File "/Users/necula/Source/jax/build/jaxlib/xla_client.py", line 734, in Build
return Computation(self._builder.Build(), backend=backend)
RuntimeError: Invalid argument: Slice size at index 0 in gather op is out of range, must be within [0, 6), got 10.:
| 5,484 |
|||
google/jax | google__jax-285 | 57fc3a4cca22a855d983277e78e47b7bac1d0b2a | diff --git a/jax/experimental/stax.py b/jax/experimental/stax.py
--- a/jax/experimental/stax.py
+++ b/jax/experimental/stax.py
@@ -178,9 +178,9 @@ def apply_fun(params, inputs, rng=None):
def _normalize_by_window_size(dims, strides, padding):
def rescale(outputs, inputs):
- one = np.ones(inputs.shape[1:3], dtype=inputs.dtype)
+ one = np.ones(inputs.shape[1:-1], dtype=inputs.dtype)
window_sizes = lax.reduce_window(one, 0., lax.add, dims, strides, padding)
- return outputs / window_sizes
+ return outputs / window_sizes[..., np.newaxis]
return rescale
AvgPool = _pooling_layer(lax.add, 0., _normalize_by_window_size)
| AvgPool only works as global average pooling
```python3
from jax.experimental import stax
import jax.numpy as np
init_fn, apply_fun = stax.AvgPool((2, 2), strides=(2, 2))
output_shape, params = init_fn((-1, 32, 32, 3))
print(output_shape)
print(params)
apply_fun(params, np.zeros((100, 32, 32, 3)))
```
This should work but instead it fails with this error:
```
(-1, 16, 16, 3)
()
Traceback (most recent call last):
File "minimal_example.py", line 9, in <module>
apply_fun(params, np.zeros((100, 32, 32, 3)))
File "/gpfs01/bethge/home/jrauber/PhD/063_jax/jax/jax/experimental/stax.py", line 172, in apply_fun
return rescale(out, inputs) if rescale else out
File "/gpfs01/bethge/home/jrauber/PhD/063_jax/jax/jax/experimental/stax.py", line 183, in rescale
return outputs / window_sizes
File "/gpfs01/bethge/home/jrauber/PhD/063_jax/jax/jax/numpy/lax_numpy.py", line 350, in true_divide
x1, x2 = _promote_shapes(x1, x2)
File "/gpfs01/bethge/home/jrauber/PhD/063_jax/jax/jax/numpy/lax_numpy.py", line 134, in _promote_shapes
nd = len(_broadcast_shapes(*shapes))
File "/gpfs01/bethge/home/jrauber/PhD/063_jax/jax/jax/util.py", line 161, in memoized_fun
ans = cache[key] = fun(*args, **kwargs)
File "/gpfs01/bethge/home/jrauber/PhD/063_jax/jax/jax/numpy/lax_numpy.py", line 151, in _broadcast_shapes
.format(tuple(map(tuple, shapes))))
ValueError: Incompatible shapes for broadcasting: ((100, 16, 16, 3), (1, 1, 16, 16))
```
Note that it also doesn't work if stride is 1. The only case in which `AvgPool` works seems to be if stride is 1 and the pooling size is identical to the input size (i.e. global average pooling). `MaxPool` seems to work fine.
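The shape error above can be reproduced and fixed in plain NumPy; this is a minimal sketch of the trailing-axis broadcast used in the patch, with shapes taken from the error message:

```python
import numpy as np

# Minimal NumPy sketch of the bug and the fix: pooled outputs are NHWC,
# but the window-size map is HW only, so plain division cannot broadcast
# (16, 16) against the trailing (16, 3) axes.
outputs = np.ones((100, 16, 16, 3))
window_sizes = np.full((16, 16), 4.0)  # e.g. element count of a 2x2 window

try:
    outputs / window_sizes
except ValueError as e:
    print("broadcast fails:", e)

rescaled = outputs / window_sizes[..., np.newaxis]  # (16, 16, 1) broadcasts
print(rescaled.shape)  # (100, 16, 16, 3)
```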
| Thanks for catching this! | 2019-01-28T14:24:37Z | [] | [] |
Traceback (most recent call last):
File "minimal_example.py", line 9, in <module>
apply_fun(params, np.zeros((100, 32, 32, 3)))
File "/gpfs01/bethge/home/jrauber/PhD/063_jax/jax/jax/experimental/stax.py", line 172, in apply_fun
return rescale(out, inputs) if rescale else out
File "/gpfs01/bethge/home/jrauber/PhD/063_jax/jax/jax/experimental/stax.py", line 183, in rescale
return outputs / window_sizes
File "/gpfs01/bethge/home/jrauber/PhD/063_jax/jax/jax/numpy/lax_numpy.py", line 350, in true_divide
x1, x2 = _promote_shapes(x1, x2)
File "/gpfs01/bethge/home/jrauber/PhD/063_jax/jax/jax/numpy/lax_numpy.py", line 134, in _promote_shapes
nd = len(_broadcast_shapes(*shapes))
File "/gpfs01/bethge/home/jrauber/PhD/063_jax/jax/jax/util.py", line 161, in memoized_fun
ans = cache[key] = fun(*args, **kwargs)
File "/gpfs01/bethge/home/jrauber/PhD/063_jax/jax/jax/numpy/lax_numpy.py", line 151, in _broadcast_shapes
.format(tuple(map(tuple, shapes))))
ValueError: Incompatible shapes for broadcasting: ((100, 16, 16, 3), (1, 1, 16, 16))
| 5,493 |
|||
google/jax | google__jax-3096 | 7c687b245b34397c13563a714ad9bf0290b419e3 | diff --git a/jax/lax/lax.py b/jax/lax/lax.py
--- a/jax/lax/lax.py
+++ b/jax/lax/lax.py
@@ -4289,6 +4289,11 @@ def _select_and_gather_add_shape_rule(
64: onp.uint64,
}
+_INT_DTYPES = {
+ 16: onp.int16,
+ 32: onp.int32,
+ 64: onp.int64,
+}
def _select_and_gather_add_translation(
c, tangents, operand, *, select_prim, window_dimensions, window_strides,
@@ -4563,8 +4568,69 @@ def _sort_abstract_eval(*args, **kwargs):
raise TypeError(f"Arguments to sort must have equal shapes, got: {shapes}")
return args
+
+def _float_to_int_for_sort(x):
+ # Switch from a floating point value to a integer value in such a way that
+ # when using the integer value to compare, we get the same result for normal
+ # values, and -nan is treated as the smallest value, and nan is treated as
+ # the largest value.
+ # If f is a float, and
+ # x = bit_cast<int32>(f);
+ # y = x < 0 ? int32_max - x : x;
+ # then y is ordered as an int32 such that finite values have the obvious
+ # order, -0 is ordered before 0, and -NaN and NaN appear at the beginning
+ # and end of the ordering.
+ # Note that in order to avoid -x to overflow, we calculate
+ # int32_max - x as unsigned, and then convert back to signed.
+ if x.dtype == dtypes.bfloat16:
+ x = convert_element_type(x, onp.float32)
+ nbits = onp.finfo(x).bits
+ signed_dtype = _INT_DTYPES[nbits]
+ unsigned_dtype = _UINT_DTYPES[nbits]
+
+ signed = bitcast_convert_type(x, signed_dtype)
+ unsigned = bitcast_convert_type(x, unsigned_dtype)
+ flipped = bitcast_convert_type(
+ sub(unsigned_dtype(onp.iinfo(signed_dtype).max), unsigned), signed_dtype)
+ return select(lt(signed, _zero(signed)), flipped, signed)
+
+# Default comparator that sorts the operands only on their first arguments.
+# For floating point types, a total order is created where
+# -NaN < -infinity < ... < -0 < 0 < ... < infinity < NaN.
+# For complex types, the (real, imag) pairs are sorted lexicographically
+# (following NumPy's semantics).
+# This code adds complex-number support to the algorithm from:
+# https://github.com/tensorflow/tensorflow/blob/ba43780830f09da72081fe5061c436f1c6203a92/tensorflow/compiler/xla/client/lib/comparators.h#L33
+def _sort_lt_comparator(*operands):
+ assert len(operands) >= 2 and len(operands) % 2 == 0, operands
+ x, y = operands[:2]
+ assert x.dtype == y.dtype, (x.dtype, y.dtype)
+ if onp.issubdtype(x.dtype, onp.complexfloating):
+ x_keys = [_float_to_int_for_sort(real(x)), _float_to_int_for_sort(imag(x))]
+ y_keys = [_float_to_int_for_sort(real(y)), _float_to_int_for_sort(imag(y))]
+ elif onp.issubdtype(x.dtype, onp.floating):
+ x_keys = [_float_to_int_for_sort(x)]
+ y_keys = [_float_to_int_for_sort(y)]
+ else:
+ x_keys = [x]
+ y_keys = [y]
+
+ p = None
+ for xk, yk in zip(x_keys[::-1], y_keys[::-1]):
+ p = (bitwise_or(lt(xk, yk), bitwise_and(eq(xk, yk), p)) if p is not None
+ else lt(xk, yk))
+ return p
+
def _sort_translation_rule(c, *operands, dimension):
- out = xops.Sort(c, operands, dimension=dimension, is_stable=True)
+ types = [c.get_shape(x).xla_element_type() for x in operands]
+ subc = xla_bridge.make_computation_builder("sort_lt_comparator")
+ params = [xb.parameter(subc, 2 * i + j, xc.Shape.array_shape(typ, ()))
+ for i, typ in enumerate(types) for j in range(2)]
+ result = xla.lower_fun(_sort_lt_comparator,
+ multiple_results=False)(subc, *params)
+ comparator = subc.build(result)
+ out = xops.Sort(c, operands, dimension=dimension, is_stable=True,
+ comparator=comparator)
return out if len(operands) != 1 else xops.Tuple(c, [out])
def _sort_jvp(primals, tangents, *, dimension):
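For intuition, the lexicographic fold in `_sort_lt_comparator` above can be sketched with plain Python scalars (a hypothetical standalone version; the real code builds the same expression out of `lax` ops):

```python
# Hypothetical standalone sketch of the lexicographic fold in
# _sort_lt_comparator: combine per-key comparisons from the least- to the
# most-significant key, so earlier keys dominate and ties fall through.
def lex_lt(x_keys, y_keys):
    p = None
    for xk, yk in zip(x_keys[::-1], y_keys[::-1]):
        p = (xk < yk) | ((xk == yk) & p) if p is not None else (xk < yk)
    return p

# Complex numbers compared NumPy-style as (real, imag) pairs:
print(lex_lt([1.0, 2.0], [1.0, 3.0]))  # True  (reals tie, imag 2 < 3)
print(lex_lt([2.0, 0.0], [1.0, 9.0]))  # False (real 2 > 1 dominates)
```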
| RuntimeError: Unimplemented: complex comparison 'LT'
I'm getting a weird internal JAX error with this script:
```python
import jax.numpy as jp
def lqr_continuous_time_infinite_horizon(A, B, Q, R, N):
# Take the last dimension, in case we try to do some kind of broadcasting
# thing in the future.
x_dim = A.shape[-1]
# See https://en.wikipedia.org/wiki/Linear%E2%80%93quadratic_regulator#Infinite-horizon,_continuous-time_LQR.
A1 = A - B @ jp.linalg.solve(R, N.T)
Q1 = Q - N @ jp.linalg.solve(R, N.T)
# See https://en.wikipedia.org/wiki/Algebraic_Riccati_equation#Solution.
H = jp.block([[A1, -B @ jp.linalg.solve(R, B.T)], [-Q1, -A1]])
eigvals, eigvectors = jp.linalg.eig(H)
argsort = jp.argsort(eigvals)
ix = argsort[:x_dim]
U = eigvectors[:, ix]
P = U[x_dim:, :] @ jp.linalg.inv(U[:x_dim, :])
K = jp.linalg.solve(R, (B.T @ P + N.T))
return K, P, eigvals[ix]
def _test_lqr(n):
import control
from jax.tree_util import tree_multimap
A = jp.eye(n)
B = jp.eye(n)
Q = jp.eye(n)
R = jp.eye(n)
N = jp.zeros((n, n))
actual = lqr_continuous_time_infinite_horizon(A, B, Q, R, N)
expected = control.lqr(A, B, Q, R, N)
assert tree_multimap(jp.allclose, actual, expected)
if __name__ == "__main__":
_test_lqr(2)
```
I'm getting:
```
❯ pipenv run python -m research.lqr
/Users/skainswo/dev/jax/jax/lib/xla_bridge.py:116: UserWarning: No GPU/TPU found, falling back to CPU.
warnings.warn('No GPU/TPU found, falling back to CPU.')
Traceback (most recent call last):
File "/usr/local/Cellar/python/3.7.6_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/usr/local/Cellar/python/3.7.6_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/Users/skainswo/dev/research/research/lqr.py", line 50, in <module>
_test_lqr1(2)
File "/Users/skainswo/dev/research/research/lqr.py", line 45, in _test_lqr1
actual = lqr_continuous_time_infinite_horizon(A, B, Q, R, N)
File "/Users/skainswo/dev/research/research/lqr.py", line 26, in lqr_continuous_time_infinite_horizon
argsort = jp.argsort(eigvals)
File "/Users/skainswo/dev/jax/jax/numpy/lax_numpy.py", line 2886, in argsort
_, perm = lax.sort_key_val(a, iota, dimension=axis)
File "/Users/skainswo/dev/jax/jax/lax/lax.py", line 1190, in sort_key_val
result = sort_key_val_p.bind(keys, values, dimension=dimension)
File "/Users/skainswo/dev/jax/jax/core.py", line 211, in bind
return self.impl(*args, **kwargs)
File "/Users/skainswo/dev/jax/jax/interpreters/xla.py", line 217, in apply_primitive
compiled_fun = xla_primitive_callable(prim, *map(arg_spec, args), **params)
File "/Users/skainswo/dev/jax/jax/interpreters/xla.py", line 254, in xla_primitive_callable
compiled = backend.compile(built_c, compile_options=options)
RuntimeError: Unimplemented: complex comparison 'LT'
```
I'm guessing that this has something to do with the fact that I'm getting complex eigenvalues out, but the error message is pretty confusing...
| Yes, this is certainly a bad error.
I'm curious if you actually want the `np.argsort` behavior on complex numbers (lexicographic order) or just a better error... either way, I guess we'll implement the `np.argsort` behavior.
@hawkinsp Yeah, that's a great question. I think I'd personally prefer a better error, but only because it's not immediately clear to me what the right ordering should be. | 2020-05-14T18:51:47Z | [] | [] |
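The total order described above can be sketched in plain NumPy for the float32 case (a simplified, hypothetical version of the patch's `_float_to_int_for_sort`; the real code also handles other widths and pairs keys for complex numbers):

```python
import numpy as np

# Simplified float32-only sketch of the patch's bit trick: reinterpret the
# float bits as int32 and flip negative values, yielding a total order
# -NaN < -inf < ... < -0 < 0 < ... < inf < NaN under plain integer LT.
def float_to_int_for_sort(x):
    signed = x.view(np.int32)
    unsigned = x.view(np.uint32)
    flipped = (np.uint32(np.iinfo(np.int32).max) - unsigned).view(np.int32)
    return np.where(signed < 0, flipped, signed)

vals = np.array([1.0, -0.0, np.nan, -np.inf, 0.0, -2.5], dtype=np.float32)
s = vals[np.argsort(float_to_int_for_sort(vals))]
print(s)  # ordered as: -inf, -2.5, -0.0, 0.0, 1.0, nan
```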
Traceback (most recent call last):
File "/usr/local/Cellar/python/3.7.6_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/usr/local/Cellar/python/3.7.6_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/Users/skainswo/dev/research/research/lqr.py", line 50, in <module>
_test_lqr1(2)
File "/Users/skainswo/dev/research/research/lqr.py", line 45, in _test_lqr1
actual = lqr_continuous_time_infinite_horizon(A, B, Q, R, N)
File "/Users/skainswo/dev/research/research/lqr.py", line 26, in lqr_continuous_time_infinite_horizon
argsort = jp.argsort(eigvals)
File "/Users/skainswo/dev/jax/jax/numpy/lax_numpy.py", line 2886, in argsort
_, perm = lax.sort_key_val(a, iota, dimension=axis)
File "/Users/skainswo/dev/jax/jax/lax/lax.py", line 1190, in sort_key_val
result = sort_key_val_p.bind(keys, values, dimension=dimension)
File "/Users/skainswo/dev/jax/jax/core.py", line 211, in bind
return self.impl(*args, **kwargs)
File "/Users/skainswo/dev/jax/jax/interpreters/xla.py", line 217, in apply_primitive
compiled_fun = xla_primitive_callable(prim, *map(arg_spec, args), **params)
File "/Users/skainswo/dev/jax/jax/interpreters/xla.py", line 254, in xla_primitive_callable
compiled = backend.compile(built_c, compile_options=options)
RuntimeError: Unimplemented: complex comparison 'LT'
| 5,509 |
|||
google/jax | google__jax-3207 | 0eace80a6e12ded1ad7ecb3e7d6ccb41b84e97e4 | diff --git a/jax/lax/lax_control_flow.py b/jax/lax/lax_control_flow.py
--- a/jax/lax/lax_control_flow.py
+++ b/jax/lax/lax_control_flow.py
@@ -329,7 +329,7 @@ def _while_loop_translation_rule(c, axis_env, name_stack, avals, backend, *args,
body_pred, = xla.jaxpr_subcomp(body_c, cond_jaxpr.jaxpr, backend, axis_env,
_map(partial(xb.constant, body_c), cond_jaxpr.literals),
extend_name_stack(name_stack, 'body_pred'), *(x + z))
- new_z = _map(partial(_pred_bcast_select, body_c, body_pred), new_z, z)
+ new_z = _map(partial(_pred_bcast_select, body_c, body_pred), new_z, z, body_jaxpr.out_avals)
assert _map(body_c.get_shape, new_z) == _map(body_c.get_shape, z) # no broadcast
new_carry = xops.Tuple(body_c, list(itertools.chain(x, y, new_z)))
@@ -338,13 +338,14 @@ def _while_loop_translation_rule(c, axis_env, name_stack, avals, backend, *args,
_, _, z = split_list(ans_elts, [cond_nconsts, body_nconsts])
return xops.Tuple(c, z)
-def _pred_bcast_select(c, pred, x, y):
+def _pred_bcast_select(c, pred, x, y, x_y_aval: core.AbstractValue):
pred_shape = c.get_shape(pred).dimensions()
x_shape = c.get_shape(x).dimensions()
y_shape = c.get_shape(y).dimensions()
assert x_shape == y_shape
- if not c.get_shape(x).is_array() and not c.get_shape(y).is_array():
- # Two tokens
+ if x_y_aval is core.abstract_unit:
+ return x
+ elif x_y_aval is core.abstract_token:
return xops.AfterAll(c, [x, y])
else:
assert pred_shape == x_shape[:len(pred_shape)] == y_shape[:len(pred_shape)]
| While-loop vmap bug
The following code block runs as expected on jax 0.1.63, jaxlib 0.1.45, but fails on all later versions, including master:
```python
import jax
import jax.numpy as np
from jax.experimental import loops
def test(a,b):
with loops.Scope() as s:
s.val = 0
s.i = 0
s.j = 0
for _ in s.while_range(lambda: s.i < a + 1):
s.j = 0
for _ in s.while_range(lambda: s.j < b + 1):
s.val += s.i + s.j
s.j += 1
s.i += 1
return s.val
# vectorized version
vmap_test = jax.vmap(test, (0,0))
arr = np.arange(5)
vmap_test(arr, arr)
```
<details><summary>Click for Traceback</summary>
<p>
```python
Traceback (most recent call last):
File "test.py", line 21, in <module>
print(vmap_test(arr, arr))
File "/home/adabbott/Git/jax/jax/jax/api.py", line 858, in batched_fun
lambda: flatten_axes(out_tree(), out_axes))
File "/home/adabbott/Git/jax/jax/jax/interpreters/batching.py", line 34, in batch
return batched_fun.call_wrapped(*in_vals)
File "/home/adabbott/Git/jax/jax/jax/linear_util.py", line 150, in call_wrapped
ans = self.f(*args, **dict(self.params, **kwargs))
File "test.py", line 12, in test
for _ in s.while_range(lambda: s.j < b + 1):
File "/home/adabbott/Git/jax/jax/jax/experimental/loops.py", line 341, in __next__
self.end_tracing_body()
File "/home/adabbott/Git/jax/jax/jax/experimental/loops.py", line 407, in end_tracing_body
carried_init_vals, body_typed_jaxpr, body_const_vals)
File "/home/adabbott/Git/jax/jax/jax/experimental/loops.py", line 576, in build_output_vals
body_jaxpr=body_typed_jaxpr)
File "/home/adabbott/Git/jax/jax/jax/core.py", line 212, in bind
out_tracer = top_trace.process_primitive(self, tracers, kwargs)
File "/home/adabbott/Git/jax/jax/jax/interpreters/partial_eval.py", line 141, in process_primitive
return custom_partial_eval_rules[primitive](self, *tracers, **params)
File "/home/adabbott/Git/jax/jax/jax/lax/lax_control_flow.py", line 517, in _while_partial_eval
body_jaxpr=body_jaxpr_known)
File "/home/adabbott/Git/jax/jax/jax/core.py", line 212, in bind
out_tracer = top_trace.process_primitive(self, tracers, kwargs)
File "/home/adabbott/Git/jax/jax/jax/interpreters/batching.py", line 134, in process_primitive
val_out, dim_out = batched_primitive(vals_in, dims_in, **params)
File "/home/adabbott/Git/jax/jax/jax/lax/lax_control_flow.py", line 391, in _while_loop_batching_rule
body_nconsts=body_nconsts, body_jaxpr=body_jaxpr_batched)
File "/home/adabbott/Git/jax/jax/jax/core.py", line 209, in bind
return self.impl(*args, **kwargs)
File "/home/adabbott/Git/jax/jax/jax/interpreters/xla.py", line 217, in apply_primitive
compiled_fun = xla_primitive_callable(prim, *map(arg_spec, args), **params)
File "/home/adabbott/Git/jax/jax/jax/interpreters/xla.py", line 248, in xla_primitive_callable
*avals, **params)
File "/home/adabbott/Git/jax/jax/jax/interpreters/xla.py", line 295, in primitive_computation
*xla_args, **params)
File "/home/adabbott/Git/jax/jax/jax/lax/lax_control_flow.py", line 332, in _while_loop_translation_rule
new_z = _map(partial(_pred_bcast_select, body_c, body_pred), new_z, z)
File "/home/adabbott/Git/jax/jax/jax/util.py", line 34, in safe_map
return list(map(f, *args))
File "/home/adabbott/Git/jax/jax/jax/lax/lax_control_flow.py", line 350, in _pred_bcast_select
assert pred_shape == x_shape[:len(pred_shape)] == y_shape[:len(pred_shape)]
AssertionError
```
</p>
</details>
It appears to only occur when the nested while-loop variable `b` is vectorized:
```python
# this works
vmap_test = jax.vmap(test, (0,None))
vmap_test(arr, 3)
# this fails
vmap_test = jax.vmap(test, (None,0))
vmap_test(3, arr)
```
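Conceptually, batching a `while_loop` runs every lane until all predicates are false and freezes finished lanes with a select, which is the job of `_pred_bcast_select` in the failing lowering. A hypothetical NumPy model of that semantics:

```python
import numpy as np

# Hypothetical NumPy model (not JAX's actual lowering) of a batched
# while_loop: keep iterating while any lane's predicate holds, and use
# where() to freeze lanes that already finished.
def batched_while(cond, body, state):
    state = np.asarray(state, dtype=float)
    pred = cond(state)
    while pred.any():
        state = np.where(pred, body(state), state)  # freeze finished lanes
        pred = cond(state)
    return state

limits = np.arange(5)  # each lane counts up to its own limit
out = batched_while(lambda s: s < limits, lambda s: s + 1.0, np.zeros(5))
print(out)  # [0. 1. 2. 3. 4.]
```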
| I should also note the same behavior occurs when using `jax.lax.while_loop` directly, without the convenience of the `loops` module:
<details><summary>Click for pure jax.lax.while_loop version </summary>
<p>
```python
import jax
import jax.numpy as np
def test(a,b):
val = 0
i = 0
j = 0
condfun_1 = lambda inp: inp[1] < a + 1
condfun_2 = lambda inp: inp[2] < b + 1
def bodyfun_1(inp):
val, i, j = inp
j = 0
def bodyfun_2(inp):
val, i, j = inp
val += i + j
j += 1
return (val, i, j)
result = jax.lax.while_loop(condfun_2, bodyfun_2, (val,i,j))
val = result[0]
i += 1
return (val, i, j)
result = jax.lax.while_loop(condfun_1, bodyfun_1, (val,i,j))
return result[0]
arr = np.arange(5)
vmap_test = jax.vmap(test, (0,0))
vmap_test(arr, arr)
```
</p>
</details> | 2020-05-26T06:43:29Z | [] | [] |
Traceback (most recent call last):
File "test.py", line 21, in <module>
print(vmap_test(arr, arr))
File "/home/adabbott/Git/jax/jax/jax/api.py", line 858, in batched_fun
lambda: flatten_axes(out_tree(), out_axes))
File "/home/adabbott/Git/jax/jax/jax/interpreters/batching.py", line 34, in batch
return batched_fun.call_wrapped(*in_vals)
File "/home/adabbott/Git/jax/jax/jax/linear_util.py", line 150, in call_wrapped
ans = self.f(*args, **dict(self.params, **kwargs))
File "test.py", line 12, in test
for _ in s.while_range(lambda: s.j < b + 1):
File "/home/adabbott/Git/jax/jax/jax/experimental/loops.py", line 341, in __next__
self.end_tracing_body()
File "/home/adabbott/Git/jax/jax/jax/experimental/loops.py", line 407, in end_tracing_body
carried_init_vals, body_typed_jaxpr, body_const_vals)
File "/home/adabbott/Git/jax/jax/jax/experimental/loops.py", line 576, in build_output_vals
body_jaxpr=body_typed_jaxpr)
File "/home/adabbott/Git/jax/jax/jax/core.py", line 212, in bind
out_tracer = top_trace.process_primitive(self, tracers, kwargs)
File "/home/adabbott/Git/jax/jax/jax/interpreters/partial_eval.py", line 141, in process_primitive
return custom_partial_eval_rules[primitive](self, *tracers, **params)
File "/home/adabbott/Git/jax/jax/jax/lax/lax_control_flow.py", line 517, in _while_partial_eval
body_jaxpr=body_jaxpr_known)
File "/home/adabbott/Git/jax/jax/jax/core.py", line 212, in bind
out_tracer = top_trace.process_primitive(self, tracers, kwargs)
File "/home/adabbott/Git/jax/jax/jax/interpreters/batching.py", line 134, in process_primitive
val_out, dim_out = batched_primitive(vals_in, dims_in, **params)
File "/home/adabbott/Git/jax/jax/jax/lax/lax_control_flow.py", line 391, in _while_loop_batching_rule
body_nconsts=body_nconsts, body_jaxpr=body_jaxpr_batched)
File "/home/adabbott/Git/jax/jax/jax/core.py", line 209, in bind
return self.impl(*args, **kwargs)
File "/home/adabbott/Git/jax/jax/jax/interpreters/xla.py", line 217, in apply_primitive
compiled_fun = xla_primitive_callable(prim, *map(arg_spec, args), **params)
File "/home/adabbott/Git/jax/jax/jax/interpreters/xla.py", line 248, in xla_primitive_callable
*avals, **params)
File "/home/adabbott/Git/jax/jax/jax/interpreters/xla.py", line 295, in primitive_computation
*xla_args, **params)
File "/home/adabbott/Git/jax/jax/jax/lax/lax_control_flow.py", line 332, in _while_loop_translation_rule
new_z = _map(partial(_pred_bcast_select, body_c, body_pred), new_z, z)
File "/home/adabbott/Git/jax/jax/jax/util.py", line 34, in safe_map
return list(map(f, *args))
File "/home/adabbott/Git/jax/jax/jax/lax/lax_control_flow.py", line 350, in _pred_bcast_select
assert pred_shape == x_shape[:len(pred_shape)] == y_shape[:len(pred_shape)]
AssertionError
| 5,527 |
|||
google/jax | google__jax-671 | b5d95f8b84a435aba511665b6f2f02e561b71c7a | diff --git a/jax/numpy/lax_numpy.py b/jax/numpy/lax_numpy.py
--- a/jax/numpy/lax_numpy.py
+++ b/jax/numpy/lax_numpy.py
@@ -2153,9 +2153,7 @@ def _is_slice_none(idx):
def _is_advanced_int_indexer(idx):
"""Returns True if idx should trigger int array indexing, False otherwise."""
# https://docs.scipy.org/doc/numpy/reference/arrays.indexing.html#advanced-indexing
- if isinstance(idx, (tuple, list)):
- # We assume this check comes *after* the check for non-advanced tuple index,
- # and hence we already know at least one element is a sequence if it's a tuple
+ if isinstance(idx, (tuple, list)) and _any(onp.ndim(elt) != 0 for elt in idx):
return _all(e is None or e is Ellipsis or isinstance(e, slice)
or _is_int_arraylike(e) for e in idx)
else:
diff --git a/jax/ops/__init__.py b/jax/ops/__init__.py
--- a/jax/ops/__init__.py
+++ b/jax/ops/__init__.py
@@ -14,4 +14,4 @@
from __future__ import absolute_import
-from .scatter import index, index_add, index_update
\ No newline at end of file
+from .scatter import index, index_add, index_update, segment_sum
diff --git a/jax/ops/scatter.py b/jax/ops/scatter.py
--- a/jax/ops/scatter.py
+++ b/jax/ops/scatter.py
@@ -26,6 +26,19 @@
from ..numpy import lax_numpy as np
+# TODO(mattjj): clean up this logic
+def _is_advanced_int_indexer(idx):
+ _int = lambda aval: not aval.shape and onp.issubdtype(aval.dtype, onp.integer)
+ try:
+ abstract_idx = core.get_aval(idx)
+ except TypeError:
+ abstract_idx = None
+ out = not (isinstance(abstract_idx, ConcreteArray) and _int(abstract_idx) or
+ isinstance(abstract_idx, ShapedArray) and _int(abstract_idx) or
+ isinstance(idx, slice) or
+ isinstance(idx, tuple) and all(onp.ndim(elt) == 0 for elt in idx))
+ return out and np._is_advanced_int_indexer(idx)
+
def _scatter_update(x, idx, y, scatter_op):
"""Helper for indexed updates.
@@ -33,11 +46,19 @@ def _scatter_update(x, idx, y, scatter_op):
x[idx] op= y
except in a pure functional way, with no in-place updating.
- Support NumPy-style basic indexing only, i.e., `idx` must be
- `None`, an integer, a `slice` object, or ellipses, or a tuple of the above.
+ Args:
+ x: ndarray to be updated.
+ idx: None, an integer, a slice, an ellipsis, an ndarray with integer dtype,
+ or a tuple of those indicating the locations of `x` into which to scatter-
+ update the values in `y`.
+ y: values to be scattered.
+ scatter_op: callable, either lax.scatter or lax.scatter_add.
- TODO(phawkins): support advanced indexing.
+ Returns:
+ An ndarray representing an updated `x` after performing the scatter-update.
"""
+ # For more clues on the logic of this implementation, see the code for
+ # jax.numpy._rewriting_take (which has links to NumPy docs).
x = np.asarray(x)
y = np.asarray(y)
@@ -45,14 +66,45 @@ def _scatter_update(x, idx, y, scatter_op):
y_shape = np.shape(y)
y = lax.convert_element_type(y, lax.dtype(x))
+ # Check if there's advanced indexing going on, and handle differently based on
+ # whether it is or isn't mixed with basic indexing.
+ if _is_advanced_int_indexer(idx):
+ if np._is_advanced_int_indexer_without_slices(idx):
+ if isinstance(idx, (tuple, list)):
+ if any(onp.shape(e) for e in idx):
+ # At least one sequence element in the index list means broadcasting.
+ idx = np.broadcast_arrays(*idx)
+ else:
+ # The index list is a flat list of integers.
+ idx = [lax.concatenate([lax.reshape(e, (1,)) for e in idx], 0)]
+ else:
+ # The indexer is just a single integer array.
+ idx = [idx]
+
+ stacked_idx = np.concatenate(
+ [np.mod(np.reshape(a, (-1, 1)), np._constant_like(a, x.shape[i]))
+ for i, a in enumerate(idx)], axis=1)
+
+ y = np.broadcast_to(y, idx[0].shape + onp.shape(x)[len(idx):])
+ y = lax.reshape(y, (stacked_idx.shape[0],) + onp.shape(x)[len(idx):])
+
+ dnums = lax.ScatterDimensionNumbers(
+ update_window_dims=tuple(range(1, y.ndim)),
+ inserted_window_dims=tuple(range(len(idx))),
+ scatter_dims_to_operand_dims=tuple(range(len(idx))))
+ return scatter_op(x, stacked_idx, y, dnums)
+ elif np._is_advanced_int_indexer(idx):
+ # TODO(mattjj, phawkins): one of us is going to implement this case someday
+ msg = "Unimplemented case for indexed update. Open a feature request!"
+ raise NotImplementedError(msg)
+ else:
+ assert False # unreachable
+
+ # At this point there's no advanced indexing going on, so we process each
+ # element of the index one at a time to build up a scatter.
if not isinstance(idx, tuple):
idx = (idx,)
- # Test for unsupported advanced indexing and report an error.
- if any(onp.ndim(elt) != 0 for elt in idx):
- raise NotImplementedError("Unimplemented case for indexed update. Advanced "
- "indexing is not yet implemented.")
-
# Remove ellipses and add trailing slice(None)s.
idx = np._canonicalize_tuple_index(x, idx)
@@ -189,10 +241,11 @@ def index_add(x, idx, y):
(e.g., due to concurrency on some hardware platforms).
Args:
- x: an array.
- idx: a Numpy-style basic index, consisting of `None`, integers, `slice`
- objects, ellipses, or a tuple of the above. A convenient syntactic sugar
- for forming indices is via the :data:`jax.ops.index` object.
+ x: an array with the values to be updated.
+ idx: a Numpy-style index, consisting of `None`, integers, `slice` objects,
+ ellipses, ndarrays with integer dtypes, or a tuple of the above. A
+ convenient syntactic sugar for forming indices is via the
+ :data:`jax.ops.index` object.
y: the array of updates. `y` must be broadcastable to the shape of the
array that would be returned by `x[idx]`.
@@ -225,10 +278,11 @@ def index_update(x, idx, y):
updates on some hardware platforms).
Args:
- x: an array.
- idx: a Numpy-style basic index, consisting of `None`, integers, `slice`
- objects, ellipses, or a tuple of the above. A convenient syntactic sugar
- for forming indices is via the :data:`jax.ops.index` object.
+ x: an array with the values to be updated.
+ idx: a Numpy-style index, consisting of `None`, integers, `slice` objects,
+ ellipses, ndarrays with integer dtypes, or a tuple of the above. A
+ convenient syntactic sugar for forming indices is via the
+ :data:`jax.ops.index` object.
y: the array of updates. `y` must be broadcastable to the shape of the
array that would be returned by `x[idx]`.
@@ -244,3 +298,32 @@ def index_update(x, idx, y):
[1., 1., 1., 6., 6., 6.]], dtype=float32)
"""
return _scatter_update(x, idx, y, lax.scatter)
+
+def segment_sum(data, segment_ids, num_segments=None):
+ """Computes the sum within segments of an array.
+
+ Similar to TensorFlow's segment_sum:
+ https://www.tensorflow.org/api_docs/python/tf/math/segment_sum
+
+ Args:
+ data: an array with the values to be summed.
+ segment_ids: an array with integer dtype that indicates the segments of
+ `data` (along its leading axis) to be summed. Values can be repeated and
+ need not be sorted. Values outside of the range [0, num_segments) are
+ wrapped into that range by applying np.mod.
+ num_segments: optional, an int with positive value indicating the number of
+ segments. The default is ``max(segment_ids % data.shape[0]) + 1`` but
+ since `num_segments` determines the size of the output, a static value
+ must be provided to use `segment_sum` in a `jit`-compiled function.
+
+ Returns:
+ An array with shape ``(num_segments,) + data.shape[1:]`` representing the
+ segment sums.
+ """
+ if num_segments is None:
+ num_segments = np.max(np.mod(segment_ids, data.shape[0])) + 1
+ num_segments = int(num_segments)
+
+ out = np.zeros((num_segments,) + data.shape[1:], dtype=data.dtype)
+ segment_ids = np.mod(segment_ids, num_segments)
+ return index_add(out, segment_ids, data)
| segment_sum primitives / advanced indexing in jax.ops.index_add
Would it be useful to others (aside from me) to have better support of sorted and unsorted segment_sums?
https://www.tensorflow.org/api_docs/python/tf/math/unsorted_segment_sum
https://www.tensorflow.org/api_docs/python/tf/math/segment_sum
In numpy one way to do unsorted segment sums is to sort then call np.add.reduceat, but this doesn't seem to be in jax or autograd:
```
>>> import jax.numpy as np
np.add
>>> np.add
<function _one_to_one_binop.<locals>.<lambda> at 0x1430cdbf8>
>>> np.add.reduceat
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'function' object has no attribute 'reduceat'
```
| Does `jax.ops.index_add` do what you need? It seems roughly equivalent to `unsorted_segment_sum`.
https://jax.readthedocs.io/en/latest/_autosummary/jax.ops.index_add.html#jax.ops.index_add
Ah, I guess it does not *yet* because it does not support advanced indexing. We should fix that.
Nice! Would it be hard to support the basic operators +,-,*,/ on top of add? Not sure if there's associativity assumptions here in the underlying implementation or not
Edit: Nevermind: pretty sure - and / will be very hard to do correctly now that I think about it a little more for more than a single operation (since JAX says that it applies all the updates)
After playing around a little bit with the numpy version of advanced index assignment, I take it the idea is to implement the unsorted/sorted segment sums using something similar to:
``` python
ys = np.arange(5)
idxs = [0,0,1,0,1]
sums = np.zeros(2) # two slots
seg_sum = jax.ops.index_add(sums, jax.ops.index[idxs], ys)
```
edit: of course the jax version will work because it actually accumulates properly as opposed to just applying the last element
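A pure-NumPy sketch of the unsorted segment sum discussed here (using `np.add.at`, which accumulates repeated indices the way `index_add` does, unlike plain fancy-index assignment):

```python
import numpy as np

# Pure-NumPy sketch of an unsorted segment sum: np.add.at accumulates
# repeated indices (index_add semantics); plain out[ids] = data would
# keep only the last write per slot.
def segment_sum(data, segment_ids, num_segments):
    out = np.zeros((num_segments,) + data.shape[1:], dtype=data.dtype)
    np.add.at(out, np.mod(segment_ids, num_segments), data)
    return out

result = segment_sum(np.arange(5), np.array([0, 0, 1, 0, 1]), 2)
print(result)  # [4 6]
```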
Yup, that's the idea! | 2019-05-04T23:27:01Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'function' object has no attribute 'reduceat'
| 5,589 |
|||
google/jax | google__jax-703 | 78c804772e47f98721acd31a7e0953504aa504a8 | diff --git a/build/build.py b/build/build.py
--- a/build/build.py
+++ b/build/build.py
@@ -60,19 +60,19 @@ def get_python_bin_path(python_bin_path_flag):
# Bazel
-BAZEL_BASE_URI = "https://github.com/bazelbuild/bazel/releases/download/0.25.1/"
+BAZEL_BASE_URI = "https://github.com/bazelbuild/bazel/releases/download/0.24.1/"
BazelPackage = collections.namedtuple("BazelPackage", ["file", "sha256"])
bazel_packages = {
"Linux":
BazelPackage(
- file="bazel-0.25.1-linux-x86_64",
+ file="bazel-0.24.1-linux-x86_64",
sha256=
- "9fe5a74fa319e771b0328b42f79bf00496592fd9c0989247b4dd322ce9a082e9"),
+ "e18e2877e18a447eb5d94f5efbec375366d82af6443c6a83a93c62657a7b1c32"),
"Darwin":
BazelPackage(
- file="bazel-0.25.1-darwin-x86_64",
+ file="bazel-0.24.1-darwin-x86_64",
sha256=
- "436e34cf8cf47f43620a70927e3fcdb1f23659e3e0ae22e42ff8b6d8b7626cfa"),
+ "cf763752550050d117e03659aaa6ccd6f97da1f983a6029300a497fdaeaaec46"),
}
@@ -140,8 +140,8 @@ def get_bazel_path(bazel_path_flag):
sys.exit(-1)
-def check_bazel_version(bazel_path, min_version):
- """Checks Bazel's version is at least `min_version`."""
+def check_bazel_version(bazel_path, min_version, max_version):
+ """Checks Bazel's version is in the range [`min_version`, `max_version`)."""
version_output = shell([bazel_path, "--bazelrc=/dev/null", "version"])
match = re.search("Build label: *([0-9\\.]+)[^0-9\\.]", version_output)
if match is None:
@@ -155,6 +155,12 @@ def check_bazel_version(bazel_path, min_version):
print("Outdated bazel revision (>= {} required, found {})".format(
min_version, version))
sys.exit(0)
+ if max_version is not None:
+ max_ints = [int(x) for x in max_version.split(".")]
+ if actual_ints >= max_ints:
+ print("Please downgrade your bazel revision to build JAX (>= {} and < {}"
+ " required, found {})".format(min_version, max_version, version))
+ sys.exit(0)
BAZELRC_TEMPLATE = """
@@ -283,7 +289,7 @@ def main():
# Find a working Bazel.
bazel_path = get_bazel_path(args.bazel_path)
- check_bazel_version(bazel_path, "0.25.0")
+ check_bazel_version(bazel_path, min_version="0.24.0", max_version="0.25.0")
print("Bazel binary path: {}".format(bazel_path))
python_bin_path = get_python_bin_path(args.python_bin_path)
| Build on linux, got "@local_config_cuda//crosstool:cc-compiler-windows" error
Building jax on master 78c804772e47f98721acd31a7e0953504aa504a8 gives:
```
➜ jax git:(master) python build/build.py --enable_march_native --enable_mkl_dnn --enable_cuda
_ _ __ __
| | / \ \ \/ /
_ | |/ _ \ \ /
| |_| / ___ \/ \
\___/_/ \/_/\_\
Bazel binary path: /home/titan/bin/bazel
Python binary path: /home/titan/frameworks/miniconda3/envs/ml/bin/python
MKL-DNN enabled: yes
-march=native: yes
CUDA enabled: yes
Building XLA and installing it in the jaxlib source tree...
INFO: An error occurred during the fetch of repository 'local_config_python'
INFO: Call stack for the definition of repository 'local_config_python':
- /home/titan/.cache/bazel/_bazel_titan/0fa25a5a75e40e18875f78640c2a6d59/external/org_tensorflow/tensorflow/workspace.bzl:69:5
- /home/titan/resources/oss/machine-learning/Google/jax/WORKSPACE:41:1
ERROR: /home/titan/.cache/bazel/_bazel_titan/0fa25a5a75e40e18875f78640c2a6d59/external/local_config_cuda/crosstool/BUILD:64:1: in cc_toolchain rule @local_config_cuda//crosstool:cc-compiler-windows: attributes 'cpu' and 'compiler' have been deprecated, please remove them. See https://github.com/bazelbuild/bazel/issues/7075 for details.
ERROR: Analysis of target '//build:install_xla_in_source_tree' failed; build aborted: Analysis of target '@local_config_cuda//crosstool:cc-compiler-windows' failed; build aborted
INFO: Elapsed time: 0.095s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded, 1 target configured)
FAILED: Build did NOT complete successfully (0 packages loaded, 1 target configured)
Traceback (most recent call last):
File "build/build.py", line 324, in <module>
main()
File "build/build.py", line 320, in main
[":install_xla_in_source_tree", os.getcwd()])
File "build/build.py", line 50, in shell
output = subprocess.check_output(cmd)
File "/home/titan/frameworks/miniconda3/envs/ml/lib/python3.7/subprocess.py", line 395, in check_output
**kwargs).stdout
File "/home/titan/frameworks/miniconda3/envs/ml/lib/python3.7/subprocess.py", line 487, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['/home/titan/bin/bazel', 'run', '--verbose_failures=true', '--config=opt', '--config=mkl_open_source_only', '--config=cuda', ':install_xla_in_source_tree', '/home/titan/resources/oss/machine-learning/Google/jax/build']' returned non-zero exit status 1.
```
I have successfully built jax before without encountering this error.
| If I'm parsing the log correctly, it's picking up your own version of bazel (/home/titan/bin/bazel), rather than one it automatically downloads. What version of bazel is that?
This is my bazel version:
```
Build label: 0.25.0
Build target: bazel-out/k8-opt/bin/src/main/java/com/google/devtools/build/lib/bazel/BazelServer_deploy.jar
Build time: Wed May 1 21:45:01 2019 (1556747101)
Build timestamp: 1556747101
Build timestamp as int: 1556747101
```
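For reference, the range check that the patch adds to `check_bazel_version` can be sketched in plain Python. The regex below mirrors the one in `build.py`; the helper names and sample strings are illustrative, and the expected results assume the [0.24.0, 0.25.0) window from the fix:

```python
import re

def parse_bazel_version(version_output):
    # Same pattern build.py uses: the dotted version after "Build label:".
    match = re.search(r"Build label: *([0-9\.]+)[^0-9\.]", version_output)
    if match is None:
        return None
    return [int(x) for x in match.group(1).split(".")]

def version_in_range(version_output, min_version, max_version=None):
    """True iff the reported version is in [min_version, max_version)."""
    actual = parse_bazel_version(version_output)
    if actual is None:
        return False
    if actual < [int(x) for x in min_version.split(".")]:
        return False
    if max_version is not None and actual >= [int(x) for x in max_version.split(".")]:
        return False
    return True

print(version_in_range("Build label: 0.25.0\n", "0.24.0", "0.25.0"))  # False: 0.25.0 is now too new
print(version_in_range("Build label: 0.24.1\n", "0.24.0", "0.25.0"))  # True
```

Comparing lists of ints gives the usual lexicographic version ordering, which is why the script splits on dots instead of comparing version strings directly.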
I also tried removing my local bazel so that jax uses a downloaded bazel; it is still an issue:
```
jax git:(master) python build/build.py --enable_march_native --enable_mkl_dnn --enable_cuda
_ _ __ __
| | / \ \ \/ /
_ | |/ _ \ \ /
| |_| / ___ \/ \
\___/_/ \/_/\_\
Downloading bazel from: https://github.com/bazelbuild/bazel/releases/download/0.25.1/bazel-0.25.1-linux-x86_64
bazel-0.25.1-linux-x86_64 [########################################] 100%
Extracting Bazel installation...
Starting local Bazel server and connecting to it...
Bazel binary path: ./bazel-0.25.1-linux-x86_64
Python binary path: /home/titan/frameworks/miniconda3/envs/ml/bin/python
MKL-DNN enabled: yes
-march=native: yes
CUDA enabled: yes
Building XLA and installing it in the jaxlib source tree...
ERROR: /home/titan/.cache/bazel/_bazel_titan/0fa25a5a75e40e18875f78640c2a6d59/external/local_config_cuda/crosstool/BUILD:34:1: in cc_toolchain rule @local_config_cuda//crosstool:cc-compiler-local: attributes 'cpu' and 'compiler' have been deprecated, please remove them. See https://github.com/bazelbuild/bazel/issues/7075 for details.
INFO: An error occurred during the fetch of repository 'local_config_python'
INFO: Call stack for the definition of repository 'local_config_python':
- /home/titan/.cache/bazel/_bazel_titan/0fa25a5a75e40e18875f78640c2a6d59/external/org_tensorflow/tensorflow/workspace.bzl:69:5
- /home/titan/resources/oss/machine-learning/Google/jax/WORKSPACE:41:1
INFO: An error occurred during the fetch of repository 'cython'
INFO: Call stack for the definition of repository 'cython':
- /home/titan/.cache/bazel/_bazel_titan/0fa25a5a75e40e18875f78640c2a6d59/external/org_tensorflow/tensorflow/workspace.bzl:727:5
- /home/titan/resources/oss/machine-learning/Google/jax/WORKSPACE:41:1
INFO: Repository 'cython' used the following cache hits instead of downloading the corresponding file.
* Hash 'bccc9aa050ea02595b2440188813b936eaf345e85fb9692790cecfe095cf91aa' for http://mirror.tensorflow.org/github.com/cython/cython/archive/0.28.4.tar.gz
If the definition of 'cython' was updated, verify that the hashes were also updated.
ERROR: Analysis of target '//build:install_xla_in_source_tree' failed; build aborted: Analysis of target '@local_config_cuda//crosstool:cc-compiler-local' failed; build aborted
INFO: Elapsed time: 1.291s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (27 packages loaded, 76 targets configured)
FAILED: Build did NOT complete successfully (27 packages loaded, 76 targets configured)
currently loading: @org_tensorflow//tensorflow/core ... (5 packages)
Traceback (most recent call last):
File "build/build.py", line 324, in <module>
main()
File "build/build.py", line 320, in main
[":install_xla_in_source_tree", os.getcwd()])
File "build/build.py", line 50, in shell
output = subprocess.check_output(cmd)
File "/home/titan/frameworks/miniconda3/envs/ml/lib/python3.6/subprocess.py", line 336, in check_output
**kwargs).stdout
File "/home/titan/frameworks/miniconda3/envs/ml/lib/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['./bazel-0.25.1-linux-x86_64', 'run', '--verbose_failures=true', '--config=opt', '--config=mkl_open_source_only', '--config=cuda', ':install_xla_in_source_tree', '/home/titan/resources/oss/machine-learning/Google/jax/build']' returned non-zero exit status 1.
```
I also tried cleaning `.cache/bazel`; I still get the same error as above.
Thanks for raising this, and for looking into it so thoroughly! Sure looks like jaxlib's build process is broken.
I'm not very familiar with bazel, but [this recent PR by @hawkinsp](https://github.com/google/jax/pull/698/files) might have some clues. Presumably the build worked then...
7f9e1809bbd035b1998268cde68c1b9567c26a21 seems to be the issue. Hard reset to 60918077c285a787c697b694d8d7bb04da56a9e5 works for me. | 2019-05-12T13:26:39Z | [] | [] |
Traceback (most recent call last):
File "build/build.py", line 324, in <module>
main()
File "build/build.py", line 320, in main
[":install_xla_in_source_tree", os.getcwd()])
File "build/build.py", line 50, in shell
output = subprocess.check_output(cmd)
File "/home/titan/frameworks/miniconda3/envs/ml/lib/python3.7/subprocess.py", line 395, in check_output
**kwargs).stdout
File "/home/titan/frameworks/miniconda3/envs/ml/lib/python3.7/subprocess.py", line 487, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['/home/titan/bin/bazel', 'run', '--verbose_failures=true', '--config=opt', '--config=mkl_open_source_only', '--config=cuda', ':install_xla_in_source_tree', '/home/titan/resources/oss/machine-learning/Google/jax/build']' returned non-zero exit status 1.
| 5,596 |
|||
google/jax | google__jax-778 | 90eda42e6efe6edab5bb8db330a88f173ef7c60f | diff --git a/jax/lax/lax.py b/jax/lax/lax.py
--- a/jax/lax/lax.py
+++ b/jax/lax/lax.py
@@ -2804,6 +2804,9 @@ def _scatter_add_transpose_rule(t, operand, scatter_indices, updates,
update_jaxpr, update_consts, dimension_numbers,
updates_shape):
assert scatter_indices is not None
+ if t is ad_util.zero:
+ return [ad_util.zero, None, ad_util.zero]
+
operand_t = update_t = None
if operand is None:
operand_t = t
| Scatter transpose rule doesn’t handle symbolic zeros
Unfortunately, I don't have a terribly concise repro for this error. From playing around with it, it seems as though the error probably involves index_update, but it's hard to be sure. However, given the function
```python
def nvt_nose_hoover(energy_fn, dt, T, chain_length=5, tau=0.01):
force = grad(lambda R, *args, **kwargs: -energy_fn(R, *args, **kwargs))
dt_2 = dt / 2.0
dt_4 = dt_2 / 2.0
dt_8 = dt_4 / 2.0
dt, dt_2, dt_4, dt_8, tau = static_cast(dt, dt_2, dt_4, dt_8, tau)
def init_fun(key, R):
V = random.normal(key, R.shape, dtype=R.dtype)
KE = 0.5 * np.mean(V ** 2)
# Nose-Hoover parameters.
xi = np.zeros(chain_length, R.dtype)
v_xi = np.zeros(chain_length, R.dtype)
DOF = R.shape[0] * R.shape[1]
Q = tau ** f32(2) * np.ones(chain_length, dtype=R.dtype)
Q = ops.index_update(Q, 0, Q[0] * DOF)
return R, V, KE, xi, v_xi, Q
def step_chain(KE, V, xi, v_xi, Q, DOF, T):
"""Applies a single update to the chain parameters and rescales velocity."""
M = chain_length - 1
G = (Q[M - 1] * v_xi[M - 1] ** f32(2) - T) / Q[M]
v_xi = ops.index_add(v_xi, M, dt_4 * G)
G = (f32(2.0) * KE - DOF * T) / Q[0]
scale = np.exp(-dt_8 * v_xi[1])
v_xi = ops.index_update(v_xi, 0, scale * (scale * v_xi[0] + dt_4 * G))
return KE, V, xi, v_xi
def apply_fun(state, t=0.0, **kwargs):
R, V, KE, xi, v_xi, Q = state
DOF = R.shape[0] * R.shape[1]
KE, V, xi, v_xi = step_chain(KE, V, xi, v_xi, Q, DOF, T)
R = R + dt_2 * V
F = force(R, t=t, **kwargs)
V = V + dt * F
KE = 0.5 * np.mean(V ** 2)
R = R + dt_2 * V
KE, V, xi, v_xi = step_chain(KE, V, xi, v_xi, Q, DOF, T)
return R, V, KE, xi, v_xi, Q
return init_fun, apply_fun
```
The following code proceeds without error,
```python
def do_sim(scale):
key = random.PRNGKey(0)
R_key, R0_key, V_key = random.split(key, 3)
R = random.normal(
R_key, (PARTICLE_COUNT, spatial_dimension), dtype=dtype)
R0 = random.normal(
R0_key, (PARTICLE_COUNT, spatial_dimension), dtype=dtype)
E = functools.partial(
lambda R, R0, **kwargs: scale * np.sum((R - R0) ** 2), R0=R0)
init_fn, apply_fn = nvt_nose_hoover(E, 1e-3, 0.1)
state = init_fn(V_key, R)
for _ in range(2):
state = apply_fn(state)
return E(state[0])
assert grad(do_sim)(1.0) > 0.0
```
However, when the apply_fn is jitted, as in
```python
def do_sim(scale):
key = random.PRNGKey(0)
R_key, R0_key, V_key = random.split(key, 3)
R = random.normal(
R_key, (PARTICLE_COUNT, spatial_dimension), dtype=dtype)
R0 = random.normal(
R0_key, (PARTICLE_COUNT, spatial_dimension), dtype=dtype)
_, shift = space.free()
E = functools.partial(
lambda R, R0, **kwargs: scale * np.sum((R - R0) ** 2), R0=R0)
init_fn, apply_fn = simulate.nvt_nose_hoover(E, 1e-3, 0.1)
apply_fn = jit(apply_fn)
state = init_fn(V_key, R)
for _ in range(2):
state = apply_fn(state)
return E(state[0])
assert grad(do_sim)(1.0) > 0.0
```
The code throws an error with the following stack trace,
```
Traceback (most recent call last):
File "/usr/local/google/home/schsam/.local/lib/python2.7/site-packages/absl/third_party/unittest3_backport/case.py", line 37, in testPartExecutor
yield
File "/usr/local/google/home/schsam/.local/lib/python2.7/site-packages/absl/third_party/unittest3_backport/case.py", line 162, in run
testMethod()
File "/usr/local/google/home/schsam/.local/lib/python2.7/site-packages/absl/testing/parameterized.py", line 262, in bound_param_test
test_method(self, **testcase_params)
File "simulate_test.py", line 164, in test_grad_through_nvt
assert grad(do_sim)(1.0) > 0.0
File "/usr/local/google/home/schsam/Source/jax/jax/api.py", line 235, in grad_f
_, g = value_and_grad_f(*args, **kwargs)
File "/usr/local/google/home/schsam/Source/jax/jax/api.py", line 289, in value_and_grad_f
g = vjp_py(onp.ones((), dtype=dtype))
File "/usr/local/google/home/schsam/Source/jax/jax/api_util.py", line 62, in apply_jaxtree_fun
ans = fun(*args)
File "/usr/local/google/home/schsam/Source/jax/jax/api.py", line 822, in out_vjp_packed
return out_vjp(cotangent_in)
File "/usr/local/google/home/schsam/Source/jax/jax/interpreters/ad.py", line 112, in vjp_
_, arg_cts = backward_pass(jaxpr, consts, (), dummy_args, dummy_primal_and_ct)
File "/usr/local/google/home/schsam/Source/jax/jax/interpreters/ad.py", line 180, in backward_pass
eqn.params, subjaxprs, sub_consts, sub_freevar_vals, invals, ct_in)
File "/usr/local/google/home/schsam/Source/jax/jax/interpreters/ad.py", line 536, in call_transpose
ans = primitive.bind(fun, all_args, **params)
File "/usr/local/google/home/schsam/Source/jax/jax/core.py", line 636, in call_bind
ans = primitive.impl(f, *args, **params)
File "/usr/local/google/home/schsam/Source/jax/jax/interpreters/xla.py", line 591, in xla_call_impl
compiled_fun = xla_callable(fun, device_values, *map(abstractify, args))
File "/usr/local/google/home/schsam/Source/jax/jax/linear_util.py", line 208, in memoized_fun
ans = call(f, *args)
File "/usr/local/google/home/schsam/Source/jax/jax/interpreters/xla.py", line 604, in xla_callable
jaxpr, (pval, consts, env) = pe.trace_to_subjaxpr(fun, master, False).call_wrapped(pvals)
File "/usr/local/google/home/schsam/Source/jax/jax/linear_util.py", line 147, in call_wrapped
ans = self.f(*args, **dict(self.params, **kwargs))
File "/usr/local/google/home/schsam/Source/jax/jax/interpreters/ad.py", line 186, in backward_pass
cts_out = get_primitive_transpose(eqn.primitive)(ct_in, *invals, **eqn.params)
File "/usr/local/google/home/schsam/Source/jax/jax/lax/lax.py", line 2818, in _scatter_add_transpose_rule
for i in xrange(len(t.shape)):
AttributeError: 'Zero' object has no attribute 'shape'
```
| Looks like just a bug in the scatter transpose rule; it doesn’t handle symbolic zeros like transpose rules need to.
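The guard in the patch follows the standard pattern for transpose rules: check for the symbolic-zero sentinel before touching attributes like `.shape`. A minimal pure-Python sketch of that pattern (the `zero` sentinel and the rule below are illustrative stand-ins, not JAX's actual `ad_util.zero` or `_scatter_add_transpose_rule`):

```python
import numpy as np

class _SymbolicZero:
    """Stand-in for a symbolic-zero cotangent: it has no .shape or .dtype,
    so any rule that inspects those must special-case it first."""

zero = _SymbolicZero()

def scatter_add_transpose(t, num_update_dims):
    # The fix: short-circuit before any attribute access on `t`.
    if t is zero:
        return [zero, None, zero]
    # Placeholder for the real work; it reads t.shape and would raise
    # AttributeError on the sentinel (as in the reported traceback).
    update_dims = [i for i in range(len(t.shape))][:num_update_dims]
    return [t, None, update_dims]

print(scatter_add_transpose(zero, 1))             # short-circuits: [zero, None, zero]
print(scatter_add_transpose(np.ones((2, 3)), 1))  # real branch runs
```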
Purely out of my own curiosity, is there a reason why this only affects the version where `apply_fn` was jitted?
Traceback (most recent call last):
File "/usr/local/google/home/schsam/.local/lib/python2.7/site-packages/absl/third_party/unittest3_backport/case.py", line 37, in testPartExecutor
yield
File "/usr/local/google/home/schsam/.local/lib/python2.7/site-packages/absl/third_party/unittest3_backport/case.py", line 162, in run
testMethod()
File "/usr/local/google/home/schsam/.local/lib/python2.7/site-packages/absl/testing/parameterized.py", line 262, in bound_param_test
test_method(self, **testcase_params)
File "simulate_test.py", line 164, in test_grad_through_nvt
assert grad(do_sim)(1.0) > 0.0
File "/usr/local/google/home/schsam/Source/jax/jax/api.py", line 235, in grad_f
_, g = value_and_grad_f(*args, **kwargs)
File "/usr/local/google/home/schsam/Source/jax/jax/api.py", line 289, in value_and_grad_f
g = vjp_py(onp.ones((), dtype=dtype))
File "/usr/local/google/home/schsam/Source/jax/jax/api_util.py", line 62, in apply_jaxtree_fun
ans = fun(*args)
File "/usr/local/google/home/schsam/Source/jax/jax/api.py", line 822, in out_vjp_packed
return out_vjp(cotangent_in)
File "/usr/local/google/home/schsam/Source/jax/jax/interpreters/ad.py", line 112, in vjp_
_, arg_cts = backward_pass(jaxpr, consts, (), dummy_args, dummy_primal_and_ct)
File "/usr/local/google/home/schsam/Source/jax/jax/interpreters/ad.py", line 180, in backward_pass
eqn.params, subjaxprs, sub_consts, sub_freevar_vals, invals, ct_in)
File "/usr/local/google/home/schsam/Source/jax/jax/interpreters/ad.py", line 536, in call_transpose
ans = primitive.bind(fun, all_args, **params)
File "/usr/local/google/home/schsam/Source/jax/jax/core.py", line 636, in call_bind
ans = primitive.impl(f, *args, **params)
File "/usr/local/google/home/schsam/Source/jax/jax/interpreters/xla.py", line 591, in xla_call_impl
compiled_fun = xla_callable(fun, device_values, *map(abstractify, args))
File "/usr/local/google/home/schsam/Source/jax/jax/linear_util.py", line 208, in memoized_fun
ans = call(f, *args)
File "/usr/local/google/home/schsam/Source/jax/jax/interpreters/xla.py", line 604, in xla_callable
jaxpr, (pval, consts, env) = pe.trace_to_subjaxpr(fun, master, False).call_wrapped(pvals)
File "/usr/local/google/home/schsam/Source/jax/jax/linear_util.py", line 147, in call_wrapped
ans = self.f(*args, **dict(self.params, **kwargs))
File "/usr/local/google/home/schsam/Source/jax/jax/interpreters/ad.py", line 186, in backward_pass
cts_out = get_primitive_transpose(eqn.primitive)(ct_in, *invals, **eqn.params)
File "/usr/local/google/home/schsam/Source/jax/jax/lax/lax.py", line 2818, in _scatter_add_transpose_rule
for i in xrange(len(t.shape)):
AttributeError: 'Zero' object has no attribute 'shape'
| 5,616 |
|||
google/jax | google__jax-884 | 4e9d79268fd9ccbbcaf20d9f0aeb92b2a3a04601 | diff --git a/jax/api.py b/jax/api.py
--- a/jax/api.py
+++ b/jax/api.py
@@ -937,7 +937,7 @@ def _wrap_hashably(arg):
try:
hash(arg)
except TypeError:
- return WrapHashably(arg)
+ return WrapHashably(arg) # e.g. ndarrays, DeviceArrays
else:
return Hashable(arg)
diff --git a/jax/interpreters/xla.py b/jax/interpreters/xla.py
--- a/jax/interpreters/xla.py
+++ b/jax/interpreters/xla.py
@@ -593,11 +593,7 @@ def __format__(self, format_spec):
def __eq__(self, other): return self._value == other
def __hash__(self):
- # TODO(mattjj): this is not semantically correct because it is possible
- # __eq__ is true for values with unequal __hash__ values. However, the
- # main use case at the moment is memoization for which false negatives are
- # fine.
- return id(self)
+ raise TypeError("JAX DeviceArray, like numpy.ndarray, is not hashable.")
scalar_types.add(DeviceArray)
diff --git a/jax/lax/lax.py b/jax/lax/lax.py
--- a/jax/lax/lax.py
+++ b/jax/lax/lax.py
@@ -1041,7 +1041,7 @@ def _conv_transpose_padding(k, s, padding):
else:
pad_a = int(onp.ceil(pad_len / 2))
elif padding == 'VALID':
- pad_len = k + s - 2 + max(k - s, 0)
+ pad_len = k + s - 2 + _max(k - s, 0)
pad_a = k - 1
else:
raise ValueError('Padding mode must be `SAME` or `VALID`.')
diff --git a/jax/util.py b/jax/util.py
--- a/jax/util.py
+++ b/jax/util.py
@@ -213,7 +213,6 @@ def __eq__(self, other):
return self.val == other.val
-
def get_module_functions(module):
"""Finds functions in module.
Args:
| DeviceArray as static argument: incompatible with jit cache?
First of all, thank you for this nice package. I had quite some fun playing around with it in the past couple of days.
I am getting errors as soon as I call a `jit`ed function for the second time if that function treats an argument of type `DeviceArray` as static. The error occurs while querying if the function is already cached. A minimal example is shown below:
```python
import jax.numpy as np
from jax import jit, partial
@partial(jit, static_argnums=(1,))
def f(x,v):
return x
def main():
x = np.ones((10,10))
v = np.array([1,2,3])
firstCall = f(x,v)
print(firstCall)
secondCall = f(x,v)
print(secondCall)
if __name__=='__main__':
main()
```
The raised error:
```python
Traceback (most recent call last):
File "<ipython-input-7-563ff9ef5fe4>", line 1, in <module>
runfile('/home/hpc/capm/sn0523/SIR/minExample.py', wdir='/home/hpc/capm/sn0523/SIR')
File "/home/hpc/capm/sn0523/jax/lib/python3.6/site-packages/spyder_kernels/customize/spydercustomize.py", line 827, in runfile
execfile(filename, namespace)
File "/home/hpc/capm/sn0523/jax/lib/python3.6/site-packages/spyder_kernels/customize/spydercustomize.py", line 110, in execfile
exec(compile(f.read(), filename, 'exec'), namespace)
File "/home/hpc/capm/sn0523/SIR/minExample.py", line 25, in <module>
main()
File "/home/hpc/capm/sn0523/SIR/minExample.py", line 21, in main
secondCall = f(x,v)
File "/home/hpc/capm/sn0523/jax/lib/python3.6/site-packages/jax/api.py", line 123, in f_jitted
out = xla.xla_call(flat_fun, *args_flat, device_values=device_values)
File "/home/hpc/capm/sn0523/jax/lib/python3.6/site-packages/jax/core.py", line 658, in call_bind
ans = primitive.impl(f, *args, **params)
File "/home/hpc/capm/sn0523/jax/lib/python3.6/site-packages/jax/interpreters/xla.py", line 653, in xla_call_impl
compiled_fun = xla_callable(fun, device_values, *map(abstractify, args))
File "/home/hpc/capm/sn0523/jax/lib/python3.6/site-packages/jax/linear_util.py", line 201, in memoized_fun
if key in cache:
File "/home/hpc/capm/sn0523/jax/lib/python3.6/site-packages/jax/linear_util.py", line 174, in __eq__
return self.hashable_payload() == other.hashable_payload()
File "/home/hpc/capm/sn0523/jax/lib/python3.6/site-packages/jax/interpreters/xla.py", line 489, in forward_method
return fun(getattr(self, attrname), *args)
ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()
```
Surprisingly - at least for me - the same code runs flawlessly if the static argument is an ordinary `numpy.ndarray`. Is this a bug or the intended behavior?
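A short sketch of why an identity-based `__hash__` combined with an elementwise `__eq__` breaks dict-based caches like jit's. `BadKey` below is an illustrative stand-in (with a constant hash to force the collision deterministically), not JAX's actual wrapper:

```python
import numpy as np

# numpy's own answer: arrays compare elementwise, so they are unhashable.
try:
    hash(np.array([1, 2, 3]))
except TypeError as e:
    print(e)  # unhashable type: 'numpy.ndarray'

class BadKey:
    """Hashes independently of value but compares elementwise,
    like the old DeviceArray."""
    def __init__(self, val):
        self.val = np.asarray(val)
    def __hash__(self):
        return 0  # constant, so equal-valued keys always collide
    def __eq__(self, other):
        return self.val == other.val  # returns an ndarray, not a bool

cache = {BadKey(np.ones(3)): "compiled"}
try:
    BadKey(np.ones(3)) in cache  # hash collision -> dict falls back to __eq__
except ValueError as e:
    print(e)  # The truth value of an array with more than one element is ambiguous...
```

With `id()`-based hashes the collision is nondeterministic: two equal keys usually hash differently and the lookup silently misses, which is why the fix makes `DeviceArray` unhashable outright, matching numpy.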
| Definitely a bug! Thanks for the simple repro. | 2019-06-19T17:18:01Z | [] | [] |
Traceback (most recent call last):
File "<ipython-input-7-563ff9ef5fe4>", line 1, in <module>
runfile('/home/hpc/capm/sn0523/SIR/minExample.py', wdir='/home/hpc/capm/sn0523/SIR')
File "/home/hpc/capm/sn0523/jax/lib/python3.6/site-packages/spyder_kernels/customize/spydercustomize.py", line 827, in runfile
execfile(filename, namespace)
File "/home/hpc/capm/sn0523/jax/lib/python3.6/site-packages/spyder_kernels/customize/spydercustomize.py", line 110, in execfile
exec(compile(f.read(), filename, 'exec'), namespace)
File "/home/hpc/capm/sn0523/SIR/minExample.py", line 25, in <module>
main()
File "/home/hpc/capm/sn0523/SIR/minExample.py", line 21, in main
secondCall = f(x,v)
File "/home/hpc/capm/sn0523/jax/lib/python3.6/site-packages/jax/api.py", line 123, in f_jitted
out = xla.xla_call(flat_fun, *args_flat, device_values=device_values)
File "/home/hpc/capm/sn0523/jax/lib/python3.6/site-packages/jax/core.py", line 658, in call_bind
ans = primitive.impl(f, *args, **params)
File "/home/hpc/capm/sn0523/jax/lib/python3.6/site-packages/jax/interpreters/xla.py", line 653, in xla_call_impl
compiled_fun = xla_callable(fun, device_values, *map(abstractify, args))
File "/home/hpc/capm/sn0523/jax/lib/python3.6/site-packages/jax/linear_util.py", line 201, in memoized_fun
if key in cache:
File "/home/hpc/capm/sn0523/jax/lib/python3.6/site-packages/jax/linear_util.py", line 174, in __eq__
return self.hashable_payload() == other.hashable_payload()
File "/home/hpc/capm/sn0523/jax/lib/python3.6/site-packages/jax/interpreters/xla.py", line 489, in forward_method
return fun(getattr(self, attrname), *args)
ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()
| 5,636 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-10076 | b387134827dbc3be0e1b431201e0875798002fda | diff --git a/recommender/synth.py b/recommender/synth.py
--- a/recommender/synth.py
+++ b/recommender/synth.py
@@ -29,7 +29,8 @@
for version in versions:
library = gapic.py_library(
"recommender", version,
- include_protos=True
+ include_protos=True,
+ config_path="/google/cloud/recommender/v1beta1/artman_recommender_v1beta1.yaml"
)
s.move(library, excludes=['nox.py', 'docs/index.rst', 'README.rst', 'setup.py'])
| Synthesis failed for recommender
Hello! Autosynth couldn't regenerate recommender. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-recommender'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/recommender/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:c773192618c608a7a0415dd95282f841f8e6bcdef7dd760a988c93b77a64bd57
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/recommender/synth.py", line 32, in <module>
include_protos=True
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 121, in _generate_code
f"Unable to find configuration yaml file: {(googleapis / config_path)}."
FileNotFoundError: Unable to find configuration yaml file: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/recommender/artman_recommender_v1beta1.yaml.
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/74dee1a3-0367-43bf-9f40-1001ae7ea243).
| Autosynth is still having trouble generating recommender. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-recommender'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/recommender/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:c773192618c608a7a0415dd95282f841f8e6bcdef7dd760a988c93b77a64bd57
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/recommender/synth.py", line 32, in <module>
include_protos=True
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 121, in _generate_code
f"Unable to find configuration yaml file: {(googleapis / config_path)}."
FileNotFoundError: Unable to find configuration yaml file: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/recommender/artman_recommender_v1beta1.yaml.
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/723fbd40-45df-4c9e-a7e6-697632a21bb4).
Autosynth is still having trouble generating recommender. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-recommender'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/recommender/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:c773192618c608a7a0415dd95282f841f8e6bcdef7dd760a988c93b77a64bd57
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/recommender/synth.py", line 32, in <module>
include_protos=True
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 121, in _generate_code
f"Unable to find configuration yaml file: {(googleapis / config_path)}."
FileNotFoundError: Unable to find configuration yaml file: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/recommender/artman_recommender_v1beta1.yaml.
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/e7253e70-663e-4b46-afba-40e553894a71).
Autosynth is still having trouble generating recommender. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-recommender'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/recommender/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:c773192618c608a7a0415dd95282f841f8e6bcdef7dd760a988c93b77a64bd57
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/recommender/synth.py", line 32, in <module>
include_protos=True
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 121, in _generate_code
f"Unable to find configuration yaml file: {(googleapis / config_path)}."
FileNotFoundError: Unable to find configuration yaml file: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/recommender/artman_recommender_v1beta1.yaml.
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/96ff7db8-7596-4be7-b334-8dcfaa8b27bc).
Autosynth is still having trouble generating recommender. :sob:
Google internal developers can see the full log [here](https://sponge/30ee95aa-c33a-4cd4-8740-35b8400340cb).
Autosynth is still having trouble generating recommender. :sob:
Google internal developers can see the full log [here](https://sponge/4dd9d4d7-646c-40f1-ab1c-851c1483d843).
Autosynth is still having trouble generating recommender. :sob:
Google internal developers can see the full log [here](https://sponge/481f0b8f-e42d-4427-b838-200840f538d8).
Autosynth is still having trouble generating recommender. :sob:
Google internal developers can see the full log [here](https://sponge/4afb6855-4f08-4b1f-8ec4-e77e3c81f05b).
Autosynth is still having trouble generating recommender. :sob:
Google internal developers can see the full log [here](https://sponge/93712291-c7a7-41d8-aa83-2a0e77e6d453).
Autosynth is still having trouble generating recommender. :sob:
Google internal developers can see the full log [here](https://sponge/2d180c1c-5ab9-4da4-bb7d-2ffc1a716260).
Autosynth is still having trouble generating recommender. :sob:
Google internal developers can see the full log [here](https://sponge/33593ebc-06fa-43e3-bf47-9f68fd1fa851).
Autosynth is still having trouble generating recommender. :sob:
Google internal developers can see the full log [here](https://sponge/1ce02b08-dd55-4364-b003-2049ca724ff2).
Autosynth is still having trouble generating recommender. :sob:
Google internal developers can see the full log [here](https://sponge/62192014-f4a8-4ae7-876e-849afc5ec52c).
Autosynth is still having trouble generating recommender. :sob:
Google internal developers can see the full log [here](https://sponge/cc3ec087-8d87-4a35-b84e-36391917cf9e).
Autosynth is still having trouble generating recommender. :sob:
Google internal developers can see the full log [here](https://sponge/00068b1d-889a-4e6c-8f61-6e58ae9171ed).
Autosynth is still having trouble generating recommender. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-recommender'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/recommender/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:c773192618c608a7a0415dd95282f841f8e6bcdef7dd760a988c93b77a64bd57
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/recommender/synth.py", line 32, in <module>
include_protos=True
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 121, in _generate_code
f"Unable to find configuration yaml file: {(googleapis / config_path)}."
FileNotFoundError: Unable to find configuration yaml file: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/recommender/artman_recommender_v1beta1.yaml.
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/d4100184-0ec8-4a04-9461-f94f6564ed0c).
Autosynth is still having trouble generating recommender. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-recommender'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/recommender/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:feed210b5723c6f524b52ef6d7740a030f2d1a8f7c29a71c5e5b4481ceaad7f5
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/recommender/synth.py", line 32, in <module>
include_protos=True
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 121, in _generate_code
f"Unable to find configuration yaml file: {(googleapis / config_path)}."
FileNotFoundError: Unable to find configuration yaml file: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/recommender/artman_recommender_v1beta1.yaml.
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/35061e8f-800f-4ef3-a16d-d22ef723aff6).
Autosynth is still having trouble generating recommender. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-recommender'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/recommender/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:feed210b5723c6f524b52ef6d7740a030f2d1a8f7c29a71c5e5b4481ceaad7f5
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 95, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 83, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/recommender/synth.py", line 32, in <module>
include_protos=True
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 121, in _generate_code
f"Unable to find configuration yaml file: {(googleapis / config_path)}."
FileNotFoundError: Unable to find configuration yaml file: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/recommender/artman_recommender_v1beta1.yaml.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/7ce80c29-7aa2-4627-89a5-bffd90130b90).
Autosynth is still having trouble generating recommender. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-recommender'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/recommender/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:feed210b5723c6f524b52ef6d7740a030f2d1a8f7c29a71c5e5b4481ceaad7f5
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/recommender/synth.py", line 32, in <module>
include_protos=True
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 121, in _generate_code
f"Unable to find configuration yaml file: {(googleapis / config_path)}."
FileNotFoundError: Unable to find configuration yaml file: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/recommender/artman_recommender_v1beta1.yaml.
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/14d5f6bc-2faf-470d-a1b6-9228cfa04971).
Autosynth is still having trouble generating recommender. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-recommender'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/recommender/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:feed210b5723c6f524b52ef6d7740a030f2d1a8f7c29a71c5e5b4481ceaad7f5
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/recommender/synth.py", line 32, in <module>
include_protos=True
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 121, in _generate_code
f"Unable to find configuration yaml file: {(googleapis / config_path)}."
FileNotFoundError: Unable to find configuration yaml file: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/recommender/artman_recommender_v1beta1.yaml.
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/e583beab-6ca7-4940-8428-04f8b12c3b1e).
Autosynth is still having trouble generating recommender. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-recommender'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/recommender/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:feed210b5723c6f524b52ef6d7740a030f2d1a8f7c29a71c5e5b4481ceaad7f5
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/recommender/synth.py", line 32, in <module>
include_protos=True
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 121, in _generate_code
f"Unable to find configuration yaml file: {(googleapis / config_path)}."
FileNotFoundError: Unable to find configuration yaml file: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/recommender/artman_recommender_v1beta1.yaml.
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/c8da2d68-203c-4cb1-925c-ff43446796c5).
Autosynth is still having trouble generating recommender. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-recommender'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/recommender/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:feed210b5723c6f524b52ef6d7740a030f2d1a8f7c29a71c5e5b4481ceaad7f5
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 108, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 96, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/recommender/synth.py", line 32, in <module>
include_protos=True
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 121, in _generate_code
f"Unable to find configuration yaml file: {(googleapis / config_path)}."
FileNotFoundError: Unable to find configuration yaml file: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/recommender/artman_recommender_v1beta1.yaml.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/da39f20f-3447-4d27-87b1-1868ee143d82).
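The traceback ends in synthtool's `gapic_generator.py`, which fails a pre-flight existence check for the artman configuration YAML before invoking the generator. A minimal sketch of that check (the error f-string is taken from the traceback above; the function name and the example directory are hypothetical):

```python
from pathlib import Path

def require_config_yaml(googleapis: Path, config_path: str) -> Path:
    # Compose the expected artman YAML location under the cloned
    # googleapis checkout and fail fast if it does not exist.
    config = googleapis / config_path
    if not config.exists():
        raise FileNotFoundError(
            f"Unable to find configuration yaml file: {config}."
        )
    return config

# Reproducing the failure mode with a directory that does not exist:
try:
    require_config_yaml(
        Path("/tmp/nonexistent-googleapis-checkout"),
        "google/cloud/recommender/artman_recommender_v1beta1.yaml",
    )
except FileNotFoundError as exc:
    print(exc)
```

This matches the failure above: the recommender artman YAML is simply absent from the googleapis checkout cached at `/home/kbuilder/.cache/synthtool/googleapis`, so every run stops at the same check until the file lands upstream.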
Autosynth is still having trouble generating recommender. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-recommender'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/recommender/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:264654a37596a44b0668b8ce6ac41082d713f6ee150b3fc6425fa78cc64e4f20
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 109, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 96, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/recommender/synth.py", line 32, in <module>
include_protos=True
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 121, in _generate_code
f"Unable to find configuration yaml file: {(googleapis / config_path)}."
FileNotFoundError: Unable to find configuration yaml file: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/recommender/artman_recommender_v1beta1.yaml.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/7b7fb516-2505-42fe-8040-b474ff8c3a04).
| 2020-01-08T20:44:24Z | [] | [] |
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/recommender/synth.py", line 32, in <module>
include_protos=True
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 121, in _generate_code
f"Unable to find configuration yaml file: {(googleapis / config_path)}."
FileNotFoundError: Unable to find configuration yaml file: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/recommender/artman_recommender_v1beta1.yaml.
| 5,661 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-10077 | b387134827dbc3be0e1b431201e0875798002fda | diff --git a/cloudbuild/synth.py b/cloudbuild/synth.py
--- a/cloudbuild/synth.py
+++ b/cloudbuild/synth.py
@@ -39,9 +39,9 @@
'nox*.py',
'setup.py',
'setup.cfg',
- 'README.rst'
- '**/*.proto'
- 'google/cloud/devtools/__init__.py' # declare this as a namespace package
+ 'README.rst',
+ '**/*.proto',
+ 'google/cloud/devtools/__init__.py', # declare this as a namespace package
],
)
| Synthesis failed for cloudbuild
Hello! Autosynth couldn't regenerate cloudbuild. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:c773192618c608a7a0415dd95282f841f8e6bcdef7dd760a988c93b77a64bd57
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/9a1d852f-709e-47f8-804f-f67508f36db2).
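The `ValueError` in the log comes from Python's implicit string-literal concatenation: with the trailing commas missing (the `-` lines in the diff above), the adjacent entries `'README.rst'` and `'**/*.proto'` fuse into the single pattern `'README.rst**/*.proto'`, which embeds `**` in the middle of a path component — exactly what `pathlib`'s glob rejects on Python 3.6 with "Invalid pattern: '**' can only be an entire path component". A minimal standalone sketch of the failure mode (no synthtool involved):

```python
import pathlib

# Missing commas: Python silently concatenates adjacent string literals,
# so the "two" exclude entries become one fused glob pattern.
excludes = [
    'README.rst'
    '**/*.proto'
]
print(excludes)  # ['README.rst**/*.proto'] -- a single fused entry

# pathlib only accepts '**' as an entire path component; on Python 3.6
# (the interpreter in the log) a pattern like 'README.rst**/*.proto'
# makes Path.glob() raise ValueError. Glob semantics have shifted in
# later Python versions, so the exception is caught rather than assumed.
try:
    list(pathlib.Path('.').glob(excludes[0]))
except ValueError as exc:
    print('glob rejected the pattern:', exc)
```

Adding the commas (the `+` lines in the diff) keeps the entries separate, so each pattern reaches `root.glob(path)` in `synthtool/transforms.py` individually and `**` remains a whole path component.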
| Autosynth is still having trouble generating cloudbuild. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:c773192618c608a7a0415dd95282f841f8e6bcdef7dd760a988c93b77a64bd57
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/744054f6-bbd7-463c-8cae-3173ec508556).
Autosynth is still having trouble generating cloudbuild. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:c773192618c608a7a0415dd95282f841f8e6bcdef7dd760a988c93b77a64bd57
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/74dee1a3-0367-43bf-9f40-1001ae7ea243).
Autosynth is still having trouble generating cloudbuild. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:c773192618c608a7a0415dd95282f841f8e6bcdef7dd760a988c93b77a64bd57
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/723fbd40-45df-4c9e-a7e6-697632a21bb4).
Autosynth is still having trouble generating cloudbuild. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:c773192618c608a7a0415dd95282f841f8e6bcdef7dd760a988c93b77a64bd57
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/e7253e70-663e-4b46-afba-40e553894a71).
Autosynth is still having trouble generating cloudbuild. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:c773192618c608a7a0415dd95282f841f8e6bcdef7dd760a988c93b77a64bd57
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/96ff7db8-7596-4be7-b334-8dcfaa8b27bc).
Autosynth is still having trouble generating cloudbuild. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:c773192618c608a7a0415dd95282f841f8e6bcdef7dd760a988c93b77a64bd57
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/30ee95aa-c33a-4cd4-8740-35b8400340cb).
Autosynth is still having trouble generating cloudbuild. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:c773192618c608a7a0415dd95282f841f8e6bcdef7dd760a988c93b77a64bd57
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/4dd9d4d7-646c-40f1-ab1c-851c1483d843).
Autosynth is still having trouble generating cloudbuild. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:c773192618c608a7a0415dd95282f841f8e6bcdef7dd760a988c93b77a64bd57
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/481f0b8f-e42d-4427-b838-200840f538d8).
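For reference, the `ValueError` in the traceback above is raised by CPython's `pathlib` itself: on Python 3.6, `**` is only accepted as an *entire* path component of a glob pattern, so whatever exclude pattern `synth.py` passed into `root.glob(path)` must contain a fused component like `**.something`. A minimal sketch of the behavior and the portable fix (the directory layout and filenames below are made up for illustration; newer interpreters may accept the bad pattern and treat `**` like `*`):

```python
import tempfile
from pathlib import Path

# Minimal reproduction of the pathlib restriction behind the traceback:
# on Python 3.6, "**" is only valid as an entire path component of a
# glob pattern. (Newer interpreters may accept it and treat it like "*".)
root = Path(tempfile.mkdtemp())
(root / "google" / "cloud").mkdir(parents=True)
(root / "google" / "cloud" / "cloudbuild.proto").touch()

try:
    bad = list(root.glob("**.proto"))  # "**" fused into one component
    print("accepted on this interpreter:", bad)
except ValueError as exc:
    print(exc)  # Invalid pattern: '**' can only be an entire path component

# The portable form makes "**" its own component:
good = [p.relative_to(root).as_posix() for p in root.glob("**/*.proto")]
print(good)  # ['google/cloud/cloudbuild.proto']
```

In other words, the fix on the `synth.py` side is to rewrite any exclude of the form `**<suffix>` as `**/*<suffix>`, which both 3.6 and later interpreters handle the same way.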
Autosynth is still having trouble generating cloudbuild. :sob:
Google internal developers can see the full log [here](https://sponge/4afb6855-4f08-4b1f-8ec4-e77e3c81f05b).
Autosynth is still having trouble generating cloudbuild. :sob:
Google internal developers can see the full log [here](https://sponge/93712291-c7a7-41d8-aa83-2a0e77e6d453).
Autosynth is still having trouble generating cloudbuild. :sob:
Google internal developers can see the full log [here](https://sponge/2d180c1c-5ab9-4da4-bb7d-2ffc1a716260).
Autosynth is still having trouble generating cloudbuild. :sob:
Google internal developers can see the full log [here](https://sponge/33593ebc-06fa-43e3-bf47-9f68fd1fa851).
Autosynth is still having trouble generating cloudbuild. :sob:
Google internal developers can see the full log [here](https://sponge/1ce02b08-dd55-4364-b003-2049ca724ff2).
Autosynth is still having trouble generating cloudbuild. :sob:
Google internal developers can see the full log [here](https://sponge/62192014-f4a8-4ae7-876e-849afc5ec52c).
Autosynth is still having trouble generating cloudbuild. :sob:
Google internal developers can see the full log [here](https://sponge/cc3ec087-8d87-4a35-b84e-36391917cf9e).
Autosynth is still having trouble generating cloudbuild. :sob:
Google internal developers can see the full log [here](https://sponge/00068b1d-889a-4e6c-8f61-6e58ae9171ed).
Autosynth is still having trouble generating cloudbuild. :sob:
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/d4100184-0ec8-4a04-9461-f94f6564ed0c).
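For anyone debugging this: the error comes from CPython's `pathlib`, not from artman. synthtool expands the `excludes` passed to `s.move()` via `Path.glob()`, and `pathlib` only accepts `**` as an entire path component. The actual exclude pattern that triggers this is elided in the traceback (only `'README.rst'` is visible at `synth.py` line 42), so the snippet below is just an illustrative reproduction of the restriction using a made-up pattern, not the real one from `synth.py`:

```python
from pathlib import Path

root = Path(".")

# pathlib only accepts '**' as a whole path component; fusing it with
# anything else (e.g. 'docs/**.rst') raises the ValueError seen above.
try:
    list(root.glob("docs/**.rst"))
    error = None
except ValueError as exc:
    error = str(exc)

print(error)  # Invalid pattern: '**' can only be an entire path component

# The recursive form pathlib does accept keeps '**' on its own:
ok = list(root.glob("docs/**/*.rst"))
```

Assuming the offending exclude looks something like `docs/**.rst`, rewriting it in the `docs/**/*.rst` form should let synthesis proceed.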
Autosynth is still having trouble generating cloudbuild. :sob:
Google internal developers can see the full log [here](https://sponge/87cde9b2-210b-4592-85df-e0c8727b2cbd).
Autosynth is still having trouble generating cloudbuild. :sob:
Google internal developers can see the full log [here](https://sponge/bf876db0-7b7a-43ac-8cef-12e359d52409).
Autosynth is still having trouble generating cloudbuild. :sob:
Google internal developers can see the full log [here](https://sponge/7e45fdc8-b999-4bd4-b21a-3bfe94eed074).
Autosynth is still having trouble generating cloudbuild. :sob:
Google internal developers can see the full log [here](https://sponge/31cff84c-78e0-4d3b-92d6-c000254f6532).
Autosynth is still having trouble generating cloudbuild. :sob:
Google internal developers can see the full log [here](https://sponge/44366b90-2997-4274-9416-29b86b9c44fd).
Autosynth is still having trouble generating cloudbuild. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:feed210b5723c6f524b52ef6d7740a030f2d1a8f7c29a71c5e5b4481ceaad7f5
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/35061e8f-800f-4ef3-a16d-d22ef723aff6).
Autosynth is still having trouble generating cloudbuild. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:feed210b5723c6f524b52ef6d7740a030f2d1a8f7c29a71c5e5b4481ceaad7f5
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 95, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 83, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/7ce80c29-7aa2-4627-89a5-bffd90130b90).
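For context, the repeated failures above all trace to the same pathlib restriction: on Python 3.6 (the interpreter in the log), `Path.glob()` rejects any pattern component that mixes `**` with other characters, which is exactly the `ValueError` synthtool hits in `_expand_paths`. A minimal sketch reproducing the behavior — the directory layout and patterns below are illustrative, not the actual paths synthtool passed:

```python
import tempfile
from pathlib import Path

# Set up a throwaway tree: <tmp>/sub/a.rst
root = Path(tempfile.mkdtemp())
(root / "sub").mkdir()
(root / "sub" / "a.rst").write_text("placeholder")

# Valid: '**' stands alone as a path component, so it recurses.
print(sorted(p.name for p in root.glob("**/*.rst")))  # ['a.rst']

# Invalid on Python 3.6 (as in the traceback above): a component that
# fuses '**' with other text, e.g. 'sub**.rst', makes pathlib's
# _make_selector raise:
#   ValueError: Invalid pattern: '**' can only be an entire path component
try:
    list(root.glob("sub**.rst"))
except ValueError as exc:
    print(exc)
```

The fix on the caller's side is to keep `**` as its own segment (e.g. `sub/**/*.rst`) rather than embedding it inside a segment.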
Autosynth is still having trouble generating cloudbuild. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:feed210b5723c6f524b52ef6d7740a030f2d1a8f7c29a71c5e5b4481ceaad7f5
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 109, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 96, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/1f9fc3dc-2005-46be-bd97-4aec26827767).
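The recurring `ValueError` comes from a `pathlib` rule: in `Path.glob()`, `**` is only accepted when it stands alone as an entire path component. Below is a minimal sketch of that rule using hypothetical paths (this is not synthtool's actual excludes list); the fused form noted in the comment is the kind of pattern that raises the error above on the Python 3.6 interpreter in the log.

```python
from pathlib import Path
import tempfile

# Minimal sketch (not synthtool's code) of the pathlib rule behind the
# traceback above: glob() accepts '**' only as an entire path component.
# 'docs/**/*.rst' is valid; a fused form such as 'docs/**.rst' is what
# raises ValueError("Invalid pattern: '**' can only be an entire path
# component") on the Python 3.6 interpreter shown in the log.
with tempfile.TemporaryDirectory() as root:
    guide = Path(root, "docs", "guide")
    guide.mkdir(parents=True)
    (guide / "index.rst").write_text("hello")

    # '**' stands alone as a component here, so the pattern is accepted
    # and matches .rst files at any depth under docs/.
    matches = sorted(p.name for p in Path(root).glob("docs/**/*.rst"))
    print(matches)  # ['index.rst']
```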
Autosynth is still having trouble generating cloudbuild. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:feed210b5723c6f524b52ef6d7740a030f2d1a8f7c29a71c5e5b4481ceaad7f5
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 109, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 96, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/5cb5b24e-53f9-43f0-8267-5bb44d3c2bcf).
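The failure above comes from `pathlib`'s glob rules: on the Python 3.6 interpreter shown in the traceback, a pattern component that contains `**` fused with other text (for example `**.rst`) raises `ValueError: Invalid pattern: '**' can only be an entire path component`, while `**` as its own component (`**/*.rst`) is accepted. A minimal sketch of a tolerant wrapper is below; the `safe_glob` name and the pattern-rewrite rule are illustrative assumptions, not synthtool's actual fix:

```python
from pathlib import Path


def safe_glob(root: Path, pattern: str):
    """Glob that tolerates a '**' fused into a path component.

    Older pathlib versions raise ValueError for patterns such as
    'docs/**.rst' because '**' must be an entire path component;
    only forms like 'docs/**/*.rst' are accepted.
    """
    try:
        return list(root.glob(pattern))
    except ValueError:
        # Illustrative rewrite: turn a fused component such as
        # '**.rst' into the accepted '**' + '*.rst' pair and retry.
        fixed = []
        for part in pattern.split("/"):
            if "**" in part and part != "**":
                fixed.extend(["**", part.replace("**", "*")])
            else:
                fixed.append(part)
        return list(root.glob("/".join(fixed)))
```

In a synth.py `excludes` list this means writing exclusion patterns with `**` as its own path segment (e.g. `docs/**/*.rst` rather than `docs/**.rst`) so `Path.glob` accepts them on every interpreter.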
Autosynth is still having trouble generating cloudbuild. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
3386e6af03b0: Pulling fs layer
49ac0bbe6c8e: Pulling fs layer
d1983a67e104: Pulling fs layer
1a0f3a523f04: Pulling fs layer
c703e8b6fa19: Pulling fs layer
1a0f3a523f04: Waiting
35cfe729a390: Pulling fs layer
9eafba7dcf77: Pulling fs layer
356023b5809f: Pulling fs layer
2d2350eddfd7: Pulling fs layer
d7586a41e104: Pulling fs layer
2c00e43340b3: Pulling fs layer
4fcd3032b2c3: Pulling fs layer
c703e8b6fa19: Waiting
b58aa3d6b872: Pulling fs layer
89234dcd9a9c: Pulling fs layer
cb0185d5728e: Pulling fs layer
3d6e58b109a9: Pulling fs layer
cc59cdfeca2c: Pulling fs layer
c0f30d4bde81: Pulling fs layer
489015e5df27: Pulling fs layer
187f7df4b589: Pulling fs layer
ac53ee7d20d5: Pulling fs layer
ca229918bf28: Pulling fs layer
6397ccfab52d: Pulling fs layer
35cfe729a390: Waiting
4f4f31a03f53: Pulling fs layer
356023b5809f: Waiting
3381b4caf618: Pulling fs layer
d7586a41e104: Waiting
5b5a13d57c39: Pulling fs layer
b9ba03a85542: Pulling fs layer
2c00e43340b3: Waiting
7d611a1b3fca: Pulling fs layer
4fcd3032b2c3: Waiting
b58aa3d6b872: Waiting
cc59cdfeca2c: Waiting
89234dcd9a9c: Waiting
cb0185d5728e: Waiting
2d2350eddfd7: Waiting
3d6e58b109a9: Waiting
489015e5df27: Waiting
187f7df4b589: Waiting
ac53ee7d20d5: Waiting
4f4f31a03f53: Waiting
3381b4caf618: Waiting
5b5a13d57c39: Waiting
ca229918bf28: Waiting
6397ccfab52d: Waiting
b9ba03a85542: Waiting
7d611a1b3fca: Waiting
c0f30d4bde81: Waiting
49ac0bbe6c8e: Verifying Checksum
49ac0bbe6c8e: Download complete
d1983a67e104: Verifying Checksum
d1983a67e104: Download complete
3386e6af03b0: Verifying Checksum
3386e6af03b0: Download complete
1a0f3a523f04: Verifying Checksum
1a0f3a523f04: Download complete
c703e8b6fa19: Verifying Checksum
c703e8b6fa19: Download complete
9eafba7dcf77: Download complete
356023b5809f: Verifying Checksum
356023b5809f: Download complete
2d2350eddfd7: Verifying Checksum
2d2350eddfd7: Download complete
d7586a41e104: Verifying Checksum
d7586a41e104: Download complete
4fcd3032b2c3: Verifying Checksum
4fcd3032b2c3: Download complete
2c00e43340b3: Verifying Checksum
89234dcd9a9c: Verifying Checksum
89234dcd9a9c: Download complete
35cfe729a390: Verifying Checksum
35cfe729a390: Download complete
3386e6af03b0: Pull complete
49ac0bbe6c8e: Pull complete
d1983a67e104: Pull complete
cb0185d5728e: Verifying Checksum
cb0185d5728e: Download complete
1a0f3a523f04: Pull complete
3d6e58b109a9: Verifying Checksum
3d6e58b109a9: Download complete
b58aa3d6b872: Verifying Checksum
b58aa3d6b872: Download complete
cc59cdfeca2c: Verifying Checksum
cc59cdfeca2c: Download complete
c0f30d4bde81: Verifying Checksum
c0f30d4bde81: Download complete
489015e5df27: Verifying Checksum
489015e5df27: Download complete
ac53ee7d20d5: Verifying Checksum
ac53ee7d20d5: Download complete
6397ccfab52d: Verifying Checksum
6397ccfab52d: Download complete
c703e8b6fa19: Pull complete
4f4f31a03f53: Verifying Checksum
4f4f31a03f53: Download complete
3381b4caf618: Verifying Checksum
3381b4caf618: Download complete
187f7df4b589: Verifying Checksum
187f7df4b589: Download complete
5b5a13d57c39: Verifying Checksum
5b5a13d57c39: Download complete
b9ba03a85542: Download complete
7d611a1b3fca: Verifying Checksum
7d611a1b3fca: Download complete
ca229918bf28: Verifying Checksum
ca229918bf28: Download complete
35cfe729a390: Pull complete
9eafba7dcf77: Pull complete
356023b5809f: Pull complete
2d2350eddfd7: Pull complete
d7586a41e104: Pull complete
2c00e43340b3: Pull complete
4fcd3032b2c3: Pull complete
b58aa3d6b872: Pull complete
89234dcd9a9c: Pull complete
cb0185d5728e: Pull complete
3d6e58b109a9: Pull complete
cc59cdfeca2c: Pull complete
c0f30d4bde81: Pull complete
489015e5df27: Pull complete
187f7df4b589: Pull complete
ac53ee7d20d5: Pull complete
ca229918bf28: Pull complete
6397ccfab52d: Pull complete
4f4f31a03f53: Pull complete
3381b4caf618: Pull complete
5b5a13d57c39: Pull complete
b9ba03a85542: Pull complete
7d611a1b3fca: Pull complete
Digest: sha256:264654a37596a44b0668b8ce6ac41082d713f6ee150b3fc6425fa78cc64e4f20
Status: Downloaded newer image for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 109, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 96, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/cbd2d7fe-87fe-43a3-a0a0-1bef1f2b0c78).
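The `ValueError` at the bottom of the log comes from `pathlib` itself, not synthtool: `Path.glob()` accepts `**` only when it stands alone as a path component, so any exclude pattern with `**` embedded inside a component is rejected before any matching happens. A minimal reproduction (the pattern `"a**"` is illustrative — the actual offending exclude is not shown in the log):

```python
from pathlib import Path

root = Path(".")

# '**' as its own component is the recursive wildcard and is accepted:
ok = list(root.glob("**/*.py"))

# Embedded in a component, the pattern is rejected up front:
try:
    list(root.glob("a**"))
    err_msg = ""
except ValueError as exc:
    err_msg = str(exc)

# err_msg: "Invalid pattern: '**' can only be an entire path component"
```

Rewriting such a pattern as `a*/**` or `**/a*` (whichever matches the intent) avoids the error.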
Autosynth is still having trouble generating cloudbuild. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
3386e6af03b0: Pulling fs layer
49ac0bbe6c8e: Pulling fs layer
d1983a67e104: Pulling fs layer
1a0f3a523f04: Pulling fs layer
c703e8b6fa19: Pulling fs layer
35cfe729a390: Pulling fs layer
9eafba7dcf77: Pulling fs layer
356023b5809f: Pulling fs layer
2d2350eddfd7: Pulling fs layer
d7586a41e104: Pulling fs layer
2c00e43340b3: Pulling fs layer
4fcd3032b2c3: Pulling fs layer
b58aa3d6b872: Pulling fs layer
89234dcd9a9c: Pulling fs layer
cb0185d5728e: Pulling fs layer
3d6e58b109a9: Pulling fs layer
cc59cdfeca2c: Pulling fs layer
c0f30d4bde81: Pulling fs layer
489015e5df27: Pulling fs layer
c703e8b6fa19: Waiting
187f7df4b589: Pulling fs layer
ac53ee7d20d5: Pulling fs layer
ca229918bf28: Pulling fs layer
6397ccfab52d: Pulling fs layer
4f4f31a03f53: Pulling fs layer
3381b4caf618: Pulling fs layer
5b5a13d57c39: Pulling fs layer
b9ba03a85542: Pulling fs layer
7d611a1b3fca: Pulling fs layer
356023b5809f: Waiting
2d2350eddfd7: Waiting
d7586a41e104: Waiting
2c00e43340b3: Waiting
187f7df4b589: Waiting
ac53ee7d20d5: Waiting
4fcd3032b2c3: Waiting
b58aa3d6b872: Waiting
89234dcd9a9c: Waiting
cb0185d5728e: Waiting
3d6e58b109a9: Waiting
cc59cdfeca2c: Waiting
ca229918bf28: Waiting
c0f30d4bde81: Waiting
6397ccfab52d: Waiting
489015e5df27: Waiting
35cfe729a390: Waiting
4f4f31a03f53: Waiting
7d611a1b3fca: Waiting
3381b4caf618: Waiting
5b5a13d57c39: Waiting
b9ba03a85542: Waiting
1a0f3a523f04: Waiting
9eafba7dcf77: Waiting
d1983a67e104: Verifying Checksum
d1983a67e104: Download complete
49ac0bbe6c8e: Download complete
1a0f3a523f04: Verifying Checksum
1a0f3a523f04: Download complete
3386e6af03b0: Verifying Checksum
3386e6af03b0: Download complete
c703e8b6fa19: Verifying Checksum
c703e8b6fa19: Download complete
9eafba7dcf77: Verifying Checksum
9eafba7dcf77: Download complete
356023b5809f: Verifying Checksum
356023b5809f: Download complete
d7586a41e104: Verifying Checksum
d7586a41e104: Download complete
2d2350eddfd7: Verifying Checksum
2d2350eddfd7: Download complete
4fcd3032b2c3: Verifying Checksum
4fcd3032b2c3: Download complete
2c00e43340b3: Verifying Checksum
2c00e43340b3: Download complete
89234dcd9a9c: Verifying Checksum
89234dcd9a9c: Download complete
35cfe729a390: Verifying Checksum
35cfe729a390: Download complete
3386e6af03b0: Pull complete
49ac0bbe6c8e: Pull complete
cb0185d5728e: Verifying Checksum
cb0185d5728e: Download complete
3d6e58b109a9: Verifying Checksum
3d6e58b109a9: Download complete
d1983a67e104: Pull complete
1a0f3a523f04: Pull complete
b58aa3d6b872: Verifying Checksum
b58aa3d6b872: Download complete
c0f30d4bde81: Download complete
cc59cdfeca2c: Verifying Checksum
cc59cdfeca2c: Download complete
489015e5df27: Verifying Checksum
489015e5df27: Download complete
ac53ee7d20d5: Verifying Checksum
ac53ee7d20d5: Download complete
6397ccfab52d: Verifying Checksum
6397ccfab52d: Download complete
4f4f31a03f53: Verifying Checksum
4f4f31a03f53: Download complete
c703e8b6fa19: Pull complete
3381b4caf618: Verifying Checksum
3381b4caf618: Download complete
187f7df4b589: Verifying Checksum
187f7df4b589: Download complete
5b5a13d57c39: Verifying Checksum
5b5a13d57c39: Download complete
b9ba03a85542: Verifying Checksum
b9ba03a85542: Download complete
7d611a1b3fca: Verifying Checksum
7d611a1b3fca: Download complete
ca229918bf28: Verifying Checksum
ca229918bf28: Download complete
35cfe729a390: Pull complete
9eafba7dcf77: Pull complete
356023b5809f: Pull complete
2d2350eddfd7: Pull complete
d7586a41e104: Pull complete
2c00e43340b3: Pull complete
4fcd3032b2c3: Pull complete
b58aa3d6b872: Pull complete
89234dcd9a9c: Pull complete
cb0185d5728e: Pull complete
3d6e58b109a9: Pull complete
cc59cdfeca2c: Pull complete
c0f30d4bde81: Pull complete
489015e5df27: Pull complete
187f7df4b589: Pull complete
ac53ee7d20d5: Pull complete
ca229918bf28: Pull complete
6397ccfab52d: Pull complete
4f4f31a03f53: Pull complete
3381b4caf618: Pull complete
5b5a13d57c39: Pull complete
b9ba03a85542: Pull complete
7d611a1b3fca: Pull complete
Digest: sha256:264654a37596a44b0668b8ce6ac41082d713f6ee150b3fc6425fa78cc64e4f20
Status: Downloaded newer image for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 109, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 96, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/72bd8463-aa4a-4b2f-a674-f472babfb6c3).
Autosynth is still having trouble generating cloudbuild. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
3386e6af03b0: Pulling fs layer
49ac0bbe6c8e: Pulling fs layer
d1983a67e104: Pulling fs layer
1a0f3a523f04: Pulling fs layer
c703e8b6fa19: Pulling fs layer
35cfe729a390: Pulling fs layer
9eafba7dcf77: Pulling fs layer
356023b5809f: Pulling fs layer
2d2350eddfd7: Pulling fs layer
d7586a41e104: Pulling fs layer
2c00e43340b3: Pulling fs layer
4fcd3032b2c3: Pulling fs layer
b58aa3d6b872: Pulling fs layer
89234dcd9a9c: Pulling fs layer
cb0185d5728e: Pulling fs layer
3d6e58b109a9: Pulling fs layer
cc59cdfeca2c: Pulling fs layer
c0f30d4bde81: Pulling fs layer
489015e5df27: Pulling fs layer
187f7df4b589: Pulling fs layer
ac53ee7d20d5: Pulling fs layer
ca229918bf28: Pulling fs layer
6397ccfab52d: Pulling fs layer
4f4f31a03f53: Pulling fs layer
3381b4caf618: Pulling fs layer
5b5a13d57c39: Pulling fs layer
b9ba03a85542: Pulling fs layer
7d611a1b3fca: Pulling fs layer
1a0f3a523f04: Waiting
c703e8b6fa19: Waiting
35cfe729a390: Waiting
9eafba7dcf77: Waiting
3d6e58b109a9: Waiting
356023b5809f: Waiting
2d2350eddfd7: Waiting
d7586a41e104: Waiting
2c00e43340b3: Waiting
4fcd3032b2c3: Waiting
cc59cdfeca2c: Waiting
b58aa3d6b872: Waiting
4f4f31a03f53: Waiting
c0f30d4bde81: Waiting
489015e5df27: Waiting
3381b4caf618: Waiting
5b5a13d57c39: Waiting
187f7df4b589: Waiting
ac53ee7d20d5: Waiting
ca229918bf28: Waiting
b9ba03a85542: Waiting
89234dcd9a9c: Waiting
7d611a1b3fca: Waiting
6397ccfab52d: Waiting
cb0185d5728e: Waiting
49ac0bbe6c8e: Verifying Checksum
49ac0bbe6c8e: Download complete
d1983a67e104: Verifying Checksum
d1983a67e104: Download complete
1a0f3a523f04: Verifying Checksum
1a0f3a523f04: Download complete
3386e6af03b0: Verifying Checksum
3386e6af03b0: Download complete
c703e8b6fa19: Verifying Checksum
c703e8b6fa19: Download complete
9eafba7dcf77: Verifying Checksum
9eafba7dcf77: Download complete
356023b5809f: Verifying Checksum
356023b5809f: Download complete
2d2350eddfd7: Verifying Checksum
2d2350eddfd7: Download complete
d7586a41e104: Verifying Checksum
d7586a41e104: Download complete
4fcd3032b2c3: Download complete
2c00e43340b3: Verifying Checksum
2c00e43340b3: Download complete
35cfe729a390: Verifying Checksum
35cfe729a390: Download complete
3386e6af03b0: Pull complete
89234dcd9a9c: Verifying Checksum
89234dcd9a9c: Download complete
49ac0bbe6c8e: Pull complete
d1983a67e104: Pull complete
1a0f3a523f04: Pull complete
3d6e58b109a9: Verifying Checksum
3d6e58b109a9: Download complete
cb0185d5728e: Verifying Checksum
cb0185d5728e: Download complete
b58aa3d6b872: Verifying Checksum
b58aa3d6b872: Download complete
cc59cdfeca2c: Download complete
c0f30d4bde81: Verifying Checksum
c0f30d4bde81: Download complete
489015e5df27: Verifying Checksum
489015e5df27: Download complete
ac53ee7d20d5: Download complete
c703e8b6fa19: Pull complete
6397ccfab52d: Verifying Checksum
6397ccfab52d: Download complete
4f4f31a03f53: Verifying Checksum
4f4f31a03f53: Download complete
187f7df4b589: Verifying Checksum
187f7df4b589: Download complete
3381b4caf618: Download complete
5b5a13d57c39: Verifying Checksum
5b5a13d57c39: Download complete
b9ba03a85542: Download complete
7d611a1b3fca: Verifying Checksum
7d611a1b3fca: Download complete
ca229918bf28: Verifying Checksum
ca229918bf28: Download complete
35cfe729a390: Pull complete
9eafba7dcf77: Pull complete
356023b5809f: Pull complete
2d2350eddfd7: Pull complete
d7586a41e104: Pull complete
2c00e43340b3: Pull complete
4fcd3032b2c3: Pull complete
b58aa3d6b872: Pull complete
89234dcd9a9c: Pull complete
cb0185d5728e: Pull complete
3d6e58b109a9: Pull complete
cc59cdfeca2c: Pull complete
c0f30d4bde81: Pull complete
489015e5df27: Pull complete
187f7df4b589: Pull complete
ac53ee7d20d5: Pull complete
ca229918bf28: Pull complete
6397ccfab52d: Pull complete
4f4f31a03f53: Pull complete
3381b4caf618: Pull complete
5b5a13d57c39: Pull complete
b9ba03a85542: Pull complete
7d611a1b3fca: Pull complete
Digest: sha256:264654a37596a44b0668b8ce6ac41082d713f6ee150b3fc6425fa78cc64e4f20
Status: Downloaded newer image for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 109, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 96, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/71aad1bd-5408-4718-b0ea-5e1a0055bfc6).
Autosynth is still having trouble generating cloudbuild. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-cloudbuild'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:264654a37596a44b0668b8ce6ac41082d713f6ee150b3fc6425fa78cc64e4f20
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudbuild/artman_cloudbuild.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/devtools/cloudbuild/v1/cloudbuild.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto/cloudbuild.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudbuild-v1/google/cloud/cloudbuild_v1/proto.
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 109, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 96, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/3cf73926-ecfb-4d28-998b-2b45a4de79e8).
| 2020-01-08T20:52:40Z | [] | [] |
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/cloudbuild/synth.py", line 42, in <module>
'README.rst'
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in move
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 156, in <listcomp>
_tracked_paths.relativize(e) for e in _expand_paths(excludes, source)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 58, in _expand_paths
for p in root.glob(path)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/transforms.py", line 57, in <genexpr>
p
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 1080, in glob
selector = _make_selector(tuple(pattern_parts))
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/pathlib.py", line 459, in _make_selector
raise ValueError("Invalid pattern: '**' can only be an entire path component")
ValueError: Invalid pattern: '**' can only be an entire path component
| 5,662 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-10106 | d897d56bce03d1fda98b79afb08264e51d46c421 | diff --git a/spanner/google/cloud/spanner_v1/instance.py b/spanner/google/cloud/spanner_v1/instance.py
--- a/spanner/google/cloud/spanner_v1/instance.py
+++ b/spanner/google/cloud/spanner_v1/instance.py
@@ -262,13 +262,13 @@ def update(self):
.. note::
- Updates the ``display_name`` and ``node_count``. To change those
- values before updating, set them via
+ Updates the ``display_name`` and ``node_count``. To change those
+ values before updating, set them via
- .. code:: python
+ .. code:: python
- instance.display_name = 'New display name'
- instance.node_count = 5
+ instance.display_name = 'New display name'
+ instance.node_count = 5
before calling :meth:`update`.
| Spanner: Building the docs fails
Building the docs emits a warning, which is interpreted as an error:
```
$ nox -f noxfile.py -s docs
...
Traceback (most recent call last):
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/cmd/build.py", line 276, in build_main
app.build(args.force_all, filenames)
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/application.py", line 353, in build
self.builder.build_update()
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/builders/__init__.py", line 299, in build_update
len(to_build))
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/builders/__init__.py", line 361, in build
self.write(docnames, list(updated_docnames), method)
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/builders/__init__.py", line 535, in write
self._write_serial(sorted(docnames))
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/builders/__init__.py", line 545, in _write_serial
self.write_doc(docname, doctree)
File "/usr/local/lib/python3.7/contextlib.py", line 119, in __exit__
next(self.gen)
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/util/logging.py", line 219, in pending_warnings
memhandler.flushTo(logger)
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/util/logging.py", line 184, in flushTo
logger.handle(record)
File "/usr/local/lib/python3.7/logging/__init__.py", line 1478, in handle
self.callHandlers(record)
File "/usr/local/lib/python3.7/logging/__init__.py", line 1540, in callHandlers
hdlr.handle(record)
File "/usr/local/lib/python3.7/logging/__init__.py", line 850, in handle
rv = self.filter(record)
File "/usr/local/lib/python3.7/logging/__init__.py", line 712, in filter
result = f.filter(record)
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/util/logging.py", line 404, in filter
raise SphinxWarning(location + ":" + message)
sphinx.errors.SphinxWarning: /home/peter/workspace/google-cloud-python/spanner/docs/instance-api.rst:11:Could not lex literal_block as "python". Highlighting skipped.
```
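The `Could not lex literal_block as "python"` warning is Sphinx failing to highlight the snippet inside the docstring's `.. note::` directive: when the directive body's indentation is inconsistent, surrounding prose gets swallowed into the literal block and no longer parses as Python. A minimal sketch of a consistently indented docstring — the note text and snippet follow the diff above, while the exact indentation widths are my own illustration:

```python
def update(self):
    """Update this instance.

    .. note::

        Updates the ``display_name`` and ``node_count``. To change those
        values before updating, set them via

        .. code:: python

            instance.display_name = 'New display name'
            instance.node_count = 5

        before calling :meth:`update`.
    """
```

Every line of the note body sits at one indentation level, and the `.. code:: python` body is indented one level further, so Sphinx lexes exactly the two assignment lines as Python.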
| 2020-01-13T10:52:34Z | [] | [] |
Traceback (most recent call last):
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/cmd/build.py", line 276, in build_main
app.build(args.force_all, filenames)
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/application.py", line 353, in build
self.builder.build_update()
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/builders/__init__.py", line 299, in build_update
len(to_build))
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/builders/__init__.py", line 361, in build
self.write(docnames, list(updated_docnames), method)
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/builders/__init__.py", line 535, in write
self._write_serial(sorted(docnames))
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/builders/__init__.py", line 545, in _write_serial
self.write_doc(docname, doctree)
File "/usr/local/lib/python3.7/contextlib.py", line 119, in __exit__
next(self.gen)
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/util/logging.py", line 219, in pending_warnings
memhandler.flushTo(logger)
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/util/logging.py", line 184, in flushTo
logger.handle(record)
File "/usr/local/lib/python3.7/logging/__init__.py", line 1478, in handle
self.callHandlers(record)
File "/usr/local/lib/python3.7/logging/__init__.py", line 1540, in callHandlers
hdlr.handle(record)
File "/usr/local/lib/python3.7/logging/__init__.py", line 850, in handle
rv = self.filter(record)
File "/usr/local/lib/python3.7/logging/__init__.py", line 712, in filter
result = f.filter(record)
File "/home/peter/workspace/google-cloud-python/spanner/.nox/docs/lib/python3.7/site-packages/sphinx/util/logging.py", line 404, in filter
raise SphinxWarning(location + ":" + message)
sphinx.errors.SphinxWarning: /home/peter/workspace/google-cloud-python/spanner/docs/instance-api.rst:11:Could not lex literal_block as "python". Highlighting skipped.
| 5,666 |
||||
googleapis/google-cloud-python | googleapis__google-cloud-python-10355 | 71705da53be1397ac29aad5f0f8af181fd2b5e52 | diff --git a/asset/synth.py b/asset/synth.py
--- a/asset/synth.py
+++ b/asset/synth.py
@@ -29,15 +29,13 @@
for version in versions:
if version == "v1p1beta1":
config_path = "/google/cloud/asset/v1p1beta1/artman_cloudasset_v1p1beta1.yaml"
- artman_output_name = f"cloudasset-{version}"
else:
config_path = f"/google/cloud/asset/artman_cloudasset_{version}.yaml"
- artman_output_name=f"asset-{version}"
library = gapic.py_library(
"asset",
version,
config_path=config_path,
- artman_output_name=artman_output_name,
+ artman_output_name=f"asset-{version}",
include_protos=True,
)
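The patch folds the two `artman_output_name` assignments into one: after the change every version, including `v1p1beta1`, writes to `asset-<version>`, matching the directory artman actually produces (see the `Generated code into .../asset-v1beta1` line in the log below). A small sketch of the resulting per-version selection — the `plan` helper is my own illustration, not part of `synth.py`:

```python
def plan(version):
    """Mirror the corrected synth.py logic: pick the artman config for one
    API version and the output directory name artman will use."""
    if version == "v1p1beta1":
        # Only this version keeps its config under a per-version subdirectory.
        config_path = "/google/cloud/asset/v1p1beta1/artman_cloudasset_v1p1beta1.yaml"
    else:
        config_path = f"/google/cloud/asset/artman_cloudasset_{version}.yaml"
    # The output name no longer special-cases v1p1beta1.
    return config_path, f"asset-{version}"
```

With the special case removed, the output name can be passed inline to `gapic.py_library` as the diff shows.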
| Synthesis failed for asset
Hello! Autosynth couldn't regenerate asset. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-asset'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/asset/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
0a01a72a686c: Pulling fs layer
cc899a5544da: Pulling fs layer
19197c550755: Pulling fs layer
716d454e56b6: Pulling fs layer
7b23138015ed: Pulling fs layer
1ab82626a034: Pulling fs layer
fbfa3130dc78: Pulling fs layer
Digest: sha256:19e945954fc960a4bdfee6cb34695898ab21a8cf0bac063ee39b91f00a1faec8
Status: Downloaded newer image for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/cloud/asset/artman_cloudasset_v1beta1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1beta1/assets.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto/assets.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1beta1/asset_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto/asset_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto.
synthtool > Running generator for google/cloud/asset/artman_cloudasset_v1p2beta1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1p2beta1/assets.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto/assets.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1p2beta1/asset_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto/asset_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto.
synthtool > Running generator for google/cloud/asset/v1p1beta1/artman_cloudasset_v1p1beta1.yaml.
synthtool > Removing obsolete file google/cloud/asset_v1p1beta1/gapic/asset_service_client_config.py...
synthtool > Removing obsolete file google/cloud/asset_v1/proto/__init__.py...
synthtool > Removing obsolete file tests/unit/gapic/v1p1beta1/test_asset_service_client_v1p1beta1.py...
synthtool > Removing obsolete file google/cloud/asset_v1/gapic/asset_service_client.py...
synthtool > Removing obsolete file google/cloud/asset_v1/proto/assets_pb2_grpc.py...
synthtool > Removing obsolete file google/cloud/asset_v1/proto/asset_service.proto...
synthtool > Removing obsolete file google/cloud/asset_v1/gapic/__init__.py...
synthtool > Removing obsolete file google/cloud/asset_v1/proto/asset_service_pb2.py...
synthtool > Removing obsolete file .coveragerc...
synthtool > Removing obsolete file .flake8...
synthtool > Removing obsolete file google/cloud/asset_v1p1beta1/__init__.py...
synthtool > Removing obsolete file google/cloud/asset_v1/proto/assets_pb2.py...
synthtool > Removing obsolete file google/cloud/asset_v1p1beta1/proto/asset_service_pb2.py...
synthtool > Removing obsolete file google/cloud/asset_v1p1beta1/proto/asset_service_pb2_grpc.py...
synthtool > Removing obsolete file docs/_templates/layout.html...
synthtool > Removing obsolete file google/cloud/asset_v1/gapic/transports/asset_service_grpc_transport.py...
synthtool > Removing obsolete file google/cloud/asset_v1/proto/assets.proto...
synthtool > Removing obsolete file google/cloud/asset_v1p1beta1/proto/assets_pb2.py...
synthtool > Removing obsolete file google/cloud/asset_v1/__init__.py...
synthtool > Removing obsolete file google/cloud/asset_v1/gapic/asset_service_client_config.py...
synthtool > Removing obsolete file tests/unit/gapic/v1/test_asset_service_client_v1.py...
synthtool > Removing obsolete file google/cloud/asset_v1p1beta1/gapic/transports/asset_service_grpc_transport.py...
synthtool > Removing obsolete file docs/gapic/v1/types.rst...
synthtool > Removing obsolete file google/cloud/asset_v1/gapic/enums.py...
synthtool > Removing obsolete file docs/gapic/v1/api.rst...
synthtool > Removing obsolete file google/cloud/asset_v1p1beta1/gapic/asset_service_client.py...
synthtool > Removing obsolete file google/cloud/asset_v1p1beta1/proto/assets_pb2_grpc.py...
synthtool > Removing obsolete file google/cloud/asset_v1p1beta1/types.py...
synthtool > Removing obsolete file google/cloud/asset_v1/types.py...
synthtool > Removing obsolete file google/cloud/asset_v1/gapic/transports/__init__.py...
synthtool > Removing obsolete file google/cloud/asset_v1/proto/asset_service_pb2_grpc.py...
synthtool > Removing obsolete file docs/_static/custom.css...
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 99, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 91, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/asset/synth.py", line 41, in <module>
include_protos=True,
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 149, in _generate_code
f"Unable to find generated output of artman: {genfiles}."
FileNotFoundError: Unable to find generated output of artman: /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudasset-v1p1beta1.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/9bf75ddc-d643-409d-b0e5-6e6330408b26).
| Autosynth is still having trouble generating asset. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-asset'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/asset/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:19e945954fc960a4bdfee6cb34695898ab21a8cf0bac063ee39b91f00a1faec8
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/cloud/asset/artman_cloudasset_v1beta1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1beta1/assets.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto/assets.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1beta1/asset_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto/asset_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto.
synthtool > Running generator for google/cloud/asset/artman_cloudasset_v1p2beta1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1p2beta1/assets.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto/assets.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1p2beta1/asset_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto/asset_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto.
synthtool > Running generator for google/cloud/asset/v1p1beta1/artman_cloudasset_v1p1beta1.yaml.
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 99, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 91, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/asset/synth.py", line 41, in <module>
include_protos=True,
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 149, in _generate_code
f"Unable to find generated output of artman: {genfiles}."
FileNotFoundError: Unable to find generated output of artman: /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudasset-v1p1beta1.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/25bbdf74-44ce-4b1b-8cf8-5b9a7e8e76c0).
Autosynth is still having trouble generating asset. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-asset'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/asset/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:19e945954fc960a4bdfee6cb34695898ab21a8cf0bac063ee39b91f00a1faec8
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/cloud/asset/artman_cloudasset_v1beta1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1beta1/assets.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto/assets.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1beta1/asset_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto/asset_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto.
synthtool > Running generator for google/cloud/asset/artman_cloudasset_v1p2beta1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1p2beta1/assets.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto/assets.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1p2beta1/asset_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto/asset_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto.
synthtool > Running generator for google/cloud/asset/v1p1beta1/artman_cloudasset_v1p1beta1.yaml.
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 99, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 91, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/asset/synth.py", line 41, in <module>
include_protos=True,
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 149, in _generate_code
f"Unable to find generated output of artman: {genfiles}."
FileNotFoundError: Unable to find generated output of artman: /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudasset-v1p1beta1.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/db7ea246-5ad3-4221-bd54-5b2500e694d1).
Autosynth is still having trouble generating asset. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-asset'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/asset/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:19e945954fc960a4bdfee6cb34695898ab21a8cf0bac063ee39b91f00a1faec8
Status: Downloaded newer image for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/cloud/asset/artman_cloudasset_v1beta1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1beta1/assets.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto/assets.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1beta1/asset_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto/asset_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto.
synthtool > Running generator for google/cloud/asset/artman_cloudasset_v1p2beta1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1p2beta1/assets.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto/assets.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1p2beta1/asset_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto/asset_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto.
synthtool > Running generator for google/cloud/asset/v1p1beta1/artman_cloudasset_v1p1beta1.yaml.
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 99, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 91, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/asset/synth.py", line 41, in <module>
include_protos=True,
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 149, in _generate_code
f"Unable to find generated output of artman: {genfiles}."
FileNotFoundError: Unable to find generated output of artman: /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudasset-v1p1beta1.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/a7f2184c-06fa-446b-89ad-51cbc8dde6c7).
Autosynth is still having trouble generating asset. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-asset'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/asset/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:19e945954fc960a4bdfee6cb34695898ab21a8cf0bac063ee39b91f00a1faec8
Status: Downloaded newer image for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/cloud/asset/artman_cloudasset_v1beta1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1beta1/assets.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto/assets.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1beta1/asset_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto/asset_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto.
synthtool > Running generator for google/cloud/asset/artman_cloudasset_v1p2beta1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1p2beta1/assets.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto/assets.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1p2beta1/asset_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto/asset_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto.
synthtool > Running generator for google/cloud/asset/v1p1beta1/artman_cloudasset_v1p1beta1.yaml.
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 99, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 91, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/asset/synth.py", line 41, in <module>
include_protos=True,
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 149, in _generate_code
f"Unable to find generated output of artman: {genfiles}."
FileNotFoundError: Unable to find generated output of artman: /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudasset-v1p1beta1.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/f8b4b4b8-80cd-4edf-8a98-24b0dc85d02b).
Autosynth is still having trouble generating asset. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-asset'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/asset/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:19e945954fc960a4bdfee6cb34695898ab21a8cf0bac063ee39b91f00a1faec8
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/cloud/asset/artman_cloudasset_v1beta1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1beta1/assets.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto/assets.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1beta1/asset_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto/asset_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1beta1/google/cloud/asset_v1beta1/proto.
synthtool > Running generator for google/cloud/asset/artman_cloudasset_v1p2beta1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1p2beta1/assets.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto/assets.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/asset/v1p2beta1/asset_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto/asset_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/asset-v1p2beta1/google/cloud/asset_v1p2beta1/proto.
synthtool > Running generator for google/cloud/asset/v1p1beta1/artman_cloudasset_v1p1beta1.yaml.
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 99, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 91, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/asset/synth.py", line 41, in <module>
include_protos=True,
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 149, in _generate_code
f"Unable to find generated output of artman: {genfiles}."
FileNotFoundError: Unable to find generated output of artman: /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudasset-v1p1beta1.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/89962669-634c-40fb-83d2-b8156ca83bc0).
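The `FileNotFoundError` in the log above is raised by a directory-existence check in synthtool's `gapic_generator.py` (line 149 in the traceback). A minimal sketch of that check — function and parameter names here are inferred from the traceback, not synthtool's actual implementation: artman wrote `asset-v1beta1`-style directories, but for v1p1beta1 the generator went looking for `cloudasset-v1p1beta1`, which was never created.

```python
from pathlib import Path

# Sketch of the failing check from the traceback above
# (gapic_generator.py:149). Names are assumptions based on the log.
def find_artman_output(genfiles_root: str, language: str, name: str) -> Path:
    genfiles = Path(genfiles_root) / language / name
    if not genfiles.exists():
        raise FileNotFoundError(
            f"Unable to find generated output of artman: {genfiles}."
        )
    return genfiles
```

The mismatch suggests the v1p1beta1 artman yaml names its output directory differently from what synthtool expects, so the check fails even though code for the other versions was generated successfully.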
created_at: 2020-02-06T17:30:29Z | FAIL_TO_PASS: [] | PASS_TO_PASS: []
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 99, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 91, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/asset/synth.py", line 41, in <module>
include_protos=True,
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 149, in _generate_code
f"Unable to find generated output of artman: {genfiles}."
FileNotFoundError: Unable to find generated output of artman: /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/cloudasset-v1p1beta1.
| 5,675 |
|||
repo: googleapis/google-cloud-python
instance_id: googleapis__google-cloud-python-10356
base_commit: 71705da53be1397ac29aad5f0f8af181fd2b5e52
--- a/recommender/synth.py
+++ b/recommender/synth.py
@@ -19,7 +19,7 @@
from synthtool import gcp
from synthtool.languages import python
-gapic = gcp.GAPICBazel()
+gapic = gcp.GAPICGenerator()
versions = ["v1beta1", "v1"]
common = gcp.CommonTemplates()
@@ -28,7 +28,11 @@
# Generate Cloud Recommender
# ----------------------------------------------------------------------------
for version in versions:
- library = gapic.py_library("recommender", version)
+ library = gapic.py_library(
+ "recommender",
+ version,
+ config_path=f"/google/cloud/recommender/{version}/artman_recommender_{version}.yaml",
+ )
s.move(library, excludes=["nox.py", "docs/index.rst", "README.rst", "setup.py"])
# Fix docstring with regex pattern that breaks docgen
@@ -37,7 +41,12 @@
# Fix more regex in docstrings
s.replace("google/**/*_pb2.py", ":math:`(/)", "\g<1>")
s.replace("google/**/*_pb2.py", "`/\.", "/.")
-s.replace("google/**/*_pb2.py", "(regex\s+)(/.*?/)\.", "\g<1>``\g<2>``.", flags=re.MULTILINE | re.DOTALL,)
+s.replace(
+ "google/**/*_pb2.py",
+ "(regex\s+)(/.*?/)\.",
+ "\g<1>``\g<2>``.",
+ flags=re.MULTILINE | re.DOTALL,
+)
# Fix docstring with JSON example by wrapping with backticks
s.replace(
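The regex change in the hunk above wraps `/.../` literals that follow the word "regex" in double backticks so docgen does not trip on them. Assuming synthtool's `s.replace` applies the pattern to each matching file's contents the way `re.sub` does (an assumption for this sketch), the substitution looks like:

```python
import re

# Sketch of the patched s.replace call from the diff above. Assumes
# s.replace behaves like re.sub over each matching file's contents.
def escape_regex_docstring(text: str) -> str:
    # Wrap a "/.../" regex literal following the word "regex" in
    # double backticks, preserving the trailing period.
    return re.sub(
        r"(regex\s+)(/.*?/)\.",
        r"\g<1>``\g<2>``.",
        text,
        flags=re.MULTILINE | re.DOTALL,
    )
```

For example, a generated docstring containing `regex /^a$/.` becomes `regex ``/^a$/``.`, which renders cleanly in Sphinx.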
Synthesis failed for recommender
Hello! Autosynth couldn't regenerate recommender. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-recommender'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/recommender/synth.py.
synthtool > Ensuring dependencies.
synthtool > Cloning googleapis.
synthtool > Generating code for: google/cloud/recommender/v1beta1.
synthtool > Generated code into /tmpfs/tmp/tmpzcw7c9x0.
synthtool > Generating code for: google/cloud/recommender/v1.
synthtool > Generated code into /tmpfs/tmp/tmp582105as.
synthtool > Replaced '(/\\^.*\\$/)' in google/cloud/recommender_v1beta1/gapic/recommender_client.py.
synthtool > Replaced ':math:`(/)' in google/cloud/recommender_v1beta1/proto/recommender_service_pb2.py.
synthtool > Replaced ':math:`(/)' in google/cloud/recommender_v1/proto/recommender_service_pb2.py.
synthtool > Replaced '`/\\.' in google/cloud/recommender_v1beta1/proto/recommender_service_pb2.py.
synthtool > Replaced '`/\\.' in google/cloud/recommender_v1/proto/recommender_service_pb2.py.
synthtool > Replaced '(regex\\s+)(/.*?/)\\.' in google/cloud/recommender_v1beta1/proto/recommender_service_pb2.py.
synthtool > Replaced '(regex\\s+)(/.*?/)\\.' in google/cloud/recommender_v1/proto/recommender_service_pb2.py.
synthtool > Replaced '( - Example: )(\\{.*?\\})' in google/cloud/recommender_v1beta1/proto/recommendation_pb2.py.
synthtool > Replaced '( - Example: )(\\{.*?\\})' in google/cloud/recommender_v1/proto/recommendation_pb2.py.
synthtool > Replaced '(\\# -\\*- coding: utf-8 -\\*-\\n)(\\# Generated by the protocol buffer compiler\\. DO NOT EDIT!.*?import sys)' in google/cloud/recommender_v1beta1/proto/recommender_service_pb2.py.
synthtool > Replaced '(\\# -\\*- coding: utf-8 -\\*-\\n)(\\# Generated by the protocol buffer compiler\\. DO NOT EDIT!.*?import sys)' in google/cloud/recommender_v1beta1/proto/recommendation_pb2.py.
synthtool > Replaced '(\\# -\\*- coding: utf-8 -\\*-\\n)(\\# Generated by the protocol buffer compiler\\. DO NOT EDIT!.*?import sys)' in google/cloud/recommender_v1/proto/recommender_service_pb2.py.
synthtool > Replaced '(\\# -\\*- coding: utf-8 -\\*-\\n)(\\# Generated by the protocol buffer compiler\\. DO NOT EDIT!.*?import sys)' in google/cloud/recommender_v1/proto/recommendation_pb2.py.
synthtool > Replaced '(\\# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!)\n(import grpc)' in google/cloud/recommender_v1beta1/proto/recommendation_pb2_grpc.py.
synthtool > Replaced '(\\# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!)\n(import grpc)' in google/cloud/recommender_v1beta1/proto/recommender_service_pb2_grpc.py.
synthtool > Replaced '(\\# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!)\n(import grpc)' in google/cloud/recommender_v1/proto/recommendation_pb2_grpc.py.
synthtool > Replaced '(\\# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!)\n(import grpc)' in google/cloud/recommender_v1/proto/recommender_service_pb2_grpc.py.
.coveragerc
.flake8
MANIFEST.in
docs/_static/custom.css
docs/_templates/layout.html
noxfile.py.j2
setup.cfg
synthtool > Replaced '\\["2\\.7", ' in noxfile.py.
Running session blacken
Creating virtual environment (virtualenv) using python3.6 in .nox/blacken
pip install black==19.3b0
black docs google tests noxfile.py setup.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/__init__.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/__init__.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/__init__.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/gapic/enums.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/gapic/recommender_client_config.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/gapic/transports/recommender_grpc_transport.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/proto/recommendation_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/docs/conf.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/proto/recommender_service_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/types.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/__init__.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/gapic/enums.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/gapic/recommender_client.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/gapic/recommender_client_config.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/gapic/transports/recommender_grpc_transport.py
error: cannot format /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/proto/recommender_service_pb2.py: Cannot parse: 435:10: __doc__ = """Request for the ``ListRecommendations`` method.
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/proto/recommendation_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/gapic/recommender_client.py
error: cannot format /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/proto/recommendation_pb2.py: Cannot parse: 681:10: __doc__ = """A recommendation along with a suggested action. E.g., a
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/types.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/proto/recommender_service_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/noxfile.py
error: cannot format /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/proto/recommender_service_pb2.py: Cannot parse: 435:10: __doc__ = """Request for the ``ListRecommendations`` method.
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/tests/unit/gapic/v1/test_recommender_client_v1.py
error: cannot format /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/proto/recommendation_pb2.py: Cannot parse: 681:10: __doc__ = """A recommendation along with a suggested action. E.g., a
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/tests/unit/gapic/v1beta1/test_recommender_client_v1beta1.py
All done! 💥 💔 💥
23 files reformatted, 7 files left unchanged, 4 files failed to reformat.
Command black docs google tests noxfile.py setup.py failed with exit code 123
Session blacken failed.
synthtool > Failed executing nox -s blacken:
None
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 99, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 91, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/recommender/synth.py", line 62, in <module>
s.shell.run(["nox", "-s", "blacken"], hide_output=False)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['nox', '-s', 'blacken']' returned non-zero exit status 1.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/1e052f84-6f46-4191-b7b9-e4a59f7048aa).
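The blacken session above fails on generated `*_pb2.py` files that black 19.3b0 cannot parse. A rough, stdlib-only pre-check — a sketch, not the project's actual tooling — would scan the generated protobuf modules before running the formatting session. Note this is only an approximation: black uses its own lib2to3-based parser, so a file that passes `ast.parse` can still fail black's "Cannot parse" check, as the `*_pb2.py` files in this log did.

```python
import ast
from pathlib import Path

# Rough pre-check: find generated protobuf modules that are not even
# valid Python per ast.parse. Black's own parser is stricter, so this
# only catches a subset of "Cannot parse" failures.
def unparseable_files(root: str):
    bad = []
    for path in Path(root).rglob("*_pb2.py"):
        try:
            ast.parse(path.read_text(encoding="utf-8"))
        except SyntaxError:
            bad.append(path)
    return bad
```

Running such a check before `nox -s blacken` would surface broken generated files with a clearer report than black's per-file errors.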
Autosynth is still having trouble generating recommender. :sob:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-recommender'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/recommender/synth.py.
synthtool > Ensuring dependencies.
synthtool > Cloning googleapis.
synthtool > Generating code for: google/cloud/recommender/v1beta1.
synthtool > Generated code into /tmpfs/tmp/tmpfdo9z7ha.
synthtool > Generating code for: google/cloud/recommender/v1.
synthtool > Generated code into /tmpfs/tmp/tmp35xhc2pe.
synthtool > Replaced '(/\\^.*\\$/)' in google/cloud/recommender_v1beta1/gapic/recommender_client.py.
synthtool > Replaced ':math:`(/)' in google/cloud/recommender_v1beta1/proto/recommender_service_pb2.py.
synthtool > Replaced ':math:`(/)' in google/cloud/recommender_v1/proto/recommender_service_pb2.py.
synthtool > Replaced '`/\\.' in google/cloud/recommender_v1beta1/proto/recommender_service_pb2.py.
synthtool > Replaced '`/\\.' in google/cloud/recommender_v1/proto/recommender_service_pb2.py.
synthtool > Replaced '(regex\\s+)(/.*?/)\\.' in google/cloud/recommender_v1beta1/proto/recommender_service_pb2.py.
synthtool > Replaced '(regex\\s+)(/.*?/)\\.' in google/cloud/recommender_v1/proto/recommender_service_pb2.py.
synthtool > Replaced '( - Example: )(\\{.*?\\})' in google/cloud/recommender_v1beta1/proto/recommendation_pb2.py.
synthtool > Replaced '( - Example: )(\\{.*?\\})' in google/cloud/recommender_v1/proto/recommendation_pb2.py.
synthtool > Replaced '(\\# -\\*- coding: utf-8 -\\*-\\n)(\\# Generated by the protocol buffer compiler\\. DO NOT EDIT!.*?import sys)' in google/cloud/recommender_v1beta1/proto/recommender_service_pb2.py.
synthtool > Replaced '(\\# -\\*- coding: utf-8 -\\*-\\n)(\\# Generated by the protocol buffer compiler\\. DO NOT EDIT!.*?import sys)' in google/cloud/recommender_v1beta1/proto/recommendation_pb2.py.
synthtool > Replaced '(\\# -\\*- coding: utf-8 -\\*-\\n)(\\# Generated by the protocol buffer compiler\\. DO NOT EDIT!.*?import sys)' in google/cloud/recommender_v1/proto/recommender_service_pb2.py.
synthtool > Replaced '(\\# -\\*- coding: utf-8 -\\*-\\n)(\\# Generated by the protocol buffer compiler\\. DO NOT EDIT!.*?import sys)' in google/cloud/recommender_v1/proto/recommendation_pb2.py.
synthtool > Replaced '(\\# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!)\n(import grpc)' in google/cloud/recommender_v1beta1/proto/recommendation_pb2_grpc.py.
synthtool > Replaced '(\\# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!)\n(import grpc)' in google/cloud/recommender_v1beta1/proto/recommender_service_pb2_grpc.py.
synthtool > Replaced '(\\# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!)\n(import grpc)' in google/cloud/recommender_v1/proto/recommendation_pb2_grpc.py.
synthtool > Replaced '(\\# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!)\n(import grpc)' in google/cloud/recommender_v1/proto/recommender_service_pb2_grpc.py.
.coveragerc
.flake8
MANIFEST.in
docs/_static/custom.css
docs/_templates/layout.html
noxfile.py.j2
setup.cfg
synthtool > Replaced '\\["2\\.7", ' in noxfile.py.
Running session blacken
Creating virtual environment (virtualenv) using python3.6 in .nox/blacken
pip install black==19.3b0
black docs google tests noxfile.py setup.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/__init__.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/__init__.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/__init__.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/gapic/enums.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/gapic/recommender_client_config.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/gapic/transports/recommender_grpc_transport.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/proto/recommendation_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/docs/conf.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/proto/recommender_service_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/types.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/__init__.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/gapic/enums.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/gapic/recommender_client.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/gapic/recommender_client_config.py
error: cannot format /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/proto/recommender_service_pb2.py: Cannot parse: 435:10: __doc__ = """Request for the ``ListRecommendations`` method.
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/gapic/transports/recommender_grpc_transport.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/proto/recommendation_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/gapic/recommender_client.py
error: cannot format /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1/proto/recommendation_pb2.py: Cannot parse: 681:10: __doc__ = """A recommendation along with a suggested action. E.g., a
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/types.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/proto/recommender_service_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/noxfile.py
error: cannot format /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/proto/recommender_service_pb2.py: Cannot parse: 435:10: __doc__ = """Request for the ``ListRecommendations`` method.
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/tests/unit/gapic/v1beta1/test_recommender_client_v1beta1.py
error: cannot format /tmpfs/src/git/autosynth/working_repo/recommender/google/cloud/recommender_v1beta1/proto/recommendation_pb2.py: Cannot parse: 681:10: __doc__ = """A recommendation along with a suggested action. E.g., a
reformatted /tmpfs/src/git/autosynth/working_repo/recommender/tests/unit/gapic/v1/test_recommender_client_v1.py
All done! 💥 💔 💥
23 files reformatted, 7 files left unchanged, 4 files failed to reformat.
Command black docs google tests noxfile.py setup.py failed with exit code 123
Session blacken failed.
synthtool > Failed executing nox -s blacken:
None
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 99, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 91, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/recommender/synth.py", line 62, in <module>
s.shell.run(["nox", "-s", "blacken"], hide_output=False)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['nox', '-s', 'blacken']' returned non-zero exit status 1.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/32d0880b-ca4d-4377-a663-8b51a8d5abbc).
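The traceback bottoms out in the standard library's `subprocess.run`: synthtool's `shell.run` forwards the `nox -s blacken` invocation with `check`-style handling, so the exit code 123 that black reports above surfaces as a `CalledProcessError`. A minimal stdlib-only sketch of that propagation — the `run` wrapper and the failing command below are stand-ins for illustration, not synthtool's actual code:

```python
import subprocess
import sys


def run(args):
    # Mirror the pattern in shell.py from the traceback: run the command,
    # decode its output as UTF-8, and let a non-zero exit surface as
    # CalledProcessError (subprocess.run does this when check=True).
    try:
        return subprocess.run(
            args,
            check=True,
            stdout=subprocess.PIPE,
            stderr=subprocess.PIPE,
            encoding="utf-8",
        )
    except subprocess.CalledProcessError as exc:
        # shell.py logs and then re-raises ("raise exc" in the traceback).
        print(exc.stderr, file=sys.stderr)
        raise


# Stand-in for `nox -s blacken`: a command guaranteed to exit with code 123.
try:
    run([sys.executable, "-c", "raise SystemExit(123)"])
except subprocess.CalledProcessError as exc:
    print(exc.returncode)  # 123
```

Because the exception carries the child's return code, the autosynth harness sees the black failure as a single non-zero `nox` exit rather than the individual "cannot format" errors.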
| 2020-02-06T21:04:04Z | [] | [] |
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 99, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 91, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/recommender/synth.py", line 62, in <module>
s.shell.run(["nox", "-s", "blacken"], hide_output=False)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['nox', '-s', 'blacken']' returned non-zero exit status 1.
| 5,676 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-11339 | 3a6894e831350094a2b4bf12bdca63b484b3da15 | diff --git a/packages/google-cloud-redis/docs/conf.py b/packages/google-cloud-redis/docs/conf.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/docs/conf.py
@@ -0,0 +1,384 @@
+# -*- coding: utf-8 -*-
+# Copyright 2021 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+# google-cloud-redis documentation build configuration file
+#
+# This file is execfile()d with the current directory set to its
+# containing dir.
+#
+# Note that not all possible configuration values are present in this
+# autogenerated file.
+#
+# All configuration values have a default; values that are commented out
+# serve to show the default.
+
+import os
+import shlex
+import sys
+
+# If extensions (or modules to document with autodoc) are in another directory,
+# add these directories to sys.path here. If the directory is relative to the
+# documentation root, use os.path.abspath to make it absolute, like shown here.
+sys.path.insert(0, os.path.abspath(".."))
+
+# For plugins that can not read conf.py.
+# See also: https://github.com/docascode/sphinx-docfx-yaml/issues/85
+sys.path.insert(0, os.path.abspath("."))
+
+__version__ = ""
+
+# -- General configuration ------------------------------------------------
+
+# If your documentation needs a minimal Sphinx version, state it here.
+needs_sphinx = "1.5.5"
+
+# Add any Sphinx extension module names here, as strings. They can be
+# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
+# ones.
+extensions = [
+ "sphinx.ext.autodoc",
+ "sphinx.ext.autosummary",
+ "sphinx.ext.intersphinx",
+ "sphinx.ext.coverage",
+ "sphinx.ext.doctest",
+ "sphinx.ext.napoleon",
+ "sphinx.ext.todo",
+ "sphinx.ext.viewcode",
+ "recommonmark",
+]
+
+# autodoc/autosummary flags
+autoclass_content = "both"
+autodoc_default_options = {"members": True}
+autosummary_generate = True
+
+
+# Add any paths that contain templates here, relative to this directory.
+templates_path = ["_templates"]
+
+# The suffix(es) of source filenames.
+# You can specify multiple suffix as a list of string:
+# source_suffix = ['.rst', '.md']
+source_suffix = [".rst", ".md"]
+
+# The encoding of source files.
+# source_encoding = 'utf-8-sig'
+
+# The root toctree document.
+root_doc = "index"
+
+# General information about the project.
+project = "google-cloud-redis"
+copyright = "2019, Google"
+author = "Google APIs"
+
+# The version info for the project you're documenting, acts as replacement for
+# |version| and |release|, also used in various other places throughout the
+# built documents.
+#
+# The full version, including alpha/beta/rc tags.
+release = __version__
+# The short X.Y version.
+version = ".".join(release.split(".")[0:2])
+
+# The language for content autogenerated by Sphinx. Refer to documentation
+# for a list of supported languages.
+#
+# This is also used if you do content translation via gettext catalogs.
+# Usually you set "language" from the command line for these cases.
+language = None
+
+# There are two options for replacing |today|: either, you set today to some
+# non-false value, then it is used:
+# today = ''
+# Else, today_fmt is used as the format for a strftime call.
+# today_fmt = '%B %d, %Y'
+
+# List of patterns, relative to source directory, that match files and
+# directories to ignore when looking for source files.
+exclude_patterns = [
+ "_build",
+ "**/.nox/**/*",
+ "samples/AUTHORING_GUIDE.md",
+ "samples/CONTRIBUTING.md",
+ "samples/snippets/README.rst",
+]
+
+# The reST default role (used for this markup: `text`) to use for all
+# documents.
+# default_role = None
+
+# If true, '()' will be appended to :func: etc. cross-reference text.
+# add_function_parentheses = True
+
+# If true, the current module name will be prepended to all description
+# unit titles (such as .. function::).
+# add_module_names = True
+
+# If true, sectionauthor and moduleauthor directives will be shown in the
+# output. They are ignored by default.
+# show_authors = False
+
+# The name of the Pygments (syntax highlighting) style to use.
+pygments_style = "sphinx"
+
+# A list of ignored prefixes for module index sorting.
+# modindex_common_prefix = []
+
+# If true, keep warnings as "system message" paragraphs in the built documents.
+# keep_warnings = False
+
+# If true, `todo` and `todoList` produce output, else they produce nothing.
+todo_include_todos = True
+
+
+# -- Options for HTML output ----------------------------------------------
+
+# The theme to use for HTML and HTML Help pages. See the documentation for
+# a list of builtin themes.
+html_theme = "alabaster"
+
+# Theme options are theme-specific and customize the look and feel of a theme
+# further. For a list of options available for each theme, see the
+# documentation.
+html_theme_options = {
+ "description": "Google Cloud Client Libraries for google-cloud-redis",
+ "github_user": "googleapis",
+ "github_repo": "python-redis",
+ "github_banner": True,
+ "font_family": "'Roboto', Georgia, sans",
+ "head_font_family": "'Roboto', Georgia, serif",
+ "code_font_family": "'Roboto Mono', 'Consolas', monospace",
+}
+
+# Add any paths that contain custom themes here, relative to this directory.
+# html_theme_path = []
+
+# The name for this set of Sphinx documents. If None, it defaults to
+# "<project> v<release> documentation".
+# html_title = None
+
+# A shorter title for the navigation bar. Default is the same as html_title.
+# html_short_title = None
+
+# The name of an image file (relative to this directory) to place at the top
+# of the sidebar.
+# html_logo = None
+
+# The name of an image file (within the static path) to use as favicon of the
+# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
+# pixels large.
+# html_favicon = None
+
+# Add any paths that contain custom static files (such as style sheets) here,
+# relative to this directory. They are copied after the builtin static files,
+# so a file named "default.css" will overwrite the builtin "default.css".
+html_static_path = ["_static"]
+
+# Add any extra paths that contain custom files (such as robots.txt or
+# .htaccess) here, relative to this directory. These files are copied
+# directly to the root of the documentation.
+# html_extra_path = []
+
+# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
+# using the given strftime format.
+# html_last_updated_fmt = '%b %d, %Y'
+
+# If true, SmartyPants will be used to convert quotes and dashes to
+# typographically correct entities.
+# html_use_smartypants = True
+
+# Custom sidebar templates, maps document names to template names.
+# html_sidebars = {}
+
+# Additional templates that should be rendered to pages, maps page names to
+# template names.
+# html_additional_pages = {}
+
+# If false, no module index is generated.
+# html_domain_indices = True
+
+# If false, no index is generated.
+# html_use_index = True
+
+# If true, the index is split into individual pages for each letter.
+# html_split_index = False
+
+# If true, links to the reST sources are added to the pages.
+# html_show_sourcelink = True
+
+# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
+# html_show_sphinx = True
+
+# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
+# html_show_copyright = True
+
+# If true, an OpenSearch description file will be output, and all pages will
+# contain a <link> tag referring to it. The value of this option must be the
+# base URL from which the finished HTML is served.
+# html_use_opensearch = ''
+
+# This is the file name suffix for HTML files (e.g. ".xhtml").
+# html_file_suffix = None
+
+# Language to be used for generating the HTML full-text search index.
+# Sphinx supports the following languages:
+# 'da', 'de', 'en', 'es', 'fi', 'fr', 'hu', 'it', 'ja'
+# 'nl', 'no', 'pt', 'ro', 'ru', 'sv', 'tr'
+# html_search_language = 'en'
+
+# A dictionary with options for the search language support, empty by default.
+# Now only 'ja' uses this config value
+# html_search_options = {'type': 'default'}
+
+# The name of a javascript file (relative to the configuration directory) that
+# implements a search results scorer. If empty, the default will be used.
+# html_search_scorer = 'scorer.js'
+
+# Output file base name for HTML help builder.
+htmlhelp_basename = "google-cloud-redis-doc"
+
+# -- Options for warnings ------------------------------------------------------
+
+
+suppress_warnings = [
+ # Temporarily suppress this to avoid "more than one target found for
+ # cross-reference" warning, which are intractable for us to avoid while in
+ # a mono-repo.
+ # See https://github.com/sphinx-doc/sphinx/blob
+ # /2a65ffeef5c107c19084fabdd706cdff3f52d93c/sphinx/domains/python.py#L843
+ "ref.python"
+]
+
+# -- Options for LaTeX output ---------------------------------------------
+
+latex_elements = {
+ # The paper size ('letterpaper' or 'a4paper').
+ #'papersize': 'letterpaper',
+ # The font size ('10pt', '11pt' or '12pt').
+ #'pointsize': '10pt',
+ # Additional stuff for the LaTeX preamble.
+ #'preamble': '',
+ # Latex figure (float) alignment
+ #'figure_align': 'htbp',
+}
+
+# Grouping the document tree into LaTeX files. List of tuples
+# (source start file, target name, title,
+# author, documentclass [howto, manual, or own class]).
+latex_documents = [
+ (
+ root_doc,
+ "google-cloud-redis.tex",
+ "google-cloud-redis Documentation",
+ author,
+ "manual",
+ )
+]
+
+# The name of an image file (relative to this directory) to place at the top of
+# the title page.
+# latex_logo = None
+
+# For "manual" documents, if this is true, then toplevel headings are parts,
+# not chapters.
+# latex_use_parts = False
+
+# If true, show page references after internal links.
+# latex_show_pagerefs = False
+
+# If true, show URL addresses after external links.
+# latex_show_urls = False
+
+# Documents to append as an appendix to all manuals.
+# latex_appendices = []
+
+# If false, no module index is generated.
+# latex_domain_indices = True
+
+
+# -- Options for manual page output ---------------------------------------
+
+# One entry per manual page. List of tuples
+# (source start file, name, description, authors, manual section).
+man_pages = [
+ (
+ root_doc,
+ "google-cloud-redis",
+ "google-cloud-redis Documentation",
+ [author],
+ 1,
+ )
+]
+
+# If true, show URL addresses after external links.
+# man_show_urls = False
+
+
+# -- Options for Texinfo output -------------------------------------------
+
+# Grouping the document tree into Texinfo files. List of tuples
+# (source start file, target name, title, author,
+# dir menu entry, description, category)
+texinfo_documents = [
+ (
+ root_doc,
+ "google-cloud-redis",
+ "google-cloud-redis Documentation",
+ author,
+ "google-cloud-redis",
+ "google-cloud-redis Library",
+ "APIs",
+ )
+]
+
+# Documents to append as an appendix to all manuals.
+# texinfo_appendices = []
+
+# If false, no module index is generated.
+# texinfo_domain_indices = True
+
+# How to display URL addresses: 'footnote', 'no', or 'inline'.
+# texinfo_show_urls = 'footnote'
+
+# If true, do not generate a @detailmenu in the "Top" node's menu.
+# texinfo_no_detailmenu = False
+
+
+# Example configuration for intersphinx: refer to the Python standard library.
+intersphinx_mapping = {
+ "python": ("https://python.readthedocs.org/en/latest/", None),
+ "google-auth": ("https://googleapis.dev/python/google-auth/latest/", None),
+ "google.api_core": (
+ "https://googleapis.dev/python/google-api-core/latest/",
+ None,
+ ),
+ "grpc": ("https://grpc.github.io/grpc/python/", None),
+ "proto-plus": ("https://proto-plus-python.readthedocs.io/en/latest/", None),
+ "protobuf": ("https://googleapis.dev/python/protobuf/latest/", None),
+}
+
+
+# Napoleon settings
+napoleon_google_docstring = True
+napoleon_numpy_docstring = True
+napoleon_include_private_with_doc = False
+napoleon_include_special_with_doc = True
+napoleon_use_admonition_for_examples = False
+napoleon_use_admonition_for_notes = False
+napoleon_use_admonition_for_references = False
+napoleon_use_ivar = False
+napoleon_use_param = True
+napoleon_use_rtype = True
diff --git a/packages/google-cloud-redis/google/cloud/redis/__init__.py b/packages/google-cloud-redis/google/cloud/redis/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis/__init__.py
@@ -0,0 +1,85 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from google.cloud.redis import gapic_version as package_version
+
+__version__ = package_version.__version__
+
+
+from google.cloud.redis_v1.services.cloud_redis.async_client import (
+ CloudRedisAsyncClient,
+)
+from google.cloud.redis_v1.services.cloud_redis.client import CloudRedisClient
+from google.cloud.redis_v1.types.cloud_redis import (
+ CreateInstanceRequest,
+ DeleteInstanceRequest,
+ ExportInstanceRequest,
+ FailoverInstanceRequest,
+ GcsDestination,
+ GcsSource,
+ GetInstanceAuthStringRequest,
+ GetInstanceRequest,
+ ImportInstanceRequest,
+ InputConfig,
+ Instance,
+ InstanceAuthString,
+ ListInstancesRequest,
+ ListInstancesResponse,
+ LocationMetadata,
+ MaintenancePolicy,
+ MaintenanceSchedule,
+ NodeInfo,
+ OperationMetadata,
+ OutputConfig,
+ PersistenceConfig,
+ RescheduleMaintenanceRequest,
+ TlsCertificate,
+ UpdateInstanceRequest,
+ UpgradeInstanceRequest,
+ WeeklyMaintenanceWindow,
+ ZoneMetadata,
+)
+
+__all__ = (
+ "CloudRedisClient",
+ "CloudRedisAsyncClient",
+ "CreateInstanceRequest",
+ "DeleteInstanceRequest",
+ "ExportInstanceRequest",
+ "FailoverInstanceRequest",
+ "GcsDestination",
+ "GcsSource",
+ "GetInstanceAuthStringRequest",
+ "GetInstanceRequest",
+ "ImportInstanceRequest",
+ "InputConfig",
+ "Instance",
+ "InstanceAuthString",
+ "ListInstancesRequest",
+ "ListInstancesResponse",
+ "LocationMetadata",
+ "MaintenancePolicy",
+ "MaintenanceSchedule",
+ "NodeInfo",
+ "OperationMetadata",
+ "OutputConfig",
+ "PersistenceConfig",
+ "RescheduleMaintenanceRequest",
+ "TlsCertificate",
+ "UpdateInstanceRequest",
+ "UpgradeInstanceRequest",
+ "WeeklyMaintenanceWindow",
+ "ZoneMetadata",
+)
diff --git a/packages/google-cloud-redis/google/cloud/redis/gapic_version.py b/packages/google-cloud-redis/google/cloud/redis/gapic_version.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis/gapic_version.py
@@ -0,0 +1,16 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+__version__ = "2.13.0" # {x-release-please-version}
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1/__init__.py b/packages/google-cloud-redis/google/cloud/redis_v1/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1/__init__.py
@@ -0,0 +1,82 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from google.cloud.redis_v1 import gapic_version as package_version
+
+__version__ = package_version.__version__
+
+
+from .services.cloud_redis import CloudRedisAsyncClient, CloudRedisClient
+from .types.cloud_redis import (
+ CreateInstanceRequest,
+ DeleteInstanceRequest,
+ ExportInstanceRequest,
+ FailoverInstanceRequest,
+ GcsDestination,
+ GcsSource,
+ GetInstanceAuthStringRequest,
+ GetInstanceRequest,
+ ImportInstanceRequest,
+ InputConfig,
+ Instance,
+ InstanceAuthString,
+ ListInstancesRequest,
+ ListInstancesResponse,
+ LocationMetadata,
+ MaintenancePolicy,
+ MaintenanceSchedule,
+ NodeInfo,
+ OperationMetadata,
+ OutputConfig,
+ PersistenceConfig,
+ RescheduleMaintenanceRequest,
+ TlsCertificate,
+ UpdateInstanceRequest,
+ UpgradeInstanceRequest,
+ WeeklyMaintenanceWindow,
+ ZoneMetadata,
+)
+
+__all__ = (
+ "CloudRedisAsyncClient",
+ "CloudRedisClient",
+ "CreateInstanceRequest",
+ "DeleteInstanceRequest",
+ "ExportInstanceRequest",
+ "FailoverInstanceRequest",
+ "GcsDestination",
+ "GcsSource",
+ "GetInstanceAuthStringRequest",
+ "GetInstanceRequest",
+ "ImportInstanceRequest",
+ "InputConfig",
+ "Instance",
+ "InstanceAuthString",
+ "ListInstancesRequest",
+ "ListInstancesResponse",
+ "LocationMetadata",
+ "MaintenancePolicy",
+ "MaintenanceSchedule",
+ "NodeInfo",
+ "OperationMetadata",
+ "OutputConfig",
+ "PersistenceConfig",
+ "RescheduleMaintenanceRequest",
+ "TlsCertificate",
+ "UpdateInstanceRequest",
+ "UpgradeInstanceRequest",
+ "WeeklyMaintenanceWindow",
+ "ZoneMetadata",
+)
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1/gapic_version.py b/packages/google-cloud-redis/google/cloud/redis_v1/gapic_version.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1/gapic_version.py
@@ -0,0 +1,16 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+__version__ = "2.13.0" # {x-release-please-version}
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1/services/__init__.py b/packages/google-cloud-redis/google/cloud/redis_v1/services/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1/services/__init__.py
@@ -0,0 +1,15 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/__init__.py b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/__init__.py
@@ -0,0 +1,22 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .async_client import CloudRedisAsyncClient
+from .client import CloudRedisClient
+
+__all__ = (
+ "CloudRedisClient",
+ "CloudRedisAsyncClient",
+)
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/async_client.py b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/async_client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/async_client.py
@@ -0,0 +1,2027 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import functools
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+)
+
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.api_core.client_options import ClientOptions
+from google.auth import credentials as ga_credentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.redis_v1 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.api_core import operation # type: ignore
+from google.api_core import operation_async # type: ignore
+from google.cloud.location import locations_pb2 # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import empty_pb2 # type: ignore
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.redis_v1.services.cloud_redis import pagers
+from google.cloud.redis_v1.types import cloud_redis
+
+from .client import CloudRedisClient
+from .transports.base import DEFAULT_CLIENT_INFO, CloudRedisTransport
+from .transports.grpc_asyncio import CloudRedisGrpcAsyncIOTransport
+
+
+class CloudRedisAsyncClient:
+ """Configures and manages Cloud Memorystore for Redis instances
+
+ Google Cloud Memorystore for Redis v1
+
+ The ``redis.googleapis.com`` service implements the Google Cloud
+ Memorystore for Redis API and defines the following resource model
+ for managing Redis instances:
+
+ - The service works with a collection of cloud projects, named:
+ ``/projects/*``
+ - Each project has a collection of available locations, named:
+ ``/locations/*``
+ - Each location has a collection of Redis instances, named:
+ ``/instances/*``
+ - As such, Redis instances are resources of the form:
+ ``/projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+
+ Note that location_id must be referring to a GCP ``region``; for
+ example:
+
+ - ``projects/redpepper-1290/locations/us-central1/instances/my-redis``
+ """
+
+ _client: CloudRedisClient
+
+ DEFAULT_ENDPOINT = CloudRedisClient.DEFAULT_ENDPOINT
+ DEFAULT_MTLS_ENDPOINT = CloudRedisClient.DEFAULT_MTLS_ENDPOINT
+
+ instance_path = staticmethod(CloudRedisClient.instance_path)
+ parse_instance_path = staticmethod(CloudRedisClient.parse_instance_path)
+ common_billing_account_path = staticmethod(
+ CloudRedisClient.common_billing_account_path
+ )
+ parse_common_billing_account_path = staticmethod(
+ CloudRedisClient.parse_common_billing_account_path
+ )
+ common_folder_path = staticmethod(CloudRedisClient.common_folder_path)
+ parse_common_folder_path = staticmethod(CloudRedisClient.parse_common_folder_path)
+ common_organization_path = staticmethod(CloudRedisClient.common_organization_path)
+ parse_common_organization_path = staticmethod(
+ CloudRedisClient.parse_common_organization_path
+ )
+ common_project_path = staticmethod(CloudRedisClient.common_project_path)
+ parse_common_project_path = staticmethod(CloudRedisClient.parse_common_project_path)
+ common_location_path = staticmethod(CloudRedisClient.common_location_path)
+ parse_common_location_path = staticmethod(
+ CloudRedisClient.parse_common_location_path
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ CloudRedisAsyncClient: The constructed client.
+ """
+ return CloudRedisClient.from_service_account_info.__func__(CloudRedisAsyncClient, info, *args, **kwargs) # type: ignore
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ CloudRedisAsyncClient: The constructed client.
+ """
+ return CloudRedisClient.from_service_account_file.__func__(CloudRedisAsyncClient, filename, *args, **kwargs) # type: ignore
+
+ from_service_account_json = from_service_account_file
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ return CloudRedisClient.get_mtls_endpoint_and_cert_source(client_options) # type: ignore
+
+ @property
+ def transport(self) -> CloudRedisTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ CloudRedisTransport: The transport used by the client instance.
+ """
+ return self._client.transport
+
+ get_transport_class = functools.partial(
+ type(CloudRedisClient).get_transport_class, type(CloudRedisClient)
+ )
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Union[str, CloudRedisTransport] = "grpc_asyncio",
+ client_options: Optional[ClientOptions] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the cloud redis client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, ~.CloudRedisTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (ClientOptions): Custom options for the client. It
+ won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+
+ Raises:
+            google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ self._client = CloudRedisClient(
+ credentials=credentials,
+ transport=transport,
+ client_options=client_options,
+ client_info=client_info,
+ )
+
+ async def list_instances(
+ self,
+ request: Optional[Union[cloud_redis.ListInstancesRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListInstancesAsyncPager:
+ r"""Lists all Redis instances owned by a project in either the
+ specified location (region) or all locations.
+
+ The location should have the following format:
+
+ - ``projects/{project_id}/locations/{location_id}``
+
+ If ``location_id`` is specified as ``-`` (wildcard), then all
+ regions available to the project are queried, and the results
+ are aggregated.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ async def sample_list_instances():
+ # Create a client
+ client = redis_v1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ request = redis_v1.ListInstancesRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_instances(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1.types.ListInstancesRequest, dict]]):
+ The request object. Request for
+ [ListInstances][google.cloud.redis.v1.CloudRedis.ListInstances].
+ parent (:class:`str`):
+ Required. The resource name of the instance location
+ using the form:
+ ``projects/{project_id}/locations/{location_id}`` where
+ ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.redis_v1.services.cloud_redis.pagers.ListInstancesAsyncPager:
+ Response for
+ [ListInstances][google.cloud.redis.v1.CloudRedis.ListInstances].
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.ListInstancesRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_instances,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListInstancesAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_instance(
+ self,
+ request: Optional[Union[cloud_redis.GetInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> cloud_redis.Instance:
+ r"""Gets the details of a specific Redis instance.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ async def sample_get_instance():
+ # Create a client
+ client = redis_v1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ request = redis_v1.GetInstanceRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_instance(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1.types.GetInstanceRequest, dict]]):
+ The request object. Request for
+ [GetInstance][google.cloud.redis.v1.CloudRedis.GetInstance].
+ name (:class:`str`):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.redis_v1.types.Instance:
+ A Memorystore for Redis instance.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.GetInstanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_instance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_instance_auth_string(
+ self,
+ request: Optional[Union[cloud_redis.GetInstanceAuthStringRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> cloud_redis.InstanceAuthString:
+        r"""Gets the AUTH string for a Redis instance. If AUTH is
+        not enabled for the instance, the response will be empty.
+        This information is not included in the details returned
+        to GetInstance.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ async def sample_get_instance_auth_string():
+ # Create a client
+ client = redis_v1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ request = redis_v1.GetInstanceAuthStringRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_instance_auth_string(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1.types.GetInstanceAuthStringRequest, dict]]):
+ The request object. Request for
+ [GetInstanceAuthString][google.cloud.redis.v1.CloudRedis.GetInstanceAuthString].
+ name (:class:`str`):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.redis_v1.types.InstanceAuthString:
+ Instance AUTH string details.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.GetInstanceAuthStringRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_instance_auth_string,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def create_instance(
+ self,
+ request: Optional[Union[cloud_redis.CreateInstanceRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ instance_id: Optional[str] = None,
+ instance: Optional[cloud_redis.Instance] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Creates a Redis instance based on the specified tier and memory
+ size.
+
+ By default, the instance is accessible from the project's
+ `default network <https://cloud.google.com/vpc/docs/vpc>`__.
+
+ The creation is executed asynchronously and callers may check
+ the returned operation to track its progress. Once the operation
+ is completed the Redis instance will be fully functional.
+ Completed longrunning.Operation will contain the new instance
+ object in the response field.
+
+ The returned operation is automatically deleted after a few
+ hours, so there is no need to call DeleteOperation.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ async def sample_create_instance():
+ # Create a client
+ client = redis_v1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ instance = redis_v1.Instance()
+ instance.name = "name_value"
+ instance.tier = "STANDARD_HA"
+ instance.memory_size_gb = 1499
+
+ request = redis_v1.CreateInstanceRequest(
+ parent="parent_value",
+ instance_id="instance_id_value",
+ instance=instance,
+ )
+
+ # Make the request
+ operation = client.create_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1.types.CreateInstanceRequest, dict]]):
+ The request object. Request for
+ [CreateInstance][google.cloud.redis.v1.CloudRedis.CreateInstance].
+ parent (:class:`str`):
+ Required. The resource name of the instance location
+ using the form:
+ ``projects/{project_id}/locations/{location_id}`` where
+ ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ instance_id (:class:`str`):
+ Required. The logical name of the Redis instance in the
+ customer project with the following restrictions:
+
+ - Must contain only lowercase letters, numbers, and
+ hyphens.
+ - Must start with a letter.
+ - Must be between 1-40 characters.
+ - Must end with a number or a letter.
+ - Must be unique within the customer project / location
+
+ This corresponds to the ``instance_id`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ instance (:class:`google.cloud.redis_v1.types.Instance`):
+                Required. A Redis [Instance] resource.
+
+ This corresponds to the ``instance`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent, instance_id, instance])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.CreateInstanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+ if instance_id is not None:
+ request.instance_id = instance_id
+ if instance is not None:
+ request.instance = instance
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.create_instance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=cloud_redis.OperationMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def update_instance(
+ self,
+ request: Optional[Union[cloud_redis.UpdateInstanceRequest, dict]] = None,
+ *,
+ update_mask: Optional[field_mask_pb2.FieldMask] = None,
+ instance: Optional[cloud_redis.Instance] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Updates the metadata and configuration of a specific
+ Redis instance.
+ Completed longrunning.Operation will contain the new
+ instance object in the response field. The returned
+ operation is automatically deleted after a few hours, so
+ there is no need to call DeleteOperation.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ async def sample_update_instance():
+ # Create a client
+ client = redis_v1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ instance = redis_v1.Instance()
+ instance.name = "name_value"
+ instance.tier = "STANDARD_HA"
+ instance.memory_size_gb = 1499
+
+ request = redis_v1.UpdateInstanceRequest(
+ instance=instance,
+ )
+
+ # Make the request
+ operation = client.update_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1.types.UpdateInstanceRequest, dict]]):
+ The request object. Request for
+ [UpdateInstance][google.cloud.redis.v1.CloudRedis.UpdateInstance].
+ update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`):
+ Required. Mask of fields to update. At least one path
+ must be supplied in this field. The elements of the
+ repeated paths field may only include these fields from
+ [Instance][google.cloud.redis.v1.Instance]:
+
+ - ``displayName``
+ - ``labels``
+ - ``memorySizeGb``
+ - ``redisConfig``
+ - ``replica_count``
+
+ This corresponds to the ``update_mask`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ instance (:class:`google.cloud.redis_v1.types.Instance`):
+ Required. Update description. Only fields specified in
+ update_mask are updated.
+
+ This corresponds to the ``instance`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([update_mask, instance])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.UpdateInstanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if update_mask is not None:
+ request.update_mask = update_mask
+ if instance is not None:
+ request.instance = instance
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.update_instance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("instance.name", request.instance.name),)
+ ),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=cloud_redis.OperationMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def upgrade_instance(
+ self,
+ request: Optional[Union[cloud_redis.UpgradeInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ redis_version: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Upgrades Redis instance to the newer Redis version
+ specified in the request.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ async def sample_upgrade_instance():
+ # Create a client
+ client = redis_v1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ request = redis_v1.UpgradeInstanceRequest(
+ name="name_value",
+ redis_version="redis_version_value",
+ )
+
+ # Make the request
+ operation = client.upgrade_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1.types.UpgradeInstanceRequest, dict]]):
+ The request object. Request for
+ [UpgradeInstance][google.cloud.redis.v1.CloudRedis.UpgradeInstance].
+ name (:class:`str`):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ redis_version (:class:`str`):
+ Required. Specifies the target
+ version of Redis software to upgrade to.
+
+ This corresponds to the ``redis_version`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, redis_version])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.UpgradeInstanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if redis_version is not None:
+ request.redis_version = redis_version
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.upgrade_instance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=cloud_redis.OperationMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def import_instance(
+ self,
+ request: Optional[Union[cloud_redis.ImportInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ input_config: Optional[cloud_redis.InputConfig] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Import a Redis RDB snapshot file from Cloud Storage
+ into a Redis instance.
+        Redis may stop serving during this operation. Instance
+        state will be IMPORTING for the entire operation. When
+        complete, the instance will contain only data from the
+        imported file.
+
+ The returned operation is automatically deleted after a
+ few hours, so there is no need to call DeleteOperation.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ async def sample_import_instance():
+ # Create a client
+ client = redis_v1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ input_config = redis_v1.InputConfig()
+ input_config.gcs_source.uri = "uri_value"
+
+ request = redis_v1.ImportInstanceRequest(
+ name="name_value",
+ input_config=input_config,
+ )
+
+ # Make the request
+ operation = client.import_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1.types.ImportInstanceRequest, dict]]):
+ The request object. Request for
+ [Import][google.cloud.redis.v1.CloudRedis.ImportInstance].
+ name (:class:`str`):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ input_config (:class:`google.cloud.redis_v1.types.InputConfig`):
+ Required. Specify data to be
+ imported.
+
+ This corresponds to the ``input_config`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, input_config])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.ImportInstanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if input_config is not None:
+ request.input_config = input_config
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.import_instance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=cloud_redis.OperationMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
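The `request`-vs-flattened-arguments guard that opens each method body above can be reproduced in isolation. This stdlib-only sketch (the function name is illustrative, not part of the generated client) shows the mutual-exclusion rule the generated code enforces:

```python
from typing import Any, Optional


def check_flattened(request: Optional[dict], **flattened: Any) -> None:
    """Reject calls that mix a full request object with flattened fields.

    Mirrors the guard at the top of every generated method: callers pass
    EITHER a request object OR individual keyword fields, never both.
    """
    has_flattened_params = any(flattened.values())  # truthy fields count as set
    if request is not None and has_flattened_params:
        raise ValueError(
            "If the `request` argument is set, then none of "
            "the individual field arguments should be set."
        )


check_flattened({"name": "n"})             # request alone: fine
check_flattened(None, name="n")            # flattened alone: fine
try:
    check_flattened({"name": "n"}, name="n")  # both at once: rejected
except ValueError as exc:
    print("rejected:", exc)
```

Note that, like the generated `any([name, input_config])`, this treats falsy values as "not set".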
+ async def export_instance(
+ self,
+ request: Optional[Union[cloud_redis.ExportInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ output_config: Optional[cloud_redis.OutputConfig] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Export Redis instance data into a Redis RDB format
+ file in Cloud Storage.
+ Redis will continue serving during this operation.
+ The returned operation is automatically deleted after a
+ few hours, so there is no need to call DeleteOperation.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ async def sample_export_instance():
+ # Create a client
+ client = redis_v1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ output_config = redis_v1.OutputConfig()
+ output_config.gcs_destination.uri = "uri_value"
+
+ request = redis_v1.ExportInstanceRequest(
+ name="name_value",
+ output_config=output_config,
+ )
+
+ # Make the request
+ operation = client.export_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1.types.ExportInstanceRequest, dict]]):
+ The request object. Request for
+ [Export][google.cloud.redis.v1.CloudRedis.ExportInstance].
+ name (:class:`str`):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ output_config (:class:`google.cloud.redis_v1.types.OutputConfig`):
+ Required. Specify data to be
+ exported.
+
+ This corresponds to the ``output_config`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, output_config])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.ExportInstanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if output_config is not None:
+ request.output_config = output_config
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.export_instance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=cloud_redis.OperationMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
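Each RPC above folds the request's routing fields into the call metadata via `gapic_v1.routing_header.to_grpc_metadata`, which produces an `x-goog-request-params` header. A rough stdlib approximation of that encoding (the authoritative rules live in `google.api_core`; this is a simplified sketch):

```python
from typing import Sequence, Tuple
from urllib.parse import urlencode


def to_grpc_metadata_sketch(params: Sequence[Tuple[str, str]]) -> Tuple[str, str]:
    """Build a (header-name, header-value) pair for request routing.

    The value is a URL-encoded key=value list, so slashes in a resource
    name become %2F.
    """
    return ("x-goog-request-params", urlencode(params))


header = to_grpc_metadata_sketch(
    (("name", "projects/p/locations/l/instances/i"),)
)
print(header)
# → ('x-goog-request-params', 'name=projects%2Fp%2Flocations%2Fl%2Finstances%2Fi')
```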
+ async def failover_instance(
+ self,
+ request: Optional[Union[cloud_redis.FailoverInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ data_protection_mode: Optional[
+ cloud_redis.FailoverInstanceRequest.DataProtectionMode
+ ] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Initiates a failover of the primary node to current
+ replica node for a specific STANDARD tier Cloud
+ Memorystore for Redis instance.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ async def sample_failover_instance():
+ # Create a client
+ client = redis_v1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ request = redis_v1.FailoverInstanceRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.failover_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1.types.FailoverInstanceRequest, dict]]):
+ The request object. Request for
+ [Failover][google.cloud.redis.v1.CloudRedis.FailoverInstance].
+ name (:class:`str`):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ data_protection_mode (:class:`google.cloud.redis_v1.types.FailoverInstanceRequest.DataProtectionMode`):
+ Optional. Available data protection modes that the user
+ can choose. If it's unspecified, data protection mode
+ will be LIMITED_DATA_LOSS by default.
+
+ This corresponds to the ``data_protection_mode`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, data_protection_mode])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.FailoverInstanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if data_protection_mode is not None:
+ request.data_protection_mode = data_protection_mode
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.failover_instance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=cloud_redis.OperationMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def delete_instance(
+ self,
+ request: Optional[Union[cloud_redis.DeleteInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Deletes a specific Redis instance. Instance stops
+ serving and data is deleted.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ async def sample_delete_instance():
+ # Create a client
+ client = redis_v1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ request = redis_v1.DeleteInstanceRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.delete_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1.types.DeleteInstanceRequest, dict]]):
+ The request object. Request for
+ [DeleteInstance][google.cloud.redis.v1.CloudRedis.DeleteInstance].
+ name (:class:`str`):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.protobuf.empty_pb2.Empty` A generic empty message that you can re-use to avoid defining duplicated
+ empty messages in your APIs. A typical example is to
+ use it as the request or the response type of an API
+ method. For instance:
+
+ service Foo {
+ rpc Bar(google.protobuf.Empty) returns
+ (google.protobuf.Empty);
+
+ }
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.DeleteInstanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.delete_instance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ empty_pb2.Empty,
+ metadata_type=cloud_redis.OperationMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def reschedule_maintenance(
+ self,
+ request: Optional[Union[cloud_redis.RescheduleMaintenanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ reschedule_type: Optional[
+ cloud_redis.RescheduleMaintenanceRequest.RescheduleType
+ ] = None,
+ schedule_time: Optional[timestamp_pb2.Timestamp] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Reschedule maintenance for a given instance in a
+ given project and location.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ async def sample_reschedule_maintenance():
+ # Create a client
+ client = redis_v1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ request = redis_v1.RescheduleMaintenanceRequest(
+ name="name_value",
+ reschedule_type="SPECIFIC_TIME",
+ )
+
+ # Make the request
+ operation = client.reschedule_maintenance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1.types.RescheduleMaintenanceRequest, dict]]):
+ The request object. Request for
+ [RescheduleMaintenance][google.cloud.redis.v1.CloudRedis.RescheduleMaintenance].
+ name (:class:`str`):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ reschedule_type (:class:`google.cloud.redis_v1.types.RescheduleMaintenanceRequest.RescheduleType`):
+                Required. If reschedule type is SPECIFIC_TIME,
+                schedule_time must also be set.
+
+ This corresponds to the ``reschedule_type`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ schedule_time (:class:`google.protobuf.timestamp_pb2.Timestamp`):
+ Optional. Timestamp when the maintenance shall be
+ rescheduled to if reschedule_type=SPECIFIC_TIME, in RFC
+ 3339 format, for example ``2012-11-15T16:19:00.094Z``.
+
+ This corresponds to the ``schedule_time`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, reschedule_type, schedule_time])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.RescheduleMaintenanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if reschedule_type is not None:
+ request.reschedule_type = reschedule_type
+ if schedule_time is not None:
+ request.schedule_time = schedule_time
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.reschedule_maintenance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=cloud_redis.OperationMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
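The docstring samples above all follow the same long-running-operation shape: the client call yields an operation, awaiting it resolves to a future-like object, and `.result()` produces the final `Instance`. A toy stdlib model of that flow (all names here are hypothetical stand-ins, not the real client API):

```python
import asyncio


class FakeOperation:
    """Stands in for the object an awaited LRO call resolves to."""

    def __init__(self, value):
        self._value = value

    def result(self):
        # In the real client this surfaces the LRO outcome (or raises).
        return self._value


async def fake_reschedule_maintenance(request: dict) -> FakeOperation:
    """Toy coroutine mirroring `operation = client.rpc(request=...)`."""
    await asyncio.sleep(0)  # pretend to talk to the service
    return FakeOperation({"name": request["name"], "state": "READY"})


async def main():
    operation = fake_reschedule_maintenance({"name": "projects/p/instances/i"})
    # Same shape as the generated samples: await, then take .result().
    response = (await operation).result()
    return response


print(asyncio.run(main()))
```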
+ async def list_operations(
+ self,
+ request: Optional[operations_pb2.ListOperationsRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.ListOperationsResponse:
+ r"""Lists operations that match the specified filter in the request.
+
+ Args:
+ request (:class:`~.operations_pb2.ListOperationsRequest`):
+ The request object. Request message for
+ `ListOperations` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.ListOperationsResponse:
+ Response message for ``ListOperations`` method.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.ListOperationsRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._client._transport.list_operations,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_operation(
+ self,
+ request: Optional[operations_pb2.GetOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Gets the latest state of a long-running operation.
+
+ Args:
+ request (:class:`~.operations_pb2.GetOperationRequest`):
+ The request object. Request message for
+ `GetOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.Operation:
+ An ``Operation`` object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.GetOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._client._transport.get_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def delete_operation(
+ self,
+ request: Optional[operations_pb2.DeleteOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> None:
+ r"""Deletes a long-running operation.
+
+ This method indicates that the client is no longer interested
+ in the operation result. It does not cancel the operation.
+ If the server doesn't support this method, it returns
+ `google.rpc.Code.UNIMPLEMENTED`.
+
+ Args:
+ request (:class:`~.operations_pb2.DeleteOperationRequest`):
+ The request object. Request message for
+ `DeleteOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ None
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.DeleteOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._client._transport.delete_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ async def cancel_operation(
+ self,
+ request: Optional[operations_pb2.CancelOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> None:
+ r"""Starts asynchronous cancellation on a long-running operation.
+
+ The server makes a best effort to cancel the operation, but success
+ is not guaranteed. If the server doesn't support this method, it returns
+ `google.rpc.Code.UNIMPLEMENTED`.
+
+ Args:
+ request (:class:`~.operations_pb2.CancelOperationRequest`):
+ The request object. Request message for
+ `CancelOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ None
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.CancelOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._client._transport.cancel_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ async def get_location(
+ self,
+ request: Optional[locations_pb2.GetLocationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> locations_pb2.Location:
+ r"""Gets information about a location.
+
+ Args:
+ request (:class:`~.location_pb2.GetLocationRequest`):
+ The request object. Request message for
+ `GetLocation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.location_pb2.Location:
+ Location object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = locations_pb2.GetLocationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._client._transport.get_location,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_locations(
+ self,
+ request: Optional[locations_pb2.ListLocationsRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> locations_pb2.ListLocationsResponse:
+ r"""Lists information about the supported locations for this service.
+
+ Args:
+ request (:class:`~.location_pb2.ListLocationsRequest`):
+ The request object. Request message for
+ `ListLocations` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.location_pb2.ListLocationsResponse:
+ Response message for ``ListLocations`` method.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = locations_pb2.ListLocationsRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._client._transport.list_locations,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def __aenter__(self):
+ return self
+
+ async def __aexit__(self, exc_type, exc, tb):
+ await self.transport.close()
+
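The `__aenter__`/`__aexit__` pair above makes the client usable with `async with`, guaranteeing `transport.close()` runs on exit. The contract can be modeled without the library (class names here are made up for illustration):

```python
import asyncio


class FakeTransport:
    def __init__(self):
        self.closed = False

    async def close(self):
        self.closed = True


class FakeAsyncClient:
    """Minimal stand-in for the async client's context-manager protocol."""

    def __init__(self):
        self.transport = FakeTransport()

    async def __aenter__(self):
        return self

    async def __aexit__(self, exc_type, exc, tb):
        # Runs on both normal exit and exception, closing the channel.
        await self.transport.close()


async def main():
    client = FakeAsyncClient()
    async with client:
        assert not client.transport.closed  # still open inside the block
    return client.transport.closed


print(asyncio.run(main()))  # → True
```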
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("CloudRedisAsyncClient",)
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/client.py b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/client.py
@@ -0,0 +1,2267 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import os
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+ cast,
+)
+
+from google.api_core import client_options as client_options_lib
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.exceptions import MutualTLSChannelError # type: ignore
+from google.auth.transport import mtls # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.redis_v1 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.api_core import operation # type: ignore
+from google.api_core import operation_async # type: ignore
+from google.cloud.location import locations_pb2 # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import empty_pb2 # type: ignore
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.redis_v1.services.cloud_redis import pagers
+from google.cloud.redis_v1.types import cloud_redis
+
+from .transports.base import DEFAULT_CLIENT_INFO, CloudRedisTransport
+from .transports.grpc import CloudRedisGrpcTransport
+from .transports.grpc_asyncio import CloudRedisGrpcAsyncIOTransport
+from .transports.rest import CloudRedisRestTransport
+
+
+class CloudRedisClientMeta(type):
+ """Metaclass for the CloudRedis client.
+
+ This provides class-level methods for building and retrieving
+ support objects (e.g. transport) without polluting the client instance
+ objects.
+ """
+
+ _transport_registry = OrderedDict() # type: Dict[str, Type[CloudRedisTransport]]
+ _transport_registry["grpc"] = CloudRedisGrpcTransport
+ _transport_registry["grpc_asyncio"] = CloudRedisGrpcAsyncIOTransport
+ _transport_registry["rest"] = CloudRedisRestTransport
+
+ def get_transport_class(
+ cls,
+ label: Optional[str] = None,
+ ) -> Type[CloudRedisTransport]:
+ """Returns an appropriate transport class.
+
+ Args:
+ label: The name of the desired transport. If none is
+ provided, then the first transport in the registry is used.
+
+ Returns:
+ The transport class to use.
+ """
+ # If a specific transport is requested, return that one.
+ if label:
+ return cls._transport_registry[label]
+
+ # No transport is requested; return the default (that is, the first one
+ # in the dictionary).
+ return next(iter(cls._transport_registry.values()))
+
+
+class CloudRedisClient(metaclass=CloudRedisClientMeta):
+ """Configures and manages Cloud Memorystore for Redis instances
+
+ Google Cloud Memorystore for Redis v1
+
+ The ``redis.googleapis.com`` service implements the Google Cloud
+ Memorystore for Redis API and defines the following resource model
+ for managing Redis instances:
+
+ - The service works with a collection of cloud projects, named:
+ ``/projects/*``
+ - Each project has a collection of available locations, named:
+ ``/locations/*``
+ - Each location has a collection of Redis instances, named:
+ ``/instances/*``
+ - As such, Redis instances are resources of the form:
+ ``/projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+
+ Note that location_id must be referring to a GCP ``region``; for
+ example:
+
+ - ``projects/redpepper-1290/locations/us-central1/instances/my-redis``
+ """
+
+ @staticmethod
+ def _get_default_mtls_endpoint(api_endpoint):
+ """Converts api endpoint to mTLS endpoint.
+
+ Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to
+ "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively.
+ Args:
+ api_endpoint (Optional[str]): the api endpoint to convert.
+ Returns:
+ str: converted mTLS api endpoint.
+ """
+ if not api_endpoint:
+ return api_endpoint
+
+ mtls_endpoint_re = re.compile(
+ r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?"
+ )
+
+ m = mtls_endpoint_re.match(api_endpoint)
+ name, mtls, sandbox, googledomain = m.groups()
+ if mtls or not googledomain:
+ return api_endpoint
+
+ if sandbox:
+ return api_endpoint.replace(
+ "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
+ )
+
+ return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")
+
+ DEFAULT_ENDPOINT = "redis.googleapis.com"
+ DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__( # type: ignore
+ DEFAULT_ENDPOINT
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ CloudRedisClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_info(info)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ CloudRedisClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_file(filename)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ from_service_account_json = from_service_account_file
+
+ @property
+ def transport(self) -> CloudRedisTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ CloudRedisTransport: The transport used by the client
+ instance.
+ """
+ return self._transport
+
+ @staticmethod
+ def instance_path(
+ project: str,
+ location: str,
+ instance: str,
+ ) -> str:
+ """Returns a fully-qualified instance string."""
+ return "projects/{project}/locations/{location}/instances/{instance}".format(
+ project=project,
+ location=location,
+ instance=instance,
+ )
+
+ @staticmethod
+ def parse_instance_path(path: str) -> Dict[str, str]:
+        """Parses an instance path into its component segments."""
+ m = re.match(
+ r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)/instances/(?P<instance>.+?)$",
+ path,
+ )
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_billing_account_path(
+ billing_account: str,
+ ) -> str:
+ """Returns a fully-qualified billing_account string."""
+ return "billingAccounts/{billing_account}".format(
+ billing_account=billing_account,
+ )
+
+ @staticmethod
+ def parse_common_billing_account_path(path: str) -> Dict[str, str]:
+ """Parse a billing_account path into its component segments."""
+ m = re.match(r"^billingAccounts/(?P<billing_account>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_folder_path(
+ folder: str,
+ ) -> str:
+ """Returns a fully-qualified folder string."""
+ return "folders/{folder}".format(
+ folder=folder,
+ )
+
+ @staticmethod
+ def parse_common_folder_path(path: str) -> Dict[str, str]:
+ """Parse a folder path into its component segments."""
+ m = re.match(r"^folders/(?P<folder>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_organization_path(
+ organization: str,
+ ) -> str:
+ """Returns a fully-qualified organization string."""
+ return "organizations/{organization}".format(
+ organization=organization,
+ )
+
+ @staticmethod
+ def parse_common_organization_path(path: str) -> Dict[str, str]:
+        """Parse an organization path into its component segments."""
+ m = re.match(r"^organizations/(?P<organization>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_project_path(
+ project: str,
+ ) -> str:
+ """Returns a fully-qualified project string."""
+ return "projects/{project}".format(
+ project=project,
+ )
+
+ @staticmethod
+ def parse_common_project_path(path: str) -> Dict[str, str]:
+ """Parse a project path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_location_path(
+ project: str,
+ location: str,
+ ) -> str:
+ """Returns a fully-qualified location string."""
+ return "projects/{project}/locations/{location}".format(
+ project=project,
+ location=location,
+ )
+
+ @staticmethod
+ def parse_common_location_path(path: str) -> Dict[str, str]:
+ """Parse a location path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[client_options_lib.ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ use_client_cert = os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false")
+ use_mtls_endpoint = os.getenv("GOOGLE_API_USE_MTLS_ENDPOINT", "auto")
+ if use_client_cert not in ("true", "false"):
+ raise ValueError(
+ "Environment variable `GOOGLE_API_USE_CLIENT_CERTIFICATE` must be either `true` or `false`"
+ )
+ if use_mtls_endpoint not in ("auto", "never", "always"):
+ raise MutualTLSChannelError(
+ "Environment variable `GOOGLE_API_USE_MTLS_ENDPOINT` must be `never`, `auto` or `always`"
+ )
+
+ # Figure out the client cert source to use.
+ client_cert_source = None
+ if use_client_cert == "true":
+ if client_options.client_cert_source:
+ client_cert_source = client_options.client_cert_source
+ elif mtls.has_default_client_cert_source():
+ client_cert_source = mtls.default_client_cert_source()
+
+ # Figure out which api endpoint to use.
+ if client_options.api_endpoint is not None:
+ api_endpoint = client_options.api_endpoint
+ elif use_mtls_endpoint == "always" or (
+ use_mtls_endpoint == "auto" and client_cert_source
+ ):
+ api_endpoint = cls.DEFAULT_MTLS_ENDPOINT
+ else:
+ api_endpoint = cls.DEFAULT_ENDPOINT
+
+ return api_endpoint, client_cert_source
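The endpoint-selection order documented in `get_mtls_endpoint_and_cert_source` above can be isolated into a small pure function. This is a hedged sketch, not the generated implementation: environment lookups are replaced by explicit parameters so the decision logic is testable on its own.

```python
from typing import Optional


def select_api_endpoint(
    api_endpoint_override: Optional[str],
    use_mtls_endpoint: str,  # "auto", "never", or "always"
    have_client_cert: bool,
    default_endpoint: str = "redis.googleapis.com",
    default_mtls_endpoint: str = "redis.mtls.googleapis.com",
) -> str:
    # Decision order: an explicit override always wins; then "always",
    # or "auto" with a client certificate present, selects the mTLS
    # endpoint; otherwise the regular endpoint is used.
    if use_mtls_endpoint not in ("auto", "never", "always"):
        raise ValueError("use_mtls_endpoint must be 'never', 'auto' or 'always'")
    if api_endpoint_override is not None:
        return api_endpoint_override
    if use_mtls_endpoint == "always" or (
        use_mtls_endpoint == "auto" and have_client_cert
    ):
        return default_mtls_endpoint
    return default_endpoint
```

In the generated client, `api_endpoint_override` corresponds to `client_options.api_endpoint`, `use_mtls_endpoint` to the `GOOGLE_API_USE_MTLS_ENDPOINT` environment variable, and `have_client_cert` to whether a client cert source was resolved.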
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Optional[Union[str, CloudRedisTransport]] = None,
+ client_options: Optional[Union[client_options_lib.ClientOptions, dict]] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the cloud redis client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, CloudRedisTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (Optional[Union[google.api_core.client_options.ClientOptions, dict]]): Custom options for the
+ client. It won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ if isinstance(client_options, dict):
+ client_options = client_options_lib.from_dict(client_options)
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ client_options = cast(client_options_lib.ClientOptions, client_options)
+
+ api_endpoint, client_cert_source_func = self.get_mtls_endpoint_and_cert_source(
+ client_options
+ )
+
+ api_key_value = getattr(client_options, "api_key", None)
+ if api_key_value and credentials:
+ raise ValueError(
+ "client_options.api_key and credentials are mutually exclusive"
+ )
+
+ # Save or instantiate the transport.
+ # Ordinarily, we provide the transport, but allowing a custom transport
+ # instance provides an extensibility point for unusual situations.
+ if isinstance(transport, CloudRedisTransport):
+ # transport is a CloudRedisTransport instance.
+ if credentials or client_options.credentials_file or api_key_value:
+ raise ValueError(
+ "When providing a transport instance, "
+ "provide its credentials directly."
+ )
+ if client_options.scopes:
+ raise ValueError(
+ "When providing a transport instance, provide its scopes "
+ "directly."
+ )
+ self._transport = transport
+ else:
+ import google.auth._default # type: ignore
+
+ if api_key_value and hasattr(
+ google.auth._default, "get_api_key_credentials"
+ ):
+ credentials = google.auth._default.get_api_key_credentials(
+ api_key_value
+ )
+
+ Transport = type(self).get_transport_class(transport)
+ self._transport = Transport(
+ credentials=credentials,
+ credentials_file=client_options.credentials_file,
+ host=api_endpoint,
+ scopes=client_options.scopes,
+ client_cert_source_for_mtls=client_cert_source_func,
+ quota_project_id=client_options.quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=True,
+ api_audience=client_options.api_audience,
+ )
+
+ def list_instances(
+ self,
+ request: Optional[Union[cloud_redis.ListInstancesRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListInstancesPager:
+ r"""Lists all Redis instances owned by a project in either the
+ specified location (region) or all locations.
+
+ The location should have the following format:
+
+ - ``projects/{project_id}/locations/{location_id}``
+
+ If ``location_id`` is specified as ``-`` (wildcard), then all
+ regions available to the project are queried, and the results
+ are aggregated.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ def sample_list_instances():
+ # Create a client
+ client = redis_v1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ request = redis_v1.ListInstancesRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_instances(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1.types.ListInstancesRequest, dict]):
+ The request object. Request for
+ [ListInstances][google.cloud.redis.v1.CloudRedis.ListInstances].
+ parent (str):
+ Required. The resource name of the instance location
+ using the form:
+ ``projects/{project_id}/locations/{location_id}`` where
+ ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.redis_v1.services.cloud_redis.pagers.ListInstancesPager:
+ Response for
+ [ListInstances][google.cloud.redis.v1.CloudRedis.ListInstances].
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.ListInstancesRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.ListInstancesRequest):
+ request = cloud_redis.ListInstancesRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_instances]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListInstancesPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_instance(
+ self,
+ request: Optional[Union[cloud_redis.GetInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> cloud_redis.Instance:
+ r"""Gets the details of a specific Redis instance.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ def sample_get_instance():
+ # Create a client
+ client = redis_v1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ request = redis_v1.GetInstanceRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_instance(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1.types.GetInstanceRequest, dict]):
+ The request object. Request for
+ [GetInstance][google.cloud.redis.v1.CloudRedis.GetInstance].
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.redis_v1.types.Instance:
+ A Memorystore for Redis instance.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.GetInstanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.GetInstanceRequest):
+ request = cloud_redis.GetInstanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_instance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_instance_auth_string(
+ self,
+ request: Optional[Union[cloud_redis.GetInstanceAuthStringRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> cloud_redis.InstanceAuthString:
+        r"""Gets the AUTH string for a Redis instance. If AUTH is
+        not enabled for the instance, the response will be empty.
+        This information is not included in the details returned
+        to GetInstance.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ def sample_get_instance_auth_string():
+ # Create a client
+ client = redis_v1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ request = redis_v1.GetInstanceAuthStringRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_instance_auth_string(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1.types.GetInstanceAuthStringRequest, dict]):
+ The request object. Request for
+ [GetInstanceAuthString][google.cloud.redis.v1.CloudRedis.GetInstanceAuthString].
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.redis_v1.types.InstanceAuthString:
+ Instance AUTH string details.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.GetInstanceAuthStringRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.GetInstanceAuthStringRequest):
+ request = cloud_redis.GetInstanceAuthStringRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_instance_auth_string]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def create_instance(
+ self,
+ request: Optional[Union[cloud_redis.CreateInstanceRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ instance_id: Optional[str] = None,
+ instance: Optional[cloud_redis.Instance] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Creates a Redis instance based on the specified tier and memory
+ size.
+
+ By default, the instance is accessible from the project's
+ `default network <https://cloud.google.com/vpc/docs/vpc>`__.
+
+ The creation is executed asynchronously and callers may check
+ the returned operation to track its progress. Once the operation
+ is completed the Redis instance will be fully functional.
+ Completed longrunning.Operation will contain the new instance
+ object in the response field.
+
+ The returned operation is automatically deleted after a few
+ hours, so there is no need to call DeleteOperation.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ def sample_create_instance():
+ # Create a client
+ client = redis_v1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ instance = redis_v1.Instance()
+ instance.name = "name_value"
+ instance.tier = "STANDARD_HA"
+ instance.memory_size_gb = 1499
+
+ request = redis_v1.CreateInstanceRequest(
+ parent="parent_value",
+ instance_id="instance_id_value",
+ instance=instance,
+ )
+
+ # Make the request
+ operation = client.create_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1.types.CreateInstanceRequest, dict]):
+ The request object. Request for
+ [CreateInstance][google.cloud.redis.v1.CloudRedis.CreateInstance].
+ parent (str):
+ Required. The resource name of the instance location
+ using the form:
+ ``projects/{project_id}/locations/{location_id}`` where
+ ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ instance_id (str):
+ Required. The logical name of the Redis instance in the
+ customer project with the following restrictions:
+
+ - Must contain only lowercase letters, numbers, and
+ hyphens.
+ - Must start with a letter.
+ - Must be between 1-40 characters.
+ - Must end with a number or a letter.
+ - Must be unique within the customer project / location
+
+ This corresponds to the ``instance_id`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ instance (google.cloud.redis_v1.types.Instance):
+ Required. A Redis [Instance] resource
+ This corresponds to the ``instance`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent, instance_id, instance])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.CreateInstanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.CreateInstanceRequest):
+ request = cloud_redis.CreateInstanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+ if instance_id is not None:
+ request.instance_id = instance_id
+ if instance is not None:
+ request.instance = instance
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.create_instance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=cloud_redis.OperationMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
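Every RPC wrapper in this diff repeats the same mutual-exclusion check between `request` and the flattened keyword arguments. A standalone sketch of that rule, for readers skimming the patch (the helper name `check_flattened` is illustrative, not part of the generated client):

```python
def check_flattened(request, *flattened):
    # Mirrors the generated guard: passing a request object together
    # with any flattened field argument is rejected up front.
    if request is not None and any(f is not None for f in flattened):
        raise ValueError(
            "If the `request` argument is set, then none of "
            "the individual field arguments should be set."
        )

# Either style alone is fine:
check_flattened(None, "projects/p/locations/l", "my-instance")
check_flattened({"parent": "projects/p/locations/l"})
```

The guard runs before the request object is built, so the mistake surfaces as an immediate `ValueError` rather than a confusing half-populated request.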
+ def update_instance(
+ self,
+ request: Optional[Union[cloud_redis.UpdateInstanceRequest, dict]] = None,
+ *,
+ update_mask: Optional[field_mask_pb2.FieldMask] = None,
+ instance: Optional[cloud_redis.Instance] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Updates the metadata and configuration of a specific
+ Redis instance.
+ Completed longrunning.Operation will contain the new
+ instance object in the response field. The returned
+ operation is automatically deleted after a few hours, so
+ there is no need to call DeleteOperation.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ def sample_update_instance():
+ # Create a client
+ client = redis_v1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ instance = redis_v1.Instance()
+ instance.name = "name_value"
+ instance.tier = "STANDARD_HA"
+ instance.memory_size_gb = 1499
+
+ request = redis_v1.UpdateInstanceRequest(
+ instance=instance,
+ )
+
+ # Make the request
+ operation = client.update_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1.types.UpdateInstanceRequest, dict]):
+ The request object. Request for
+ [UpdateInstance][google.cloud.redis.v1.CloudRedis.UpdateInstance].
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Required. Mask of fields to update. At least one path
+ must be supplied in this field. The elements of the
+ repeated paths field may only include these fields from
+ [Instance][google.cloud.redis.v1.Instance]:
+
+ - ``displayName``
+ - ``labels``
+ - ``memorySizeGb``
+ - ``redisConfig``
+ - ``replica_count``
+
+ This corresponds to the ``update_mask`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ instance (google.cloud.redis_v1.types.Instance):
+ Required. Update description. Only fields specified in
+ update_mask are updated.
+
+ This corresponds to the ``instance`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([update_mask, instance])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.UpdateInstanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.UpdateInstanceRequest):
+ request = cloud_redis.UpdateInstanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if update_mask is not None:
+ request.update_mask = update_mask
+ if instance is not None:
+ request.instance = instance
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.update_instance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("instance.name", request.instance.name),)
+ ),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=cloud_redis.OperationMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
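The `gapic_v1.routing_header.to_grpc_metadata` calls above turn routing fields such as `instance.name` into a single `x-goog-request-params` metadata entry. A simplified illustration of that shape (not the library's exact implementation, which handles additional edge cases):

```python
from urllib.parse import quote

def routing_metadata(existing, params):
    # Simplified sketch: each routing key/value pair is percent-encoded
    # and joined into one x-goog-request-params header entry, which is
    # appended to the caller-supplied metadata tuple.
    value = "&".join(f"{k}={quote(str(v), safe='')}" for k, v in params.items())
    return tuple(existing) + (("x-goog-request-params", value),)

md = routing_metadata((), {"instance.name": "projects/p/locations/l/instances/i"})
# md[-1] == ("x-goog-request-params",
#            "instance.name=projects%2Fp%2Flocations%2Fl%2Finstances%2Fi")
```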
+ def upgrade_instance(
+ self,
+ request: Optional[Union[cloud_redis.UpgradeInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ redis_version: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Upgrades Redis instance to the newer Redis version
+ specified in the request.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ def sample_upgrade_instance():
+ # Create a client
+ client = redis_v1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ request = redis_v1.UpgradeInstanceRequest(
+ name="name_value",
+ redis_version="redis_version_value",
+ )
+
+ # Make the request
+ operation = client.upgrade_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1.types.UpgradeInstanceRequest, dict]):
+ The request object. Request for
+ [UpgradeInstance][google.cloud.redis.v1.CloudRedis.UpgradeInstance].
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ redis_version (str):
+ Required. Specifies the target
+ version of Redis software to upgrade to.
+
+ This corresponds to the ``redis_version`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, redis_version])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.UpgradeInstanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.UpgradeInstanceRequest):
+ request = cloud_redis.UpgradeInstanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if redis_version is not None:
+ request.redis_version = redis_version
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.upgrade_instance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=cloud_redis.OperationMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def import_instance(
+ self,
+ request: Optional[Union[cloud_redis.ImportInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ input_config: Optional[cloud_redis.InputConfig] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Import a Redis RDB snapshot file from Cloud Storage
+ into a Redis instance.
+ Redis may stop serving during this operation. Instance
+ state will be IMPORTING for entire operation. When
+ complete, the instance will contain only data from the
+ imported file.
+
+ The returned operation is automatically deleted after a
+ few hours, so there is no need to call DeleteOperation.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ def sample_import_instance():
+ # Create a client
+ client = redis_v1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ input_config = redis_v1.InputConfig()
+ input_config.gcs_source.uri = "uri_value"
+
+ request = redis_v1.ImportInstanceRequest(
+ name="name_value",
+ input_config=input_config,
+ )
+
+ # Make the request
+ operation = client.import_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1.types.ImportInstanceRequest, dict]):
+ The request object. Request for
+ [Import][google.cloud.redis.v1.CloudRedis.ImportInstance].
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ input_config (google.cloud.redis_v1.types.InputConfig):
+ Required. Specify data to be
+ imported.
+
+ This corresponds to the ``input_config`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, input_config])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.ImportInstanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.ImportInstanceRequest):
+ request = cloud_redis.ImportInstanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if input_config is not None:
+ request.input_config = input_config
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.import_instance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=cloud_redis.OperationMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def export_instance(
+ self,
+ request: Optional[Union[cloud_redis.ExportInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ output_config: Optional[cloud_redis.OutputConfig] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Export Redis instance data into a Redis RDB format
+ file in Cloud Storage.
+ Redis will continue serving during this operation.
+ The returned operation is automatically deleted after a
+ few hours, so there is no need to call DeleteOperation.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ def sample_export_instance():
+ # Create a client
+ client = redis_v1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ output_config = redis_v1.OutputConfig()
+ output_config.gcs_destination.uri = "uri_value"
+
+ request = redis_v1.ExportInstanceRequest(
+ name="name_value",
+ output_config=output_config,
+ )
+
+ # Make the request
+ operation = client.export_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1.types.ExportInstanceRequest, dict]):
+ The request object. Request for
+ [Export][google.cloud.redis.v1.CloudRedis.ExportInstance].
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ output_config (google.cloud.redis_v1.types.OutputConfig):
+ Required. Specify data to be
+ exported.
+
+ This corresponds to the ``output_config`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, output_config])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.ExportInstanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.ExportInstanceRequest):
+ request = cloud_redis.ExportInstanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if output_config is not None:
+ request.output_config = output_config
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.export_instance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=cloud_redis.OperationMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def failover_instance(
+ self,
+ request: Optional[Union[cloud_redis.FailoverInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ data_protection_mode: Optional[
+ cloud_redis.FailoverInstanceRequest.DataProtectionMode
+ ] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Initiates a failover of the primary node to current
+ replica node for a specific STANDARD tier Cloud
+ Memorystore for Redis instance.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ def sample_failover_instance():
+ # Create a client
+ client = redis_v1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ request = redis_v1.FailoverInstanceRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.failover_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1.types.FailoverInstanceRequest, dict]):
+ The request object. Request for
+ [Failover][google.cloud.redis.v1.CloudRedis.FailoverInstance].
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ data_protection_mode (google.cloud.redis_v1.types.FailoverInstanceRequest.DataProtectionMode):
+ Optional. Available data protection modes that the user
+ can choose. If it's unspecified, data protection mode
+ will be LIMITED_DATA_LOSS by default.
+
+ This corresponds to the ``data_protection_mode`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, data_protection_mode])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.FailoverInstanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.FailoverInstanceRequest):
+ request = cloud_redis.FailoverInstanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if data_protection_mode is not None:
+ request.data_protection_mode = data_protection_mode
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.failover_instance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=cloud_redis.OperationMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def delete_instance(
+ self,
+ request: Optional[Union[cloud_redis.DeleteInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Deletes a specific Redis instance. Instance stops
+ serving and data is deleted.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ def sample_delete_instance():
+ # Create a client
+ client = redis_v1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ request = redis_v1.DeleteInstanceRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.delete_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1.types.DeleteInstanceRequest, dict]):
+ The request object. Request for
+ [DeleteInstance][google.cloud.redis.v1.CloudRedis.DeleteInstance].
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.protobuf.empty_pb2.Empty` A generic empty message that you can re-use to avoid defining duplicated
+ empty messages in your APIs. A typical example is to
+ use it as the request or the response type of an API
+ method. For instance:
+
+ service Foo {
+ rpc Bar(google.protobuf.Empty) returns
+ (google.protobuf.Empty);
+
+ }
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.DeleteInstanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.DeleteInstanceRequest):
+ request = cloud_redis.DeleteInstanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.delete_instance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ empty_pb2.Empty,
+ metadata_type=cloud_redis.OperationMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def reschedule_maintenance(
+ self,
+ request: Optional[Union[cloud_redis.RescheduleMaintenanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ reschedule_type: Optional[
+ cloud_redis.RescheduleMaintenanceRequest.RescheduleType
+ ] = None,
+ schedule_time: Optional[timestamp_pb2.Timestamp] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Reschedule maintenance for a given instance in a
+ given project and location.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1
+
+ def sample_reschedule_maintenance():
+ # Create a client
+ client = redis_v1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ request = redis_v1.RescheduleMaintenanceRequest(
+ name="name_value",
+ reschedule_type="SPECIFIC_TIME",
+ )
+
+ # Make the request
+ operation = client.reschedule_maintenance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1.types.RescheduleMaintenanceRequest, dict]):
+ The request object. Request for
+ [RescheduleMaintenance][google.cloud.redis.v1.CloudRedis.RescheduleMaintenance].
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ reschedule_type (google.cloud.redis_v1.types.RescheduleMaintenanceRequest.RescheduleType):
+ Required. If reschedule type is SPECIFIC_TIME, must set
+ up schedule_time as well.
+
+ This corresponds to the ``reschedule_type`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ schedule_time (google.protobuf.timestamp_pb2.Timestamp):
+ Optional. Timestamp when the maintenance shall be
+ rescheduled to if reschedule_type=SPECIFIC_TIME, in RFC
+ 3339 format, for example ``2012-11-15T16:19:00.094Z``.
+
+ This corresponds to the ``schedule_time`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, reschedule_type, schedule_time])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.RescheduleMaintenanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.RescheduleMaintenanceRequest):
+ request = cloud_redis.RescheduleMaintenanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if reschedule_type is not None:
+ request.reschedule_type = reschedule_type
+ if schedule_time is not None:
+ request.schedule_time = schedule_time
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.reschedule_maintenance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=cloud_redis.OperationMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def __enter__(self) -> "CloudRedisClient":
+ return self
+
+ def __exit__(self, type, value, traceback):
+ """Releases underlying transport's resources.
+
+ .. warning::
+ ONLY use as a context manager if the transport is NOT shared
+ with other clients! Exiting the with block will CLOSE the transport
+ and may cause errors in other clients!
+ """
+ self.transport.close()
+
+ def list_operations(
+ self,
+ request: Optional[operations_pb2.ListOperationsRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.ListOperationsResponse:
+ r"""Lists operations that match the specified filter in the request.
+
+ Args:
+ request (:class:`~.operations_pb2.ListOperationsRequest`):
+ The request object. Request message for
+ `ListOperations` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.ListOperationsResponse:
+ Response message for ``ListOperations`` method.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.ListOperationsRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._transport.list_operations,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_operation(
+ self,
+ request: Optional[operations_pb2.GetOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Gets the latest state of a long-running operation.
+
+ Args:
+ request (:class:`~.operations_pb2.GetOperationRequest`):
+ The request object. Request message for
+ `GetOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.Operation:
+ An ``Operation`` object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.GetOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._transport.get_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def delete_operation(
+ self,
+ request: Optional[operations_pb2.DeleteOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> None:
+ r"""Deletes a long-running operation.
+
+ This method indicates that the client is no longer interested
+ in the operation result. It does not cancel the operation.
+ If the server doesn't support this method, it returns
+ `google.rpc.Code.UNIMPLEMENTED`.
+
+ Args:
+ request (:class:`~.operations_pb2.DeleteOperationRequest`):
+ The request object. Request message for
+ `DeleteOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ None
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.DeleteOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._transport.delete_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ def cancel_operation(
+ self,
+ request: Optional[operations_pb2.CancelOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> None:
+ r"""Starts asynchronous cancellation on a long-running operation.
+
+ The server makes a best effort to cancel the operation, but success
+ is not guaranteed. If the server doesn't support this method, it returns
+ `google.rpc.Code.UNIMPLEMENTED`.
+
+ Args:
+ request (:class:`~.operations_pb2.CancelOperationRequest`):
+ The request object. Request message for
+ `CancelOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ None
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.CancelOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._transport.cancel_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ def get_location(
+ self,
+ request: Optional[locations_pb2.GetLocationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> locations_pb2.Location:
+ r"""Gets information about a location.
+
+ Args:
+            request (:class:`~.locations_pb2.GetLocationRequest`):
+ The request object. Request message for
+ `GetLocation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+                ~.locations_pb2.Location:
+ Location object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = locations_pb2.GetLocationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._transport.get_location,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_locations(
+ self,
+ request: Optional[locations_pb2.ListLocationsRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> locations_pb2.ListLocationsResponse:
+ r"""Lists information about the supported locations for this service.
+
+ Args:
+            request (:class:`~.locations_pb2.ListLocationsRequest`):
+ The request object. Request message for
+ `ListLocations` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+                ~.locations_pb2.ListLocationsResponse:
+ Response message for ``ListLocations`` method.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = locations_pb2.ListLocationsRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._transport.list_locations,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("CloudRedisClient",)
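The request-coercion logic in `reschedule_maintenance` above (mutual exclusion of a prebuilt `request` and flattened keyword arguments, then applying the flattened fields) can be sketched independently of the generated client. `FakeRequest` and `build_request` below are illustrative stand-ins, not part of google-cloud-redis:

```python
# Sketch of the GAPIC "request vs. flattened arguments" pattern.
# FakeRequest stands in for the proto-plus request type.
from dataclasses import dataclass
from typing import Optional


@dataclass
class FakeRequest:
    name: Optional[str] = None
    reschedule_type: Optional[str] = None


def build_request(request=None, *, name=None, reschedule_type=None):
    # Reject mixing a prebuilt request object with flattened keyword arguments.
    if request is not None and any([name, reschedule_type]):
        raise ValueError(
            "If the `request` argument is set, then none of "
            "the individual field arguments should be set."
        )
    if request is None:
        request = FakeRequest()
    # Apply any flattened fields onto the request object.
    if name is not None:
        request.name = name
    if reschedule_type is not None:
        request.reschedule_type = reschedule_type
    return request


req = build_request(
    name="projects/p/locations/l/instances/i",
    reschedule_type="IMMEDIATE_MAINTENANCE",
)
print(req.name)
```

The same shape appears in every flattened-parameter method of the generated client; only the field names differ.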
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/pagers.py b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/pagers.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/pagers.py
@@ -0,0 +1,155 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import (
+ Any,
+ AsyncIterator,
+ Awaitable,
+ Callable,
+ Iterator,
+ Optional,
+ Sequence,
+ Tuple,
+)
+
+from google.cloud.redis_v1.types import cloud_redis
+
+
+class ListInstancesPager:
+ """A pager for iterating through ``list_instances`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.redis_v1.types.ListInstancesResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``instances`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListInstances`` requests and continue to iterate
+ through the ``instances`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.redis_v1.types.ListInstancesResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., cloud_redis.ListInstancesResponse],
+ request: cloud_redis.ListInstancesRequest,
+ response: cloud_redis.ListInstancesResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.redis_v1.types.ListInstancesRequest):
+ The initial request object.
+ response (google.cloud.redis_v1.types.ListInstancesResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = cloud_redis.ListInstancesRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[cloud_redis.ListInstancesResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[cloud_redis.Instance]:
+ for page in self.pages:
+ yield from page.instances
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListInstancesAsyncPager:
+ """A pager for iterating through ``list_instances`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.redis_v1.types.ListInstancesResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``instances`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListInstances`` requests and continue to iterate
+ through the ``instances`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.redis_v1.types.ListInstancesResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[cloud_redis.ListInstancesResponse]],
+ request: cloud_redis.ListInstancesRequest,
+ response: cloud_redis.ListInstancesResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+        """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.redis_v1.types.ListInstancesRequest):
+ The initial request object.
+ response (google.cloud.redis_v1.types.ListInstancesResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = cloud_redis.ListInstancesRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[cloud_redis.ListInstancesResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[cloud_redis.Instance]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.instances:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
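The synchronous pager above can be exercised without a real RPC by driving it with a stub method. The `Page`/`Pager` classes below are a minimal sketch of `ListInstancesPager`'s token-following loop, not the real proto types:

```python
# Minimal sketch of the synchronous pager pattern: yield the current page,
# then follow next_page_token until the service stops returning one.
class Page:
    def __init__(self, instances, next_page_token=""):
        self.instances = instances
        self.next_page_token = next_page_token


PAGES = {"": Page(["a", "b"], "tok1"), "tok1": Page(["c"], "")}


def stub_method(page_token=""):
    # Stands in for the wrapped list_instances RPC.
    return PAGES[page_token]


class Pager:
    def __init__(self, method, response):
        self._method = method
        self._response = response

    @property
    def pages(self):
        yield self._response
        while self._response.next_page_token:
            self._response = self._method(self._response.next_page_token)
            yield self._response

    def __iter__(self):
        # Flatten the per-page item lists into one iterator.
        for page in self.pages:
            yield from page.instances


items = list(Pager(stub_method, stub_method()))
print(items)  # -> ['a', 'b', 'c']
```

Iterating the pager transparently issues follow-up requests, which is why only the most recent response is retained for attribute lookup.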
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/transports/__init__.py b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/transports/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/transports/__init__.py
@@ -0,0 +1,36 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+from typing import Dict, Type
+
+from .base import CloudRedisTransport
+from .grpc import CloudRedisGrpcTransport
+from .grpc_asyncio import CloudRedisGrpcAsyncIOTransport
+from .rest import CloudRedisRestInterceptor, CloudRedisRestTransport
+
+# Compile a registry of transports.
+_transport_registry = OrderedDict() # type: Dict[str, Type[CloudRedisTransport]]
+_transport_registry["grpc"] = CloudRedisGrpcTransport
+_transport_registry["grpc_asyncio"] = CloudRedisGrpcAsyncIOTransport
+_transport_registry["rest"] = CloudRedisRestTransport
+
+__all__ = (
+ "CloudRedisTransport",
+ "CloudRedisGrpcTransport",
+ "CloudRedisGrpcAsyncIOTransport",
+ "CloudRedisRestTransport",
+ "CloudRedisRestInterceptor",
+)
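The transport registry above maps string labels to transport classes so a client can resolve a `transport="grpc"` argument into a concrete class. A minimal sketch of that lookup, with empty placeholder classes instead of the real transports:

```python
# Sketch of the transport-registry pattern from transports/__init__.py.
from collections import OrderedDict


class GrpcTransport: ...
class RestTransport: ...


_registry = OrderedDict()
_registry["grpc"] = GrpcTransport
_registry["rest"] = RestTransport


def get_transport_class(label="grpc"):
    # A client would call this to turn its `transport` argument into a class,
    # raising KeyError for an unknown label.
    return _registry[label]


print(get_transport_class("rest").__name__)  # -> RestTransport
```

Using an `OrderedDict` keeps the first-registered transport as the deterministic default when callers iterate the registry.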
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/transports/base.py b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/transports/base.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/transports/base.py
@@ -0,0 +1,361 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import abc
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Union
+
+import google.api_core
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1, operations_v1
+from google.api_core import retry as retries
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.cloud.location import locations_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.redis_v1 import gapic_version as package_version
+from google.cloud.redis_v1.types import cloud_redis
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+class CloudRedisTransport(abc.ABC):
+ """Abstract transport class for CloudRedis."""
+
+ AUTH_SCOPES = ("https://www.googleapis.com/auth/cloud-platform",)
+
+ DEFAULT_HOST: str = "redis.googleapis.com"
+
+ def __init__(
+ self,
+ *,
+ host: str = DEFAULT_HOST,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ **kwargs,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+ scopes (Optional[Sequence[str]]): A list of scopes.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ """
+
+ scopes_kwargs = {"scopes": scopes, "default_scopes": self.AUTH_SCOPES}
+
+ # Save the scopes.
+ self._scopes = scopes
+
+ # If no credentials are provided, then determine the appropriate
+ # defaults.
+ if credentials and credentials_file:
+ raise core_exceptions.DuplicateCredentialArgs(
+ "'credentials_file' and 'credentials' are mutually exclusive"
+ )
+
+ if credentials_file is not None:
+ credentials, _ = google.auth.load_credentials_from_file(
+ credentials_file, **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ elif credentials is None:
+ credentials, _ = google.auth.default(
+ **scopes_kwargs, quota_project_id=quota_project_id
+ )
+        # Don't apply the audience if the credentials file was passed by the user.
+ if hasattr(credentials, "with_gdch_audience"):
+ credentials = credentials.with_gdch_audience(
+ api_audience if api_audience else host
+ )
+
+ # If the credentials are service account credentials, then always try to use self signed JWT.
+ if (
+ always_use_jwt_access
+ and isinstance(credentials, service_account.Credentials)
+ and hasattr(service_account.Credentials, "with_always_use_jwt_access")
+ ):
+ credentials = credentials.with_always_use_jwt_access(True)
+
+ # Save the credentials.
+ self._credentials = credentials
+
+ # Save the hostname. Default to port 443 (HTTPS) if none is specified.
+ if ":" not in host:
+ host += ":443"
+ self._host = host
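The host-normalization step at the end of the constructor above can be isolated as a one-liner. This is a sketch of the same check, not an export of the library:

```python
# Append the default HTTPS port when the host string has no explicit port.
def normalize_host(host: str) -> str:
    if ":" not in host:
        host += ":443"
    return host


print(normalize_host("redis.googleapis.com"))  # -> redis.googleapis.com:443
print(normalize_host("localhost:8080"))        # -> localhost:8080
```

Note that the check treats any colon as evidence of an explicit port, so a bare IPv6 literal would be left untouched; callers supplying IPv6 hosts should include the port themselves.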
+
+ def _prep_wrapped_messages(self, client_info):
+ # Precompute the wrapped methods.
+ self._wrapped_methods = {
+ self.list_instances: gapic_v1.method.wrap_method(
+ self.list_instances,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.get_instance: gapic_v1.method.wrap_method(
+ self.get_instance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.get_instance_auth_string: gapic_v1.method.wrap_method(
+ self.get_instance_auth_string,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.create_instance: gapic_v1.method.wrap_method(
+ self.create_instance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.update_instance: gapic_v1.method.wrap_method(
+ self.update_instance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.upgrade_instance: gapic_v1.method.wrap_method(
+ self.upgrade_instance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.import_instance: gapic_v1.method.wrap_method(
+ self.import_instance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.export_instance: gapic_v1.method.wrap_method(
+ self.export_instance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.failover_instance: gapic_v1.method.wrap_method(
+ self.failover_instance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.delete_instance: gapic_v1.method.wrap_method(
+ self.delete_instance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.reschedule_maintenance: gapic_v1.method.wrap_method(
+ self.reschedule_maintenance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ }
+
+ def close(self):
+ """Closes resources associated with the transport.
+
+ .. warning::
+ Only call this method if the transport is NOT shared
+ with other clients - this may cause errors in other clients!
+ """
+ raise NotImplementedError()
+
+ @property
+ def operations_client(self):
+ """Return the client designed to process long-running operations."""
+ raise NotImplementedError()
+
+ @property
+ def list_instances(
+ self,
+ ) -> Callable[
+ [cloud_redis.ListInstancesRequest],
+ Union[
+ cloud_redis.ListInstancesResponse,
+ Awaitable[cloud_redis.ListInstancesResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.GetInstanceRequest],
+ Union[cloud_redis.Instance, Awaitable[cloud_redis.Instance]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_instance_auth_string(
+ self,
+ ) -> Callable[
+ [cloud_redis.GetInstanceAuthStringRequest],
+ Union[
+ cloud_redis.InstanceAuthString, Awaitable[cloud_redis.InstanceAuthString]
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def create_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.CreateInstanceRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def update_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.UpdateInstanceRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def upgrade_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.UpgradeInstanceRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def import_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.ImportInstanceRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def export_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.ExportInstanceRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def failover_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.FailoverInstanceRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def delete_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.DeleteInstanceRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def reschedule_maintenance(
+ self,
+ ) -> Callable[
+ [cloud_redis.RescheduleMaintenanceRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_operations(
+ self,
+ ) -> Callable[
+ [operations_pb2.ListOperationsRequest],
+ Union[
+ operations_pb2.ListOperationsResponse,
+ Awaitable[operations_pb2.ListOperationsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[
+ [operations_pb2.GetOperationRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def cancel_operation(
+ self,
+ ) -> Callable[[operations_pb2.CancelOperationRequest], None,]:
+ raise NotImplementedError()
+
+ @property
+ def delete_operation(
+ self,
+ ) -> Callable[[operations_pb2.DeleteOperationRequest], None,]:
+ raise NotImplementedError()
+
+ @property
+ def get_location(
+ self,
+ ) -> Callable[
+ [locations_pb2.GetLocationRequest],
+ Union[locations_pb2.Location, Awaitable[locations_pb2.Location]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_locations(
+ self,
+ ) -> Callable[
+ [locations_pb2.ListLocationsRequest],
+ Union[
+ locations_pb2.ListLocationsResponse,
+ Awaitable[locations_pb2.ListLocationsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def kind(self) -> str:
+ raise NotImplementedError()
+
+
+__all__ = ("CloudRedisTransport",)
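The abstract base above declares each RPC as a property that raises `NotImplementedError`, and each concrete transport overrides the property to return a callable. A minimal sketch of that shape, with illustrative names rather than the real gRPC/REST transports:

```python
# Sketch of the abstract-transport pattern from CloudRedisTransport.
import abc


class BaseTransport(abc.ABC):
    DEFAULT_HOST = "redis.googleapis.com"

    @property
    def list_instances(self):
        # Concrete transports must override this with a real callable.
        raise NotImplementedError()


class InMemoryTransport(BaseTransport):
    @property
    def list_instances(self):
        # A concrete transport returns a callable that performs the RPC;
        # here it just echoes the request.
        return lambda request: {"instances": [], "request": request}


t = InMemoryTransport()
print(t.list_instances({"parent": "projects/p/locations/l"})["instances"])  # -> []
```

Exposing RPCs as properties lets the client wrap each returned callable once (retry, timeout, metadata) in `_prep_wrapped_methods` without caring which transport is underneath.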
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/transports/grpc.py b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/transports/grpc.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/transports/grpc.py
@@ -0,0 +1,719 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers, operations_v1
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.cloud.location import locations_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+import grpc # type: ignore
+
+from google.cloud.redis_v1.types import cloud_redis
+
+from .base import DEFAULT_CLIENT_INFO, CloudRedisTransport
+
+
+class CloudRedisGrpcTransport(CloudRedisTransport):
+ """gRPC backend transport for CloudRedis.
+
+ Configures and manages Cloud Memorystore for Redis instances
+
+ Google Cloud Memorystore for Redis v1
+
+ The ``redis.googleapis.com`` service implements the Google Cloud
+ Memorystore for Redis API and defines the following resource model
+ for managing Redis instances:
+
+ - The service works with a collection of cloud projects, named:
+ ``/projects/*``
+ - Each project has a collection of available locations, named:
+ ``/locations/*``
+ - Each location has a collection of Redis instances, named:
+ ``/instances/*``
+ - As such, Redis instances are resources of the form:
+ ``/projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+
+ Note that location_id must be referring to a GCP ``region``; for
+ example:
+
+ - ``projects/redpepper-1290/locations/us-central1/instances/my-redis``
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _stubs: Dict[str, Callable]
+
+ def __init__(
+ self,
+ *,
+ host: str = "redis.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[grpc.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ channel (Optional[grpc.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+ self._operations_client: Optional[operations_v1.OperationsClient] = None
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "redis.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> grpc.Channel:
+ """Create and return a gRPC channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ grpc.Channel: A gRPC channel object.
+
+ Raises:
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+
+ return grpc_helpers.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ @property
+ def grpc_channel(self) -> grpc.Channel:
+ """Return the channel designed to connect to this service."""
+ return self._grpc_channel
+
+ @property
+ def operations_client(self) -> operations_v1.OperationsClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Quick check: Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ self._operations_client = operations_v1.OperationsClient(self.grpc_channel)
+
+ # Return the client from cache.
+ return self._operations_client
+
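The `operations_client` property above and the `_stubs` dictionary used by the RPC properties below both follow the same lazy-initialization idiom: build the expensive object on first access, then return the cached instance. A minimal, generic sketch of that idiom (all names here are illustrative, not part of the real transport):

```python
class LazyTransport:
    """Illustrative stand-in showing the caching pattern, not the real class."""

    def __init__(self):
        self._operations_client = None
        self._stubs = {}      # method name -> cached stub
        self.created = 0      # counts expensive constructions, for demonstration

    def _make_stub(self, name):
        # Stand-in for self.grpc_channel.unary_unary(...)
        self.created += 1
        return f"stub:{name}"

    @property
    def operations_client(self):
        # Quick check: only create a new client if we do not already have one.
        if self._operations_client is None:
            self.created += 1
            self._operations_client = object()
        return self._operations_client

    def stub(self, name):
        # Generate the stub on first use, then reuse it.
        if name not in self._stubs:
            self._stubs[name] = self._make_stub(name)
        return self._stubs[name]


t = LazyTransport()
a = t.operations_client
b = t.operations_client          # same object, no second construction
s1 = t.stub("list_instances")
s2 = t.stub("list_instances")    # cached stub reused
```

Repeated property accesses and stub lookups therefore construct each underlying object exactly once per transport instance.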
+ @property
+ def list_instances(
+ self,
+ ) -> Callable[
+ [cloud_redis.ListInstancesRequest], cloud_redis.ListInstancesResponse
+ ]:
+ r"""Return a callable for the list instances method over gRPC.
+
+ Lists all Redis instances owned by a project in either the
+ specified location (region) or all locations.
+
+ The location should have the following format:
+
+ - ``projects/{project_id}/locations/{location_id}``
+
+ If ``location_id`` is specified as ``-`` (wildcard), then all
+ regions available to the project are queried, and the results
+ are aggregated.
+
+ Returns:
+ Callable[[~.ListInstancesRequest],
+ ~.ListInstancesResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_instances" not in self._stubs:
+ self._stubs["list_instances"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/ListInstances",
+ request_serializer=cloud_redis.ListInstancesRequest.serialize,
+ response_deserializer=cloud_redis.ListInstancesResponse.deserialize,
+ )
+ return self._stubs["list_instances"]
+
+ @property
+ def get_instance(
+ self,
+ ) -> Callable[[cloud_redis.GetInstanceRequest], cloud_redis.Instance]:
+ r"""Return a callable for the get instance method over gRPC.
+
+ Gets the details of a specific Redis instance.
+
+ Returns:
+ Callable[[~.GetInstanceRequest],
+ ~.Instance]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_instance" not in self._stubs:
+ self._stubs["get_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/GetInstance",
+ request_serializer=cloud_redis.GetInstanceRequest.serialize,
+ response_deserializer=cloud_redis.Instance.deserialize,
+ )
+ return self._stubs["get_instance"]
+
+ @property
+ def get_instance_auth_string(
+ self,
+ ) -> Callable[
+ [cloud_redis.GetInstanceAuthStringRequest], cloud_redis.InstanceAuthString
+ ]:
+ r"""Return a callable for the get instance auth string method over gRPC.
+
+ Gets the AUTH string for a Redis instance. If AUTH is
+        not enabled for the instance, the response will be empty.
+ This information is not included in the details returned
+ to GetInstance.
+
+ Returns:
+ Callable[[~.GetInstanceAuthStringRequest],
+ ~.InstanceAuthString]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_instance_auth_string" not in self._stubs:
+ self._stubs["get_instance_auth_string"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/GetInstanceAuthString",
+ request_serializer=cloud_redis.GetInstanceAuthStringRequest.serialize,
+ response_deserializer=cloud_redis.InstanceAuthString.deserialize,
+ )
+ return self._stubs["get_instance_auth_string"]
+
+ @property
+ def create_instance(
+ self,
+ ) -> Callable[[cloud_redis.CreateInstanceRequest], operations_pb2.Operation]:
+ r"""Return a callable for the create instance method over gRPC.
+
+ Creates a Redis instance based on the specified tier and memory
+ size.
+
+ By default, the instance is accessible from the project's
+ `default network <https://cloud.google.com/vpc/docs/vpc>`__.
+
+ The creation is executed asynchronously and callers may check
+ the returned operation to track its progress. Once the operation
+ is completed the Redis instance will be fully functional.
+ Completed longrunning.Operation will contain the new instance
+ object in the response field.
+
+ The returned operation is automatically deleted after a few
+ hours, so there is no need to call DeleteOperation.
+
+ Returns:
+ Callable[[~.CreateInstanceRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_instance" not in self._stubs:
+ self._stubs["create_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/CreateInstance",
+ request_serializer=cloud_redis.CreateInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["create_instance"]
+
+ @property
+ def update_instance(
+ self,
+ ) -> Callable[[cloud_redis.UpdateInstanceRequest], operations_pb2.Operation]:
+ r"""Return a callable for the update instance method over gRPC.
+
+ Updates the metadata and configuration of a specific
+ Redis instance.
+ Completed longrunning.Operation will contain the new
+ instance object in the response field. The returned
+ operation is automatically deleted after a few hours, so
+ there is no need to call DeleteOperation.
+
+ Returns:
+ Callable[[~.UpdateInstanceRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_instance" not in self._stubs:
+ self._stubs["update_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/UpdateInstance",
+ request_serializer=cloud_redis.UpdateInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["update_instance"]
+
+ @property
+ def upgrade_instance(
+ self,
+ ) -> Callable[[cloud_redis.UpgradeInstanceRequest], operations_pb2.Operation]:
+ r"""Return a callable for the upgrade instance method over gRPC.
+
+ Upgrades Redis instance to the newer Redis version
+ specified in the request.
+
+ Returns:
+ Callable[[~.UpgradeInstanceRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "upgrade_instance" not in self._stubs:
+ self._stubs["upgrade_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/UpgradeInstance",
+ request_serializer=cloud_redis.UpgradeInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["upgrade_instance"]
+
+ @property
+ def import_instance(
+ self,
+ ) -> Callable[[cloud_redis.ImportInstanceRequest], operations_pb2.Operation]:
+ r"""Return a callable for the import instance method over gRPC.
+
+ Import a Redis RDB snapshot file from Cloud Storage
+ into a Redis instance.
+ Redis may stop serving during this operation. Instance
+ state will be IMPORTING for entire operation. When
+ complete, the instance will contain only data from the
+ imported file.
+
+ The returned operation is automatically deleted after a
+ few hours, so there is no need to call DeleteOperation.
+
+ Returns:
+ Callable[[~.ImportInstanceRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "import_instance" not in self._stubs:
+ self._stubs["import_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/ImportInstance",
+ request_serializer=cloud_redis.ImportInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["import_instance"]
+
+ @property
+ def export_instance(
+ self,
+ ) -> Callable[[cloud_redis.ExportInstanceRequest], operations_pb2.Operation]:
+ r"""Return a callable for the export instance method over gRPC.
+
+ Export Redis instance data into a Redis RDB format
+ file in Cloud Storage.
+ Redis will continue serving during this operation.
+ The returned operation is automatically deleted after a
+ few hours, so there is no need to call DeleteOperation.
+
+ Returns:
+ Callable[[~.ExportInstanceRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "export_instance" not in self._stubs:
+ self._stubs["export_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/ExportInstance",
+ request_serializer=cloud_redis.ExportInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["export_instance"]
+
+ @property
+ def failover_instance(
+ self,
+ ) -> Callable[[cloud_redis.FailoverInstanceRequest], operations_pb2.Operation]:
+ r"""Return a callable for the failover instance method over gRPC.
+
+ Initiates a failover of the primary node to current
+ replica node for a specific STANDARD tier Cloud
+ Memorystore for Redis instance.
+
+ Returns:
+ Callable[[~.FailoverInstanceRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "failover_instance" not in self._stubs:
+ self._stubs["failover_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/FailoverInstance",
+ request_serializer=cloud_redis.FailoverInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["failover_instance"]
+
+ @property
+ def delete_instance(
+ self,
+ ) -> Callable[[cloud_redis.DeleteInstanceRequest], operations_pb2.Operation]:
+ r"""Return a callable for the delete instance method over gRPC.
+
+ Deletes a specific Redis instance. Instance stops
+ serving and data is deleted.
+
+ Returns:
+ Callable[[~.DeleteInstanceRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_instance" not in self._stubs:
+ self._stubs["delete_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/DeleteInstance",
+ request_serializer=cloud_redis.DeleteInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["delete_instance"]
+
+ @property
+ def reschedule_maintenance(
+ self,
+ ) -> Callable[[cloud_redis.RescheduleMaintenanceRequest], operations_pb2.Operation]:
+ r"""Return a callable for the reschedule maintenance method over gRPC.
+
+ Reschedule maintenance for a given instance in a
+ given project and location.
+
+ Returns:
+ Callable[[~.RescheduleMaintenanceRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "reschedule_maintenance" not in self._stubs:
+ self._stubs["reschedule_maintenance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/RescheduleMaintenance",
+ request_serializer=cloud_redis.RescheduleMaintenanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["reschedule_maintenance"]
+
+ def close(self):
+ self.grpc_channel.close()
+
+ @property
+ def delete_operation(
+ self,
+ ) -> Callable[[operations_pb2.DeleteOperationRequest], None]:
+ r"""Return a callable for the delete_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_operation" not in self._stubs:
+ self._stubs["delete_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/DeleteOperation",
+ request_serializer=operations_pb2.DeleteOperationRequest.SerializeToString,
+ response_deserializer=None,
+ )
+ return self._stubs["delete_operation"]
+
+ @property
+ def cancel_operation(
+ self,
+ ) -> Callable[[operations_pb2.CancelOperationRequest], None]:
+ r"""Return a callable for the cancel_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "cancel_operation" not in self._stubs:
+ self._stubs["cancel_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/CancelOperation",
+ request_serializer=operations_pb2.CancelOperationRequest.SerializeToString,
+ response_deserializer=None,
+ )
+ return self._stubs["cancel_operation"]
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]:
+ r"""Return a callable for the get_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_operation" not in self._stubs:
+ self._stubs["get_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/GetOperation",
+ request_serializer=operations_pb2.GetOperationRequest.SerializeToString,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["get_operation"]
+
+ @property
+ def list_operations(
+ self,
+ ) -> Callable[
+ [operations_pb2.ListOperationsRequest], operations_pb2.ListOperationsResponse
+ ]:
+ r"""Return a callable for the list_operations method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_operations" not in self._stubs:
+ self._stubs["list_operations"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/ListOperations",
+ request_serializer=operations_pb2.ListOperationsRequest.SerializeToString,
+ response_deserializer=operations_pb2.ListOperationsResponse.FromString,
+ )
+ return self._stubs["list_operations"]
+
+ @property
+ def list_locations(
+ self,
+ ) -> Callable[
+ [locations_pb2.ListLocationsRequest], locations_pb2.ListLocationsResponse
+ ]:
+ r"""Return a callable for the list locations method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_locations" not in self._stubs:
+ self._stubs["list_locations"] = self.grpc_channel.unary_unary(
+ "/google.cloud.location.Locations/ListLocations",
+ request_serializer=locations_pb2.ListLocationsRequest.SerializeToString,
+ response_deserializer=locations_pb2.ListLocationsResponse.FromString,
+ )
+ return self._stubs["list_locations"]
+
+ @property
+ def get_location(
+ self,
+ ) -> Callable[[locations_pb2.GetLocationRequest], locations_pb2.Location]:
+ r"""Return a callable for the list locations method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_location" not in self._stubs:
+ self._stubs["get_location"] = self.grpc_channel.unary_unary(
+ "/google.cloud.location.Locations/GetLocation",
+ request_serializer=locations_pb2.GetLocationRequest.SerializeToString,
+ response_deserializer=locations_pb2.Location.FromString,
+ )
+ return self._stubs["get_location"]
+
+ @property
+ def kind(self) -> str:
+ return "grpc"
+
+
+__all__ = ("CloudRedisGrpcTransport",)
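The `__init__` above resolves its channel and SSL settings with a fixed precedence: an explicitly passed `channel` wins and disables credentials; otherwise a (deprecated) `api_mtls_endpoint` overrides the host and sources SSL credentials from `client_cert_source` or application defaults; otherwise `client_cert_source_for_mtls` is consulted only when no `ssl_channel_credentials` were given. A hypothetical, self-contained sketch of that decision logic (function and return keys are illustrative only):

```python
def select_channel_config(host, channel=None, api_mtls_endpoint=None,
                          client_cert_source=None,
                          ssl_channel_credentials=None,
                          client_cert_source_for_mtls=None):
    """Mirror the branch structure of the transport constructor (sketch only)."""
    if channel is not None:
        # An explicit channel short-circuits everything: credentials are
        # ignored and no SSL credentials are kept.
        return {"channel": channel, "credentials": False, "ssl": None, "host": host}
    if api_mtls_endpoint:
        # Deprecated path: the mTLS endpoint replaces the host.
        host = api_mtls_endpoint
        ssl = "from client_cert_source" if client_cert_source else "application default"
        return {"channel": None, "credentials": None, "ssl": ssl, "host": host}
    if client_cert_source_for_mtls and not ssl_channel_credentials:
        ssl = "from client_cert_source_for_mtls"
    else:
        ssl = ssl_channel_credentials
    return {"channel": None, "credentials": None, "ssl": ssl, "host": host}
```

The same precedence applies to both the sync and async transports, which differ only in the channel type (`grpc.Channel` vs. `grpc.experimental.aio.Channel`).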
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/transports/grpc_asyncio.py b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/transports/grpc_asyncio.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/transports/grpc_asyncio.py
@@ -0,0 +1,737 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers_async, operations_v1
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.cloud.location import locations_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+import grpc # type: ignore
+from grpc.experimental import aio # type: ignore
+
+from google.cloud.redis_v1.types import cloud_redis
+
+from .base import DEFAULT_CLIENT_INFO, CloudRedisTransport
+from .grpc import CloudRedisGrpcTransport
+
+
+class CloudRedisGrpcAsyncIOTransport(CloudRedisTransport):
+ """gRPC AsyncIO backend transport for CloudRedis.
+
+ Configures and manages Cloud Memorystore for Redis instances
+
+ Google Cloud Memorystore for Redis v1
+
+ The ``redis.googleapis.com`` service implements the Google Cloud
+ Memorystore for Redis API and defines the following resource model
+ for managing Redis instances:
+
+ - The service works with a collection of cloud projects, named:
+ ``/projects/*``
+ - Each project has a collection of available locations, named:
+ ``/locations/*``
+ - Each location has a collection of Redis instances, named:
+ ``/instances/*``
+ - As such, Redis instances are resources of the form:
+ ``/projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+
+    Note that location_id must refer to a GCP ``region``; for
+ example:
+
+ - ``projects/redpepper-1290/locations/us-central1/instances/my-redis``
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _grpc_channel: aio.Channel
+ _stubs: Dict[str, Callable] = {}
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "redis.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> aio.Channel:
+ """Create and return a gRPC AsyncIO channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ aio.Channel: A gRPC AsyncIO channel object.
+ """
+
+ return grpc_helpers_async.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ def __init__(
+ self,
+ *,
+ host: str = "redis.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[aio.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ channel (Optional[aio.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+            google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+ self._operations_client: Optional[operations_v1.OperationsAsyncClient] = None
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def grpc_channel(self) -> aio.Channel:
+ """Create the channel designed to connect to this service.
+
+ This property caches on the instance; repeated calls return
+ the same channel.
+ """
+ # Return the channel from cache.
+ return self._grpc_channel
+
+ @property
+ def operations_client(self) -> operations_v1.OperationsAsyncClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Quick check: Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ self._operations_client = operations_v1.OperationsAsyncClient(
+ self.grpc_channel
+ )
+
+ # Return the client from cache.
+ return self._operations_client
+
+ @property
+ def list_instances(
+ self,
+ ) -> Callable[
+ [cloud_redis.ListInstancesRequest], Awaitable[cloud_redis.ListInstancesResponse]
+ ]:
+ r"""Return a callable for the list instances method over gRPC.
+
+ Lists all Redis instances owned by a project in either the
+ specified location (region) or all locations.
+
+ The location should have the following format:
+
+ - ``projects/{project_id}/locations/{location_id}``
+
+ If ``location_id`` is specified as ``-`` (wildcard), then all
+ regions available to the project are queried, and the results
+ are aggregated.
+
+ Returns:
+ Callable[[~.ListInstancesRequest],
+ Awaitable[~.ListInstancesResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_instances" not in self._stubs:
+ self._stubs["list_instances"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/ListInstances",
+ request_serializer=cloud_redis.ListInstancesRequest.serialize,
+ response_deserializer=cloud_redis.ListInstancesResponse.deserialize,
+ )
+ return self._stubs["list_instances"]
+
+ @property
+ def get_instance(
+ self,
+ ) -> Callable[[cloud_redis.GetInstanceRequest], Awaitable[cloud_redis.Instance]]:
+ r"""Return a callable for the get instance method over gRPC.
+
+ Gets the details of a specific Redis instance.
+
+ Returns:
+ Callable[[~.GetInstanceRequest],
+ Awaitable[~.Instance]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_instance" not in self._stubs:
+ self._stubs["get_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/GetInstance",
+ request_serializer=cloud_redis.GetInstanceRequest.serialize,
+ response_deserializer=cloud_redis.Instance.deserialize,
+ )
+ return self._stubs["get_instance"]
+
+ @property
+ def get_instance_auth_string(
+ self,
+ ) -> Callable[
+ [cloud_redis.GetInstanceAuthStringRequest],
+ Awaitable[cloud_redis.InstanceAuthString],
+ ]:
+ r"""Return a callable for the get instance auth string method over gRPC.
+
+ Gets the AUTH string for a Redis instance. If AUTH is
+        not enabled for the instance, the response will be empty.
+ This information is not included in the details returned
+ to GetInstance.
+
+ Returns:
+ Callable[[~.GetInstanceAuthStringRequest],
+ Awaitable[~.InstanceAuthString]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_instance_auth_string" not in self._stubs:
+ self._stubs["get_instance_auth_string"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/GetInstanceAuthString",
+ request_serializer=cloud_redis.GetInstanceAuthStringRequest.serialize,
+ response_deserializer=cloud_redis.InstanceAuthString.deserialize,
+ )
+ return self._stubs["get_instance_auth_string"]
+
+ @property
+ def create_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.CreateInstanceRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the create instance method over gRPC.
+
+ Creates a Redis instance based on the specified tier and memory
+ size.
+
+ By default, the instance is accessible from the project's
+ `default network <https://cloud.google.com/vpc/docs/vpc>`__.
+
+ The creation is executed asynchronously and callers may check
+ the returned operation to track its progress. Once the operation
+ is completed the Redis instance will be fully functional.
+ Completed longrunning.Operation will contain the new instance
+ object in the response field.
+
+ The returned operation is automatically deleted after a few
+ hours, so there is no need to call DeleteOperation.
+
+ Returns:
+ Callable[[~.CreateInstanceRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_instance" not in self._stubs:
+ self._stubs["create_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/CreateInstance",
+ request_serializer=cloud_redis.CreateInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["create_instance"]
+
+ @property
+ def update_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.UpdateInstanceRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the update instance method over gRPC.
+
+ Updates the metadata and configuration of a specific
+ Redis instance.
+ Completed longrunning.Operation will contain the new
+ instance object in the response field. The returned
+ operation is automatically deleted after a few hours, so
+ there is no need to call DeleteOperation.
+
+ Returns:
+ Callable[[~.UpdateInstanceRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_instance" not in self._stubs:
+ self._stubs["update_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/UpdateInstance",
+ request_serializer=cloud_redis.UpdateInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["update_instance"]
+
+ @property
+ def upgrade_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.UpgradeInstanceRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the upgrade instance method over gRPC.
+
+ Upgrades Redis instance to the newer Redis version
+ specified in the request.
+
+ Returns:
+ Callable[[~.UpgradeInstanceRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "upgrade_instance" not in self._stubs:
+ self._stubs["upgrade_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/UpgradeInstance",
+ request_serializer=cloud_redis.UpgradeInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["upgrade_instance"]
+
+ @property
+ def import_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.ImportInstanceRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the import instance method over gRPC.
+
+ Import a Redis RDB snapshot file from Cloud Storage
+ into a Redis instance.
+ Redis may stop serving during this operation. Instance
+        state will be IMPORTING for the entire operation. When
+ complete, the instance will contain only data from the
+ imported file.
+
+ The returned operation is automatically deleted after a
+ few hours, so there is no need to call DeleteOperation.
+
+ Returns:
+ Callable[[~.ImportInstanceRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "import_instance" not in self._stubs:
+ self._stubs["import_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/ImportInstance",
+ request_serializer=cloud_redis.ImportInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["import_instance"]
+
+ @property
+ def export_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.ExportInstanceRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the export instance method over gRPC.
+
+ Export Redis instance data into a Redis RDB format
+ file in Cloud Storage.
+ Redis will continue serving during this operation.
+ The returned operation is automatically deleted after a
+ few hours, so there is no need to call DeleteOperation.
+
+ Returns:
+ Callable[[~.ExportInstanceRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "export_instance" not in self._stubs:
+ self._stubs["export_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/ExportInstance",
+ request_serializer=cloud_redis.ExportInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["export_instance"]
+
+ @property
+ def failover_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.FailoverInstanceRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the failover instance method over gRPC.
+
+        Initiates a failover of the primary node to the
+        current replica node for a specific STANDARD tier Cloud
+ Memorystore for Redis instance.
+
+ Returns:
+ Callable[[~.FailoverInstanceRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "failover_instance" not in self._stubs:
+ self._stubs["failover_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/FailoverInstance",
+ request_serializer=cloud_redis.FailoverInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["failover_instance"]
+
+ @property
+ def delete_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.DeleteInstanceRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the delete instance method over gRPC.
+
+ Deletes a specific Redis instance. Instance stops
+ serving and data is deleted.
+
+ Returns:
+ Callable[[~.DeleteInstanceRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_instance" not in self._stubs:
+ self._stubs["delete_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/DeleteInstance",
+ request_serializer=cloud_redis.DeleteInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["delete_instance"]
+
+ @property
+ def reschedule_maintenance(
+ self,
+ ) -> Callable[
+ [cloud_redis.RescheduleMaintenanceRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the reschedule maintenance method over gRPC.
+
+ Reschedule maintenance for a given instance in a
+ given project and location.
+
+ Returns:
+ Callable[[~.RescheduleMaintenanceRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "reschedule_maintenance" not in self._stubs:
+ self._stubs["reschedule_maintenance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1.CloudRedis/RescheduleMaintenance",
+ request_serializer=cloud_redis.RescheduleMaintenanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["reschedule_maintenance"]
+
+ def close(self):
+ return self.grpc_channel.close()
+
+ @property
+ def delete_operation(
+ self,
+ ) -> Callable[[operations_pb2.DeleteOperationRequest], None]:
+ r"""Return a callable for the delete_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_operation" not in self._stubs:
+ self._stubs["delete_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/DeleteOperation",
+ request_serializer=operations_pb2.DeleteOperationRequest.SerializeToString,
+ response_deserializer=None,
+ )
+ return self._stubs["delete_operation"]
+
+ @property
+ def cancel_operation(
+ self,
+ ) -> Callable[[operations_pb2.CancelOperationRequest], None]:
+ r"""Return a callable for the cancel_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "cancel_operation" not in self._stubs:
+ self._stubs["cancel_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/CancelOperation",
+ request_serializer=operations_pb2.CancelOperationRequest.SerializeToString,
+ response_deserializer=None,
+ )
+ return self._stubs["cancel_operation"]
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]:
+ r"""Return a callable for the get_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_operation" not in self._stubs:
+ self._stubs["get_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/GetOperation",
+ request_serializer=operations_pb2.GetOperationRequest.SerializeToString,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["get_operation"]
+
+ @property
+ def list_operations(
+ self,
+ ) -> Callable[
+ [operations_pb2.ListOperationsRequest], operations_pb2.ListOperationsResponse
+ ]:
+ r"""Return a callable for the list_operations method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_operations" not in self._stubs:
+ self._stubs["list_operations"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/ListOperations",
+ request_serializer=operations_pb2.ListOperationsRequest.SerializeToString,
+ response_deserializer=operations_pb2.ListOperationsResponse.FromString,
+ )
+ return self._stubs["list_operations"]
+
+ @property
+ def list_locations(
+ self,
+ ) -> Callable[
+ [locations_pb2.ListLocationsRequest], locations_pb2.ListLocationsResponse
+ ]:
+ r"""Return a callable for the list locations method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_locations" not in self._stubs:
+ self._stubs["list_locations"] = self.grpc_channel.unary_unary(
+ "/google.cloud.location.Locations/ListLocations",
+ request_serializer=locations_pb2.ListLocationsRequest.SerializeToString,
+ response_deserializer=locations_pb2.ListLocationsResponse.FromString,
+ )
+ return self._stubs["list_locations"]
+
+ @property
+ def get_location(
+ self,
+ ) -> Callable[[locations_pb2.GetLocationRequest], locations_pb2.Location]:
+ r"""Return a callable for the list locations method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_location" not in self._stubs:
+ self._stubs["get_location"] = self.grpc_channel.unary_unary(
+ "/google.cloud.location.Locations/GetLocation",
+ request_serializer=locations_pb2.GetLocationRequest.SerializeToString,
+ response_deserializer=locations_pb2.Location.FromString,
+ )
+ return self._stubs["get_location"]
+
+
+__all__ = ("CloudRedisGrpcAsyncIOTransport",)
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/transports/rest.py b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/transports/rest.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1/services/cloud_redis/transports/rest.py
@@ -0,0 +1,2261 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import dataclasses
+import json # type: ignore
+import re
+from typing import Any, Callable, Dict, List, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import (
+ gapic_v1,
+ operations_v1,
+ path_template,
+ rest_helpers,
+ rest_streaming,
+)
+from google.api_core import exceptions as core_exceptions
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.auth.transport.requests import AuthorizedSession # type: ignore
+from google.cloud.location import locations_pb2 # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import json_format
+import grpc # type: ignore
+from requests import __version__ as requests_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+
+from google.longrunning import operations_pb2 # type: ignore
+
+from google.cloud.redis_v1.types import cloud_redis
+
+from .base import CloudRedisTransport
+from .base import DEFAULT_CLIENT_INFO as BASE_DEFAULT_CLIENT_INFO
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=BASE_DEFAULT_CLIENT_INFO.gapic_version,
+ grpc_version=None,
+ rest_version=requests_version,
+)
+
+
+class CloudRedisRestInterceptor:
+ """Interceptor for CloudRedis.
+
+ Interceptors are used to manipulate requests, request metadata, and responses
+ in arbitrary ways.
+ Example use cases include:
+ * Logging
+ * Verifying requests according to service or custom semantics
+ * Stripping extraneous information from responses
+
+ These use cases and more can be enabled by injecting an
+ instance of a custom subclass when constructing the CloudRedisRestTransport.
+
+ .. code-block:: python
+ class MyCustomCloudRedisInterceptor(CloudRedisRestInterceptor):
+ def pre_create_instance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_create_instance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_delete_instance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_delete_instance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_export_instance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_export_instance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_failover_instance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_failover_instance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_instance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_instance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_instance_auth_string(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_instance_auth_string(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_import_instance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_import_instance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_instances(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_instances(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_reschedule_maintenance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_reschedule_maintenance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_update_instance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_update_instance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_upgrade_instance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_upgrade_instance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ transport = CloudRedisRestTransport(interceptor=MyCustomCloudRedisInterceptor())
+ client = CloudRedisClient(transport=transport)
+
+
+ """
+
+ def pre_create_instance(
+ self,
+ request: cloud_redis.CreateInstanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.CreateInstanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for create_instance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_create_instance(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for create_instance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_delete_instance(
+ self,
+ request: cloud_redis.DeleteInstanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.DeleteInstanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for delete_instance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_delete_instance(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for delete_instance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_export_instance(
+ self,
+ request: cloud_redis.ExportInstanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.ExportInstanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for export_instance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_export_instance(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for export_instance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_failover_instance(
+ self,
+ request: cloud_redis.FailoverInstanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.FailoverInstanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for failover_instance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_failover_instance(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for failover_instance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_instance(
+ self,
+ request: cloud_redis.GetInstanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.GetInstanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_instance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_get_instance(self, response: cloud_redis.Instance) -> cloud_redis.Instance:
+ """Post-rpc interceptor for get_instance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_instance_auth_string(
+ self,
+ request: cloud_redis.GetInstanceAuthStringRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.GetInstanceAuthStringRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_instance_auth_string
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_get_instance_auth_string(
+ self, response: cloud_redis.InstanceAuthString
+ ) -> cloud_redis.InstanceAuthString:
+ """Post-rpc interceptor for get_instance_auth_string
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_import_instance(
+ self,
+ request: cloud_redis.ImportInstanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.ImportInstanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for import_instance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_import_instance(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for import_instance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_instances(
+ self,
+ request: cloud_redis.ListInstancesRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.ListInstancesRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_instances
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_list_instances(
+ self, response: cloud_redis.ListInstancesResponse
+ ) -> cloud_redis.ListInstancesResponse:
+ """Post-rpc interceptor for list_instances
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_reschedule_maintenance(
+ self,
+ request: cloud_redis.RescheduleMaintenanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.RescheduleMaintenanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for reschedule_maintenance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_reschedule_maintenance(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for reschedule_maintenance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_update_instance(
+ self,
+ request: cloud_redis.UpdateInstanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.UpdateInstanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for update_instance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_update_instance(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for update_instance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_upgrade_instance(
+ self,
+ request: cloud_redis.UpgradeInstanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.UpgradeInstanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for upgrade_instance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_upgrade_instance(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for upgrade_instance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_location(
+ self,
+ request: locations_pb2.GetLocationRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[locations_pb2.GetLocationRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_location
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_get_location(
+ self, response: locations_pb2.Location
+ ) -> locations_pb2.Location:
+ """Post-rpc interceptor for get_location
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_locations(
+ self,
+ request: locations_pb2.ListLocationsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[locations_pb2.ListLocationsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_locations
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_list_locations(
+ self, response: locations_pb2.ListLocationsResponse
+ ) -> locations_pb2.ListLocationsResponse:
+ """Post-rpc interceptor for list_locations
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_cancel_operation(
+ self,
+ request: operations_pb2.CancelOperationRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[operations_pb2.CancelOperationRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for cancel_operation
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_cancel_operation(self, response: None) -> None:
+ """Post-rpc interceptor for cancel_operation
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_delete_operation(
+ self,
+ request: operations_pb2.DeleteOperationRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[operations_pb2.DeleteOperationRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for delete_operation
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_delete_operation(self, response: None) -> None:
+ """Post-rpc interceptor for delete_operation
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_operation(
+ self,
+ request: operations_pb2.GetOperationRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[operations_pb2.GetOperationRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_operation
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_get_operation(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for get_operation
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_operations(
+ self,
+ request: operations_pb2.ListOperationsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[operations_pb2.ListOperationsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_operations
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_list_operations(
+ self, response: operations_pb2.ListOperationsResponse
+ ) -> operations_pb2.ListOperationsResponse:
+ """Post-rpc interceptor for list_operations
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+
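The pre/post hook shape defined by the interceptor methods above can be exercised without the real library. Below is a minimal, self-contained sketch of the same pattern using plain dicts as stand-ins for the request and response protos; `LoggingInterceptor`, `pre_call`, `post_call`, and `call_with_interceptor` are hypothetical names for illustration, not part of the generated API.

```python
# Minimal sketch of the pre/post interceptor pattern, with plain stand-in
# types instead of the real request/response protos (hypothetical names).
from typing import Any, Dict, List, Sequence, Tuple


class LoggingInterceptor:
    """Collects request/response pairs as they pass through the hooks."""

    def __init__(self) -> None:
        self.log: List[Tuple[str, Any]] = []

    def pre_call(
        self, request: Dict[str, Any], metadata: Sequence[Tuple[str, str]]
    ) -> Tuple[Dict[str, Any], Sequence[Tuple[str, str]]]:
        # Runs before the request is sent; may rewrite request/metadata.
        self.log.append(("pre", request))
        metadata = list(metadata) + [("x-trace", "1")]
        return request, metadata

    def post_call(self, response: Dict[str, Any]) -> Dict[str, Any]:
        # Runs after the server responds, before user code sees the result.
        self.log.append(("post", response))
        return response


def call_with_interceptor(interceptor, request, metadata):
    request, metadata = interceptor.pre_call(request, metadata)
    response = {"echo": request}  # stand-in for the HTTP round trip
    return interceptor.post_call(response), metadata


interceptor = LoggingInterceptor()
resp, meta = call_with_interceptor(interceptor, {"name": "my-redis"}, [])
```

A subclass of `CloudRedisRestInterceptor` that overrides only the hooks it cares about follows the same flow: every `pre_*` must return `(request, metadata)` and every `post_*` must return the response, or the call chain breaks.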
+@dataclasses.dataclass
+class CloudRedisRestStub:
+ _session: AuthorizedSession
+ _host: str
+ _interceptor: CloudRedisRestInterceptor
+
+
+class CloudRedisRestTransport(CloudRedisTransport):
+ """REST backend transport for CloudRedis.
+
+ Configures and manages Cloud Memorystore for Redis instances
+
+ Google Cloud Memorystore for Redis v1
+
+ The ``redis.googleapis.com`` service implements the Google Cloud
+ Memorystore for Redis API and defines the following resource model
+ for managing Redis instances:
+
+ - The service works with a collection of cloud projects, named:
+ ``/projects/*``
+ - Each project has a collection of available locations, named:
+ ``/locations/*``
+ - Each location has a collection of Redis instances, named:
+ ``/instances/*``
+ - As such, Redis instances are resources of the form:
+ ``/projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+
+    Note that location_id must refer to a GCP ``region``; for
+    example:
+
+ - ``projects/redpepper-1290/locations/us-central1/instances/my-redis``
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends JSON representations of protocol buffers over HTTP/1.1
+
+ """
+
+ def __init__(
+ self,
+ *,
+ host: str = "redis.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ url_scheme: str = "https",
+ interceptor: Optional[CloudRedisRestInterceptor] = None,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Callable[[], Tuple[bytes, bytes]]): Client
+ certificate to configure mutual TLS HTTP channel. It is ignored
+ if ``channel`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you are developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+            url_scheme: the protocol scheme for the API endpoint. Normally
+                    "https", but for testing or local servers,
+                    "http" can be specified.
+            interceptor (Optional[CloudRedisRestInterceptor]): Hooks applied
+                    before each request is sent and after each response is
+                    received; defaults to a pass-through interceptor.
+            api_audience (Optional[str]): Overrides the audience claim used
+                    when creating self-signed JWT credentials.
+        """
+ # Run the base constructor
+ # TODO(yon-mg): resolve other ctor params i.e. scopes, quota, etc.
+ # TODO: When custom host (api_endpoint) is set, `scopes` must *also* be set on the
+ # credentials object
+ maybe_url_match = re.match("^(?P<scheme>http(?:s)?://)?(?P<host>.*)$", host)
+ if maybe_url_match is None:
+ raise ValueError(
+ f"Unexpected hostname structure: {host}"
+ ) # pragma: NO COVER
+
+ url_match_items = maybe_url_match.groupdict()
+
+ host = f"{url_scheme}://{host}" if not url_match_items["scheme"] else host
+
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+ self._session = AuthorizedSession(
+ self._credentials, default_host=self.DEFAULT_HOST
+ )
+ self._operations_client: Optional[operations_v1.AbstractOperationsClient] = None
+ if client_cert_source_for_mtls:
+ self._session.configure_mtls_channel(client_cert_source_for_mtls)
+ self._interceptor = interceptor or CloudRedisRestInterceptor()
+ self._prep_wrapped_messages(client_info)
+
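The host normalization at the top of `__init__` accepts either a bare hostname or a full URL and produces a scheme-qualified endpoint. This sketch isolates the same regex logic; `normalize_host` is a hypothetical helper name, not part of the transport.

```python
# Reproduces the transport's host-normalization step in isolation:
# bare hostnames get the url_scheme prepended, full URLs pass through.
import re


def normalize_host(host: str, url_scheme: str = "https") -> str:
    maybe_url_match = re.match("^(?P<scheme>http(?:s)?://)?(?P<host>.*)$", host)
    if maybe_url_match is None:
        raise ValueError(f"Unexpected hostname structure: {host}")
    url_match_items = maybe_url_match.groupdict()
    return f"{url_scheme}://{host}" if not url_match_items["scheme"] else host


default_host = normalize_host("redis.googleapis.com")
local_host = normalize_host("http://localhost:8080")
```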
+ @property
+ def operations_client(self) -> operations_v1.AbstractOperationsClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ http_options: Dict[str, List[Dict[str, str]]] = {
+ "google.longrunning.Operations.CancelOperation": [
+ {
+ "method": "post",
+ "uri": "/v1/{name=projects/*/locations/*/operations/*}:cancel",
+ },
+ ],
+ "google.longrunning.Operations.DeleteOperation": [
+ {
+ "method": "delete",
+ "uri": "/v1/{name=projects/*/locations/*/operations/*}",
+ },
+ ],
+ "google.longrunning.Operations.GetOperation": [
+ {
+ "method": "get",
+ "uri": "/v1/{name=projects/*/locations/*/operations/*}",
+ },
+ ],
+ "google.longrunning.Operations.ListOperations": [
+ {
+ "method": "get",
+ "uri": "/v1/{name=projects/*/locations/*}/operations",
+ },
+ ],
+ }
+
+ rest_transport = operations_v1.OperationsRestTransport(
+ host=self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ scopes=self._scopes,
+ http_options=http_options,
+ path_prefix="v1",
+ )
+
+ self._operations_client = operations_v1.AbstractOperationsClient(
+ transport=rest_transport
+ )
+
+ # Return the client from cache.
+ return self._operations_client
+
+ class _CreateInstance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("CreateInstance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "instanceId": "",
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.CreateInstanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the create instance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.CreateInstanceRequest):
+ The request object. Request for
+ [CreateInstance][google.cloud.redis.v1.CloudRedis.CreateInstance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1/{parent=projects/*/locations/*}/instances",
+ "body": "instance",
+ },
+ ]
+ request, metadata = self._interceptor.pre_create_instance(request, metadata)
+ pb_request = cloud_redis.CreateInstanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_create_instance(resp)
+ return resp
+
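The interplay between `_get_unset_required_fields` and the `query_params.update(...)` step above can be seen with plain dicts: required fields absent from the transcoded request are back-filled with their proto default so they always appear in the query string. A minimal sketch (module-level names here are hypothetical):

```python
# Mirrors _CreateInstance's required-field back-fill with plain dicts.
REQUIRED_FIELDS_DEFAULT_VALUES = {"instanceId": ""}


def get_unset_required_fields(message_dict):
    # Keep only the defaults whose keys the request did not supply.
    return {
        k: v
        for k, v in REQUIRED_FIELDS_DEFAULT_VALUES.items()
        if k not in message_dict
    }


query_params = {"parent": "projects/p/locations/l"}
query_params.update(get_unset_required_fields(query_params))
```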
+ class _DeleteInstance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("DeleteInstance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.DeleteInstanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the delete instance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.DeleteInstanceRequest):
+ The request object. Request for
+ [DeleteInstance][google.cloud.redis.v1.CloudRedis.DeleteInstance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "delete",
+ "uri": "/v1/{name=projects/*/locations/*/instances/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_delete_instance(request, metadata)
+ pb_request = cloud_redis.DeleteInstanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_delete_instance(resp)
+ return resp
+
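`path_template.transcode` (from `google.api_core`) matches the request against each http rule and expands the URI template. Below is a much-simplified, single-rule sketch of that expansion, where each `*` binds one path segment; `transcode_one` is a hypothetical helper, and the real function additionally validates against multiple rules and splits out body and query params.

```python
# Simplified single-rule URI-template expansion, in the spirit of
# google.api_core.path_template.transcode (not the real implementation).
import re


def transcode_one(http_option, **fields):
    uri_template = http_option["uri"]

    def substitute(match):
        name, pattern = match.group(1), match.group(2)
        value = fields[name]
        # Each * in the pattern must match exactly one path segment.
        regex = "^" + pattern.replace("*", "[^/]+") + "$"
        if not re.match(regex, value):
            raise ValueError(f"{name}={value!r} does not match {pattern!r}")
        return value

    uri = re.sub(r"\{(\w+)=([^}]+)\}", substitute, uri_template)
    return {"method": http_option["method"], "uri": uri}


rule = {"method": "delete", "uri": "/v1/{name=projects/*/locations/*/instances/*}"}
result = transcode_one(rule, name="projects/p/locations/l/instances/i")
```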
+ class _ExportInstance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("ExportInstance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.ExportInstanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the export instance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.ExportInstanceRequest):
+ The request object. Request for
+ [Export][google.cloud.redis.v1.CloudRedis.ExportInstance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1/{name=projects/*/locations/*/instances/*}:export",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_export_instance(request, metadata)
+ pb_request = cloud_redis.ExportInstanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_export_instance(resp)
+ return resp
+
+ class _FailoverInstance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("FailoverInstance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.FailoverInstanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the failover instance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.FailoverInstanceRequest):
+ The request object. Request for
+ [Failover][google.cloud.redis.v1.CloudRedis.FailoverInstance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1/{name=projects/*/locations/*/instances/*}:failover",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_failover_instance(
+ request, metadata
+ )
+ pb_request = cloud_redis.FailoverInstanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_failover_instance(resp)
+ return resp
+
+ class _GetInstance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("GetInstance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.GetInstanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> cloud_redis.Instance:
+ r"""Call the get instance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.GetInstanceRequest):
+ The request object. Request for
+ [GetInstance][google.cloud.redis.v1.CloudRedis.GetInstance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.cloud_redis.Instance:
+ A Memorystore for Redis instance.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1/{name=projects/*/locations/*/instances/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_instance(request, metadata)
+ pb_request = cloud_redis.GetInstanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = cloud_redis.Instance()
+ pb_resp = cloud_redis.Instance.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_instance(resp)
+ return resp
+
+ class _GetInstanceAuthString(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("GetInstanceAuthString")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.GetInstanceAuthStringRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> cloud_redis.InstanceAuthString:
+ r"""Call the get instance auth string method over HTTP.
+
+ Args:
+ request (~.cloud_redis.GetInstanceAuthStringRequest):
+ The request object. Request for
+ [GetInstanceAuthString][google.cloud.redis.v1.CloudRedis.GetInstanceAuthString].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.cloud_redis.InstanceAuthString:
+ Instance AUTH string details.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1/{name=projects/*/locations/*/instances/*}/authString",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_instance_auth_string(
+ request, metadata
+ )
+ pb_request = cloud_redis.GetInstanceAuthStringRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = cloud_redis.InstanceAuthString()
+ pb_resp = cloud_redis.InstanceAuthString.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_instance_auth_string(resp)
+ return resp
+
+ class _ImportInstance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("ImportInstance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.ImportInstanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the import instance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.ImportInstanceRequest):
+ The request object. Request for
+ [Import][google.cloud.redis.v1.CloudRedis.ImportInstance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1/{name=projects/*/locations/*/instances/*}:import",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_import_instance(request, metadata)
+ pb_request = cloud_redis.ImportInstanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_import_instance(resp)
+ return resp
+
+ class _ListInstances(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("ListInstances")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.ListInstancesRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> cloud_redis.ListInstancesResponse:
+ r"""Call the list instances method over HTTP.
+
+ Args:
+ request (~.cloud_redis.ListInstancesRequest):
+ The request object. Request for
+ [ListInstances][google.cloud.redis.v1.CloudRedis.ListInstances].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.cloud_redis.ListInstancesResponse:
+ Response for
+ [ListInstances][google.cloud.redis.v1.CloudRedis.ListInstances].
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1/{parent=projects/*/locations/*}/instances",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_instances(request, metadata)
+ pb_request = cloud_redis.ListInstancesRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = cloud_redis.ListInstancesResponse()
+ pb_resp = cloud_redis.ListInstancesResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_instances(resp)
+ return resp
+
+ class _RescheduleMaintenance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("RescheduleMaintenance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.RescheduleMaintenanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the reschedule maintenance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.RescheduleMaintenanceRequest):
+ The request object. Request for
+ [RescheduleMaintenance][google.cloud.redis.v1.CloudRedis.RescheduleMaintenance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1/{name=projects/*/locations/*/instances/*}:rescheduleMaintenance",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_reschedule_maintenance(
+ request, metadata
+ )
+ pb_request = cloud_redis.RescheduleMaintenanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_reschedule_maintenance(resp)
+ return resp
+
+ class _UpdateInstance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("UpdateInstance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "updateMask": {},
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.UpdateInstanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the update instance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.UpdateInstanceRequest):
+ The request object. Request for
+ [UpdateInstance][google.cloud.redis.v1.CloudRedis.UpdateInstance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "patch",
+ "uri": "/v1/{instance.name=projects/*/locations/*/instances/*}",
+ "body": "instance",
+ },
+ ]
+ request, metadata = self._interceptor.pre_update_instance(request, metadata)
+ pb_request = cloud_redis.UpdateInstanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_update_instance(resp)
+ return resp
+
+ class _UpgradeInstance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("UpgradeInstance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.UpgradeInstanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the upgrade instance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.UpgradeInstanceRequest):
+ The request object. Request for
+ [UpgradeInstance][google.cloud.redis.v1.CloudRedis.UpgradeInstance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1/{name=projects/*/locations/*/instances/*}:upgrade",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_upgrade_instance(
+ request, metadata
+ )
+ pb_request = cloud_redis.UpgradeInstanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_upgrade_instance(resp)
+ return resp
+
+ @property
+ def create_instance(
+ self,
+ ) -> Callable[[cloud_redis.CreateInstanceRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._CreateInstance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def delete_instance(
+ self,
+ ) -> Callable[[cloud_redis.DeleteInstanceRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._DeleteInstance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def export_instance(
+ self,
+ ) -> Callable[[cloud_redis.ExportInstanceRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ExportInstance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def failover_instance(
+ self,
+ ) -> Callable[[cloud_redis.FailoverInstanceRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._FailoverInstance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_instance(
+ self,
+ ) -> Callable[[cloud_redis.GetInstanceRequest], cloud_redis.Instance]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetInstance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_instance_auth_string(
+ self,
+ ) -> Callable[
+ [cloud_redis.GetInstanceAuthStringRequest], cloud_redis.InstanceAuthString
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetInstanceAuthString(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def import_instance(
+ self,
+ ) -> Callable[[cloud_redis.ImportInstanceRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ImportInstance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_instances(
+ self,
+ ) -> Callable[
+ [cloud_redis.ListInstancesRequest], cloud_redis.ListInstancesResponse
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListInstances(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def reschedule_maintenance(
+ self,
+ ) -> Callable[[cloud_redis.RescheduleMaintenanceRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._RescheduleMaintenance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def update_instance(
+ self,
+ ) -> Callable[[cloud_redis.UpdateInstanceRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._UpdateInstance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def upgrade_instance(
+ self,
+ ) -> Callable[[cloud_redis.UpgradeInstanceRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._UpgradeInstance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_location(self):
+ return self._GetLocation(self._session, self._host, self._interceptor) # type: ignore
+
+ class _GetLocation(CloudRedisRestStub):
+ def __call__(
+ self,
+ request: locations_pb2.GetLocationRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> locations_pb2.Location:
+
+ r"""Call the get location method over HTTP.
+
+ Args:
+ request (locations_pb2.GetLocationRequest):
+ The request object for GetLocation method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ locations_pb2.Location: Response from GetLocation method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1/{name=projects/*/locations/*}",
+ },
+ ]
+
+ request, metadata = self._interceptor.pre_get_location(request, metadata)
+ request_kwargs = json_format.MessageToDict(request)
+ transcoded_request = path_template.transcode(http_options, **request_kwargs)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(json.dumps(transcoded_request["query_params"]))
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ resp = locations_pb2.Location()
+ resp = json_format.Parse(response.content.decode("utf-8"), resp)
+ resp = self._interceptor.post_get_location(resp)
+ return resp
+
+ @property
+ def list_locations(self):
+ return self._ListLocations(self._session, self._host, self._interceptor) # type: ignore
+
+ class _ListLocations(CloudRedisRestStub):
+ def __call__(
+ self,
+ request: locations_pb2.ListLocationsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> locations_pb2.ListLocationsResponse:
+
+ r"""Call the list locations method over HTTP.
+
+ Args:
+ request (locations_pb2.ListLocationsRequest):
+ The request object for ListLocations method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ locations_pb2.ListLocationsResponse: Response from ListLocations method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1/{name=projects/*}/locations",
+ },
+ ]
+
+ request, metadata = self._interceptor.pre_list_locations(request, metadata)
+ request_kwargs = json_format.MessageToDict(request)
+ transcoded_request = path_template.transcode(http_options, **request_kwargs)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(json.dumps(transcoded_request["query_params"]))
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ resp = locations_pb2.ListLocationsResponse()
+ resp = json_format.Parse(response.content.decode("utf-8"), resp)
+ resp = self._interceptor.post_list_locations(resp)
+ return resp
+
+ @property
+ def cancel_operation(self):
+ return self._CancelOperation(self._session, self._host, self._interceptor) # type: ignore
+
+ class _CancelOperation(CloudRedisRestStub):
+ def __call__(
+ self,
+ request: operations_pb2.CancelOperationRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> None:
+
+ r"""Call the cancel operation method over HTTP.
+
+ Args:
+ request (operations_pb2.CancelOperationRequest):
+ The request object for CancelOperation method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1/{name=projects/*/locations/*/operations/*}:cancel",
+ },
+ ]
+
+ request, metadata = self._interceptor.pre_cancel_operation(
+ request, metadata
+ )
+ request_kwargs = json_format.MessageToDict(request)
+ transcoded_request = path_template.transcode(http_options, **request_kwargs)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(json.dumps(transcoded_request["query_params"]))
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ return self._interceptor.post_cancel_operation(None)
+
+ @property
+ def delete_operation(self):
+ return self._DeleteOperation(self._session, self._host, self._interceptor) # type: ignore
+
+ class _DeleteOperation(CloudRedisRestStub):
+ def __call__(
+ self,
+ request: operations_pb2.DeleteOperationRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> None:
+
+ r"""Call the delete operation method over HTTP.
+
+ Args:
+ request (operations_pb2.DeleteOperationRequest):
+ The request object for DeleteOperation method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "delete",
+ "uri": "/v1/{name=projects/*/locations/*/operations/*}",
+ },
+ ]
+
+ request, metadata = self._interceptor.pre_delete_operation(
+ request, metadata
+ )
+ request_kwargs = json_format.MessageToDict(request)
+ transcoded_request = path_template.transcode(http_options, **request_kwargs)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(json.dumps(transcoded_request["query_params"]))
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ return self._interceptor.post_delete_operation(None)
+
+ @property
+ def get_operation(self):
+ return self._GetOperation(self._session, self._host, self._interceptor) # type: ignore
+
+ class _GetOperation(CloudRedisRestStub):
+ def __call__(
+ self,
+ request: operations_pb2.GetOperationRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+
+ r"""Call the get operation method over HTTP.
+
+ Args:
+ request (operations_pb2.GetOperationRequest):
+ The request object for GetOperation method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ operations_pb2.Operation: Response from GetOperation method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1/{name=projects/*/locations/*/operations/*}",
+ },
+ ]
+
+ request, metadata = self._interceptor.pre_get_operation(request, metadata)
+ request_kwargs = json_format.MessageToDict(request)
+ transcoded_request = path_template.transcode(http_options, **request_kwargs)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(json.dumps(transcoded_request["query_params"]))
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ resp = operations_pb2.Operation()
+ resp = json_format.Parse(response.content.decode("utf-8"), resp)
+ resp = self._interceptor.post_get_operation(resp)
+ return resp
+
+ @property
+ def list_operations(self):
+ return self._ListOperations(self._session, self._host, self._interceptor) # type: ignore
+
+ class _ListOperations(CloudRedisRestStub):
+ def __call__(
+ self,
+ request: operations_pb2.ListOperationsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.ListOperationsResponse:
+
+ r"""Call the list operations method over HTTP.
+
+ Args:
+ request (operations_pb2.ListOperationsRequest):
+ The request object for ListOperations method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ operations_pb2.ListOperationsResponse: Response from ListOperations method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1/{name=projects/*/locations/*}/operations",
+ },
+ ]
+
+ request, metadata = self._interceptor.pre_list_operations(request, metadata)
+ request_kwargs = json_format.MessageToDict(request)
+ transcoded_request = path_template.transcode(http_options, **request_kwargs)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(json.dumps(transcoded_request["query_params"]))
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ resp = operations_pb2.ListOperationsResponse()
+ resp = json_format.Parse(response.content.decode("utf-8"), resp)
+ resp = self._interceptor.post_list_operations(resp)
+ return resp
+
+ @property
+ def kind(self) -> str:
+ return "rest"
+
+ def close(self):
+ self._session.close()
+
+
+__all__ = ("CloudRedisRestTransport",)
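The per-RPC stubs in the transport above all share one small pattern: a class-level table of required query-string fields with their default values, merged into the outgoing query params only for keys the request left unset (see `_UpdateInstance`, where `updateMask` defaults to `{}`). The following is a minimal standalone sketch of that pattern, outside the generated class hierarchy; the class and variable names here are illustrative, not part of the generated API.

```python
# Standalone sketch of the required-field defaulting pattern used by the
# generated REST stubs: class-level defaults are merged into the JSON query
# params only for keys the request did not already set.
from typing import Any, Dict


class UpdateInstanceStub:
    # Keys the API requires on the query string; values are the defaults
    # sent when the caller leaves them unset.
    _REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {"updateMask": {}}

    @classmethod
    def _get_unset_required_fields(cls, message_dict: Dict[str, Any]) -> Dict[str, Any]:
        # Return only the defaults whose keys are absent from the request.
        return {
            k: v
            for k, v in cls._REQUIRED_FIELDS_DEFAULT_VALUES.items()
            if k not in message_dict
        }


query_params = {"instance": {"name": "projects/p/locations/l/instances/i"}}
query_params.update(UpdateInstanceStub._get_unset_required_fields(query_params))
print(query_params["updateMask"])  # {} -- default injected because it was unset
```

A request that already carries `updateMask` is left untouched, because the dict comprehension filters out keys present in `message_dict`.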
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1/types/__init__.py b/packages/google-cloud-redis/google/cloud/redis_v1/types/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1/types/__init__.py
@@ -0,0 +1,74 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .cloud_redis import (
+ CreateInstanceRequest,
+ DeleteInstanceRequest,
+ ExportInstanceRequest,
+ FailoverInstanceRequest,
+ GcsDestination,
+ GcsSource,
+ GetInstanceAuthStringRequest,
+ GetInstanceRequest,
+ ImportInstanceRequest,
+ InputConfig,
+ Instance,
+ InstanceAuthString,
+ ListInstancesRequest,
+ ListInstancesResponse,
+ LocationMetadata,
+ MaintenancePolicy,
+ MaintenanceSchedule,
+ NodeInfo,
+ OperationMetadata,
+ OutputConfig,
+ PersistenceConfig,
+ RescheduleMaintenanceRequest,
+ TlsCertificate,
+ UpdateInstanceRequest,
+ UpgradeInstanceRequest,
+ WeeklyMaintenanceWindow,
+ ZoneMetadata,
+)
+
+__all__ = (
+ "CreateInstanceRequest",
+ "DeleteInstanceRequest",
+ "ExportInstanceRequest",
+ "FailoverInstanceRequest",
+ "GcsDestination",
+ "GcsSource",
+ "GetInstanceAuthStringRequest",
+ "GetInstanceRequest",
+ "ImportInstanceRequest",
+ "InputConfig",
+ "Instance",
+ "InstanceAuthString",
+ "ListInstancesRequest",
+ "ListInstancesResponse",
+ "LocationMetadata",
+ "MaintenancePolicy",
+ "MaintenanceSchedule",
+ "NodeInfo",
+ "OperationMetadata",
+ "OutputConfig",
+ "PersistenceConfig",
+ "RescheduleMaintenanceRequest",
+ "TlsCertificate",
+ "UpdateInstanceRequest",
+ "UpgradeInstanceRequest",
+ "WeeklyMaintenanceWindow",
+ "ZoneMetadata",
+)
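Each stub in the transport earlier in this patch dispatches the HTTP call dynamically: the verb recovered from `path_template.transcode` selects the matching method on a requests-style session via `getattr(self._session, method)`. Below is a self-contained sketch of that dispatch, with a stub session standing in for `requests.Session` so it runs without network access; `StubSession` and `send` are names invented for this sketch.

```python
# Sketch of the dynamic HTTP dispatch used by the generated REST stubs:
# the transcoded request's "method" field names the session method to call.
class StubSession:
    """Stands in for requests.Session; records the verb and URL."""

    def get(self, url, **kwargs):
        return ("GET", url)

    def post(self, url, **kwargs):
        return ("POST", url)


def send(session, host, transcoded_request):
    uri = transcoded_request["uri"]
    method = transcoded_request["method"]  # e.g. "get" or "post"
    # Same shape as the generated code: getattr picks session.get / session.post.
    return getattr(session, method)("{host}{uri}".format(host=host, uri=uri))


result = send(
    StubSession(),
    "https://redis.googleapis.com",
    {"method": "get", "uri": "/v1/projects/p/locations/l/instances"},
)
print(result)  # ('GET', 'https://redis.googleapis.com/v1/projects/p/locations/l/instances')
```

In the real transport the session is an authorized `google.auth.transport.requests` session, and the URI comes from the `http_options` table attached to each stub.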
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1/types/cloud_redis.py b/packages/google-cloud-redis/google/cloud/redis_v1/types/cloud_redis.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1/types/cloud_redis.py
@@ -0,0 +1,1347 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import duration_pb2 # type: ignore
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+from google.type import dayofweek_pb2 # type: ignore
+from google.type import timeofday_pb2 # type: ignore
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.redis.v1",
+ manifest={
+ "NodeInfo",
+ "Instance",
+ "PersistenceConfig",
+ "RescheduleMaintenanceRequest",
+ "MaintenancePolicy",
+ "WeeklyMaintenanceWindow",
+ "MaintenanceSchedule",
+ "ListInstancesRequest",
+ "ListInstancesResponse",
+ "GetInstanceRequest",
+ "GetInstanceAuthStringRequest",
+ "InstanceAuthString",
+ "CreateInstanceRequest",
+ "UpdateInstanceRequest",
+ "UpgradeInstanceRequest",
+ "DeleteInstanceRequest",
+ "GcsSource",
+ "InputConfig",
+ "ImportInstanceRequest",
+ "GcsDestination",
+ "OutputConfig",
+ "ExportInstanceRequest",
+ "FailoverInstanceRequest",
+ "OperationMetadata",
+ "LocationMetadata",
+ "ZoneMetadata",
+ "TlsCertificate",
+ },
+)
+
+
+class NodeInfo(proto.Message):
+ r"""Node specific properties.
+
+ Attributes:
+ id (str):
+ Output only. Node identifying string. e.g.
+ 'node-0', 'node-1'
+ zone (str):
+ Output only. Location of the node.
+ """
+
+ id: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ zone: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class Instance(proto.Message):
+ r"""A Memorystore for Redis instance.
+
+ Attributes:
+ name (str):
+ Required. Unique name of the resource in this scope
+ including project and location using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+
+ Note: Redis instances are managed and addressed at regional
+ level so location_id here refers to a GCP region; however,
+ users may choose which specific zone (or collection of zones
+ for cross-zone instances) an instance should be provisioned
+ in. Refer to
+ [location_id][google.cloud.redis.v1.Instance.location_id]
+ and
+ [alternative_location_id][google.cloud.redis.v1.Instance.alternative_location_id]
+ fields for more details.
+ display_name (str):
+ An arbitrary and optional user-provided name
+ for the instance.
+ labels (MutableMapping[str, str]):
+            Resource labels to represent user-provided
+            metadata.
+ location_id (str):
+ Optional. The zone where the instance will be
+ provisioned. If not provided, the service will
+ choose a zone from the specified region for the
+ instance. For standard tier, additional nodes
+ will be added across multiple zones for
+ protection against zonal failures. If specified,
+ at least one node will be provisioned in this
+ zone.
+ alternative_location_id (str):
+ Optional. If specified, at least one node will be
+ provisioned in this zone in addition to the zone specified
+ in location_id. Only applicable to standard tier. If
+ provided, it must be a different zone from the one provided
+ in [location_id]. Additional nodes beyond the first 2 will
+ be placed in zones selected by the service.
+ redis_version (str):
+ Optional. The version of Redis software. If not provided,
+ latest supported version will be used. Currently, the
+ supported values are:
+
+ - ``REDIS_3_2`` for Redis 3.2 compatibility
+ - ``REDIS_4_0`` for Redis 4.0 compatibility (default)
+ - ``REDIS_5_0`` for Redis 5.0 compatibility
+ - ``REDIS_6_X`` for Redis 6.x compatibility
+ reserved_ip_range (str):
+ Optional. For DIRECT_PEERING mode, the CIDR range of
+ internal addresses that are reserved for this instance.
+ Range must be unique and non-overlapping with existing
+ subnets in an authorized network. For PRIVATE_SERVICE_ACCESS
+ mode, the name of one allocated IP address ranges associated
+ with this private service access connection. If not
+ provided, the service will choose an unused /29 block, for
+ example, 10.0.0.0/29 or 192.168.0.0/29. For
+ READ_REPLICAS_ENABLED the default block size is /28.
+ secondary_ip_range (str):
+ Optional. Additional IP range for node placement. Required
+ when enabling read replicas on an existing instance. For
+ DIRECT_PEERING mode value must be a CIDR range of size /28,
+ or "auto". For PRIVATE_SERVICE_ACCESS mode value must be the
+ name of an allocated address range associated with the
+ private service access connection, or "auto".
+ host (str):
+ Output only. Hostname or IP address of the
+ exposed Redis endpoint used by clients to
+ connect to the service.
+ port (int):
+ Output only. The port number of the exposed
+ Redis endpoint.
+ current_location_id (str):
+ Output only. The current zone where the Redis primary node
+ is located. In basic tier, this will always be the same as
+ [location_id]. In standard tier, this can be the zone of any
+ node in the instance.
+ create_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The time the instance was
+ created.
+ state (google.cloud.redis_v1.types.Instance.State):
+ Output only. The current state of this
+ instance.
+ status_message (str):
+ Output only. Additional information about the
+ current status of this instance, if available.
+ redis_configs (MutableMapping[str, str]):
+ Optional. Redis configuration parameters, according to
+ http://redis.io/topics/config. Currently, the only supported
+ parameters are:
+
+ Redis version 3.2 and newer:
+
+ - maxmemory-policy
+ - notify-keyspace-events
+
+ Redis version 4.0 and newer:
+
+ - activedefrag
+ - lfu-decay-time
+ - lfu-log-factor
+ - maxmemory-gb
+
+ Redis version 5.0 and newer:
+
+ - stream-node-max-bytes
+ - stream-node-max-entries
+ tier (google.cloud.redis_v1.types.Instance.Tier):
+ Required. The service tier of the instance.
+ memory_size_gb (int):
+ Required. Redis memory size in GiB.
+ authorized_network (str):
+ Optional. The full name of the Google Compute Engine
+ `network <https://cloud.google.com/vpc/docs/vpc>`__ to which
+ the instance is connected. If left unspecified, the
+ ``default`` network will be used.
+ persistence_iam_identity (str):
+ Output only. Cloud IAM identity used by import / export
+ operations to transfer data to/from Cloud Storage. Format is
+ "serviceAccount:<service_account_email>". The value may
+ change over time for a given instance so should be checked
+ before each import/export operation.
+ connect_mode (google.cloud.redis_v1.types.Instance.ConnectMode):
+ Optional. The network connect mode of the Redis instance. If
+ not provided, the connect mode defaults to DIRECT_PEERING.
+ auth_enabled (bool):
+ Optional. Indicates whether OSS Redis AUTH is
+ enabled for the instance. If set to "true" AUTH
+ is enabled on the instance. Default value is
+ "false" meaning AUTH is disabled.
+ server_ca_certs (MutableSequence[google.cloud.redis_v1.types.TlsCertificate]):
+ Output only. List of server CA certificates
+ for the instance.
+ transit_encryption_mode (google.cloud.redis_v1.types.Instance.TransitEncryptionMode):
+ Optional. The TLS mode of the Redis instance.
+ If not provided, TLS is disabled for the
+ instance.
+ maintenance_policy (google.cloud.redis_v1.types.MaintenancePolicy):
+ Optional. The maintenance policy for the
+ instance. If not provided, maintenance events
+ can be performed at any time.
+ maintenance_schedule (google.cloud.redis_v1.types.MaintenanceSchedule):
+ Output only. Date and time of upcoming
+ maintenance events which have been scheduled.
+ replica_count (int):
+ Optional. The number of replica nodes. The valid range for
+ the Standard Tier with read replicas enabled is [1-5] and
+ defaults to 2. If read replicas are not enabled for a
+ Standard Tier instance, the only valid value is 1 and the
+ default is 1. The valid value for basic tier is 0 and the
+ default is also 0.
+ nodes (MutableSequence[google.cloud.redis_v1.types.NodeInfo]):
+ Output only. Info per node.
+ read_endpoint (str):
+ Output only. Hostname or IP address of the
+ exposed readonly Redis endpoint. Standard tier
+ only. Targets all healthy replica nodes in
+ instance. Replication is asynchronous and
+ replica nodes will exhibit some lag behind the
+ primary. Write requests must target 'host'.
+ read_endpoint_port (int):
+ Output only. The port number of the exposed
+ readonly redis endpoint. Standard tier only.
+ Write requests should target 'port'.
+ read_replicas_mode (google.cloud.redis_v1.types.Instance.ReadReplicasMode):
+ Optional. Read replicas mode for the instance. Defaults to
+ READ_REPLICAS_DISABLED.
+ customer_managed_key (str):
+ Optional. The KMS key reference that the
+ customer provides when trying to create the
+ instance.
+ persistence_config (google.cloud.redis_v1.types.PersistenceConfig):
+ Optional. Persistence configuration
+ parameters
+ suspension_reasons (MutableSequence[google.cloud.redis_v1.types.Instance.SuspensionReason]):
+            Optional. Reasons that cause the instance to be
+            in the "SUSPENDED" state.
+ maintenance_version (str):
+ Optional. The self service update maintenance version. The
+ version is date based such as "20210712_00_00".
+ available_maintenance_versions (MutableSequence[str]):
+ Optional. The available maintenance versions
+ that an instance could update to.
+ """
+
+ class State(proto.Enum):
+ r"""Represents the different states of a Redis instance.
+
+ Values:
+ STATE_UNSPECIFIED (0):
+ Not set.
+ CREATING (1):
+ Redis instance is being created.
+ READY (2):
+ Redis instance has been created and is fully
+ usable.
+ UPDATING (3):
+ Redis instance configuration is being
+ updated. Certain kinds of updates may cause the
+ instance to become unusable while the update is
+ in progress.
+ DELETING (4):
+ Redis instance is being deleted.
+ REPAIRING (5):
+ Redis instance is being repaired and may be
+ unusable.
+ MAINTENANCE (6):
+ Maintenance is being performed on this Redis
+ instance.
+ IMPORTING (8):
+ Redis instance is importing data
+ (availability may be affected).
+ FAILING_OVER (9):
+ Redis instance is failing over (availability
+ may be affected).
+ """
+ STATE_UNSPECIFIED = 0
+ CREATING = 1
+ READY = 2
+ UPDATING = 3
+ DELETING = 4
+ REPAIRING = 5
+ MAINTENANCE = 6
+ IMPORTING = 8
+ FAILING_OVER = 9
+
+ class Tier(proto.Enum):
+ r"""Available service tiers to choose from
+
+ Values:
+ TIER_UNSPECIFIED (0):
+ Not set.
+ BASIC (1):
+ BASIC tier: standalone instance
+ STANDARD_HA (3):
+ STANDARD_HA tier: highly available primary/replica instances
+ """
+ TIER_UNSPECIFIED = 0
+ BASIC = 1
+ STANDARD_HA = 3
+
+ class ConnectMode(proto.Enum):
+ r"""Available connection modes.
+
+ Values:
+ CONNECT_MODE_UNSPECIFIED (0):
+ Not set.
+ DIRECT_PEERING (1):
+ Connect via direct peering to the Memorystore
+ for Redis hosted service.
+ PRIVATE_SERVICE_ACCESS (2):
+ Connect your Memorystore for Redis instance
+ using Private Service Access. Private services
+ access provides an IP address range for multiple
+ Google Cloud services, including Memorystore.
+ """
+ CONNECT_MODE_UNSPECIFIED = 0
+ DIRECT_PEERING = 1
+ PRIVATE_SERVICE_ACCESS = 2
+
+ class TransitEncryptionMode(proto.Enum):
+ r"""Available TLS modes.
+
+ Values:
+ TRANSIT_ENCRYPTION_MODE_UNSPECIFIED (0):
+ Not set.
+ SERVER_AUTHENTICATION (1):
+ Client to Server traffic encryption enabled
+ with server authentication.
+ DISABLED (2):
+ TLS is disabled for the instance.
+ """
+ TRANSIT_ENCRYPTION_MODE_UNSPECIFIED = 0
+ SERVER_AUTHENTICATION = 1
+ DISABLED = 2
+
+ class ReadReplicasMode(proto.Enum):
+ r"""Read replicas mode.
+
+ Values:
+ READ_REPLICAS_MODE_UNSPECIFIED (0):
+ If not set, Memorystore Redis backend will default to
+ READ_REPLICAS_DISABLED.
+ READ_REPLICAS_DISABLED (1):
+ If disabled, read endpoint will not be
+ provided and the instance cannot scale up or
+ down the number of replicas.
+ READ_REPLICAS_ENABLED (2):
+ If enabled, read endpoint will be provided
+ and the instance can scale up and down the
+ number of replicas. Not valid for basic tier.
+ """
+ READ_REPLICAS_MODE_UNSPECIFIED = 0
+ READ_REPLICAS_DISABLED = 1
+ READ_REPLICAS_ENABLED = 2
+
+ class SuspensionReason(proto.Enum):
+ r"""Possible reasons for the instance to be in a "SUSPENDED"
+ state.
+
+ Values:
+ SUSPENSION_REASON_UNSPECIFIED (0):
+ Not set.
+ CUSTOMER_MANAGED_KEY_ISSUE (1):
+ Something wrong with the CMEK key provided by
+ customer.
+ """
+ SUSPENSION_REASON_UNSPECIFIED = 0
+ CUSTOMER_MANAGED_KEY_ISSUE = 1
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ display_name: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ labels: MutableMapping[str, str] = proto.MapField(
+ proto.STRING,
+ proto.STRING,
+ number=3,
+ )
+ location_id: str = proto.Field(
+ proto.STRING,
+ number=4,
+ )
+ alternative_location_id: str = proto.Field(
+ proto.STRING,
+ number=5,
+ )
+ redis_version: str = proto.Field(
+ proto.STRING,
+ number=7,
+ )
+ reserved_ip_range: str = proto.Field(
+ proto.STRING,
+ number=9,
+ )
+ secondary_ip_range: str = proto.Field(
+ proto.STRING,
+ number=30,
+ )
+ host: str = proto.Field(
+ proto.STRING,
+ number=10,
+ )
+ port: int = proto.Field(
+ proto.INT32,
+ number=11,
+ )
+ current_location_id: str = proto.Field(
+ proto.STRING,
+ number=12,
+ )
+ create_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=13,
+ message=timestamp_pb2.Timestamp,
+ )
+ state: State = proto.Field(
+ proto.ENUM,
+ number=14,
+ enum=State,
+ )
+ status_message: str = proto.Field(
+ proto.STRING,
+ number=15,
+ )
+ redis_configs: MutableMapping[str, str] = proto.MapField(
+ proto.STRING,
+ proto.STRING,
+ number=16,
+ )
+ tier: Tier = proto.Field(
+ proto.ENUM,
+ number=17,
+ enum=Tier,
+ )
+ memory_size_gb: int = proto.Field(
+ proto.INT32,
+ number=18,
+ )
+ authorized_network: str = proto.Field(
+ proto.STRING,
+ number=20,
+ )
+ persistence_iam_identity: str = proto.Field(
+ proto.STRING,
+ number=21,
+ )
+ connect_mode: ConnectMode = proto.Field(
+ proto.ENUM,
+ number=22,
+ enum=ConnectMode,
+ )
+ auth_enabled: bool = proto.Field(
+ proto.BOOL,
+ number=23,
+ )
+ server_ca_certs: MutableSequence["TlsCertificate"] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=25,
+ message="TlsCertificate",
+ )
+ transit_encryption_mode: TransitEncryptionMode = proto.Field(
+ proto.ENUM,
+ number=26,
+ enum=TransitEncryptionMode,
+ )
+ maintenance_policy: "MaintenancePolicy" = proto.Field(
+ proto.MESSAGE,
+ number=27,
+ message="MaintenancePolicy",
+ )
+ maintenance_schedule: "MaintenanceSchedule" = proto.Field(
+ proto.MESSAGE,
+ number=28,
+ message="MaintenanceSchedule",
+ )
+ replica_count: int = proto.Field(
+ proto.INT32,
+ number=31,
+ )
+ nodes: MutableSequence["NodeInfo"] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=32,
+ message="NodeInfo",
+ )
+ read_endpoint: str = proto.Field(
+ proto.STRING,
+ number=33,
+ )
+ read_endpoint_port: int = proto.Field(
+ proto.INT32,
+ number=34,
+ )
+ read_replicas_mode: ReadReplicasMode = proto.Field(
+ proto.ENUM,
+ number=35,
+ enum=ReadReplicasMode,
+ )
+ customer_managed_key: str = proto.Field(
+ proto.STRING,
+ number=36,
+ )
+ persistence_config: "PersistenceConfig" = proto.Field(
+ proto.MESSAGE,
+ number=37,
+ message="PersistenceConfig",
+ )
+ suspension_reasons: MutableSequence[SuspensionReason] = proto.RepeatedField(
+ proto.ENUM,
+ number=38,
+ enum=SuspensionReason,
+ )
+ maintenance_version: str = proto.Field(
+ proto.STRING,
+ number=39,
+ )
+ available_maintenance_versions: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=40,
+ )
+
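The `reserved_ip_range` docstring above says a DIRECT_PEERING range must be a /29 block that does not overlap existing subnets in the authorized network. A minimal client-side sketch of that check with the standard-library `ipaddress` module (illustrative only; the service performs the authoritative validation):

```python
import ipaddress


def is_valid_reserved_range(candidate: str, existing_subnets: list) -> bool:
    """Return True if candidate is a /29 CIDR not overlapping any given subnet."""
    try:
        net = ipaddress.ip_network(candidate)
    except ValueError:
        return False  # not a well-formed CIDR block
    if net.prefixlen != 29:
        return False  # DIRECT_PEERING default block size is /29
    return not any(net.overlaps(ipaddress.ip_network(s)) for s in existing_subnets)


print(is_valid_reserved_range("10.0.0.0/29", ["10.1.0.0/16"]))  # True
print(is_valid_reserved_range("10.1.0.8/29", ["10.1.0.0/16"]))  # False: overlaps
```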
+
+class PersistenceConfig(proto.Message):
+ r"""Configuration of the persistence functionality.
+
+ Attributes:
+ persistence_mode (google.cloud.redis_v1.types.PersistenceConfig.PersistenceMode):
+ Optional. Controls whether Persistence
+ features are enabled. If not provided, the
+ existing value will be used.
+ rdb_snapshot_period (google.cloud.redis_v1.types.PersistenceConfig.SnapshotPeriod):
+ Optional. Period between RDB snapshots. Snapshots will be
+ attempted every period starting from the provided snapshot
+ start time. For example, a start time of 01/01/2033 06:45
+ and SIX_HOURS snapshot period will do nothing until
+ 01/01/2033, and then trigger snapshots every day at 06:45,
+ 12:45, 18:45, and 00:45 the next day, and so on. If not
+ provided, TWENTY_FOUR_HOURS will be used as default.
+ rdb_next_snapshot_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The next time that a snapshot
+ attempt is scheduled to occur.
+ rdb_snapshot_start_time (google.protobuf.timestamp_pb2.Timestamp):
+ Optional. Date and time that the first
+ snapshot was/will be attempted, and to which
+ future snapshots will be aligned. If not
+ provided, the current time will be used.
+ """
+
+ class PersistenceMode(proto.Enum):
+ r"""Available Persistence modes.
+
+ Values:
+ PERSISTENCE_MODE_UNSPECIFIED (0):
+ Not set.
+ DISABLED (1):
+ Persistence is disabled for the instance,
+ and any existing snapshots are deleted.
+ RDB (2):
+ RDB based Persistence is enabled.
+ """
+ PERSISTENCE_MODE_UNSPECIFIED = 0
+ DISABLED = 1
+ RDB = 2
+
+ class SnapshotPeriod(proto.Enum):
+ r"""Available snapshot periods for scheduling.
+
+ Values:
+ SNAPSHOT_PERIOD_UNSPECIFIED (0):
+ Not set.
+ ONE_HOUR (3):
+ Snapshot every 1 hour.
+ SIX_HOURS (4):
+ Snapshot every 6 hours.
+ TWELVE_HOURS (5):
+ Snapshot every 12 hours.
+ TWENTY_FOUR_HOURS (6):
+ Snapshot every 24 hours.
+ """
+ SNAPSHOT_PERIOD_UNSPECIFIED = 0
+ ONE_HOUR = 3
+ SIX_HOURS = 4
+ TWELVE_HOURS = 5
+ TWENTY_FOUR_HOURS = 6
+
+ persistence_mode: PersistenceMode = proto.Field(
+ proto.ENUM,
+ number=1,
+ enum=PersistenceMode,
+ )
+ rdb_snapshot_period: SnapshotPeriod = proto.Field(
+ proto.ENUM,
+ number=2,
+ enum=SnapshotPeriod,
+ )
+ rdb_next_snapshot_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=4,
+ message=timestamp_pb2.Timestamp,
+ )
+ rdb_snapshot_start_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=5,
+ message=timestamp_pb2.Timestamp,
+ )
+
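The `rdb_snapshot_period` docstring describes snapshots aligned to the start time: a start of 01/01/2033 06:45 with SIX_HOURS triggers at 06:45, 12:45, 18:45, 00:45. A pure-Python sketch of that alignment rule (an illustration of the documented behavior, not the service implementation):

```python
from datetime import datetime, timedelta

# SnapshotPeriod name -> duration between snapshot attempts.
PERIODS = {
    "ONE_HOUR": timedelta(hours=1),
    "SIX_HOURS": timedelta(hours=6),
    "TWELVE_HOURS": timedelta(hours=12),
    "TWENTY_FOUR_HOURS": timedelta(hours=24),
}


def next_snapshot_time(start: datetime, period: str, now: datetime) -> datetime:
    """First snapshot attempt at or after `now`, aligned to `start`."""
    step = PERIODS[period]
    if now <= start:
        return start  # nothing happens before the configured start time
    elapsed = now - start
    intervals = -(-elapsed // step)  # ceiling division on timedeltas
    return start + intervals * step


start = datetime(2033, 1, 1, 6, 45)
print(next_snapshot_time(start, "SIX_HOURS", datetime(2033, 1, 1, 13, 0)))
# 2033-01-01 18:45:00
```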
+
+class RescheduleMaintenanceRequest(proto.Message):
+ r"""Request for
+ [RescheduleMaintenance][google.cloud.redis.v1.CloudRedis.RescheduleMaintenance].
+
+ Attributes:
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+ reschedule_type (google.cloud.redis_v1.types.RescheduleMaintenanceRequest.RescheduleType):
+ Required. If reschedule type is SPECIFIC_TIME, must set up
+ schedule_time as well.
+ schedule_time (google.protobuf.timestamp_pb2.Timestamp):
+ Optional. Timestamp when the maintenance shall be
+ rescheduled to if reschedule_type=SPECIFIC_TIME, in RFC 3339
+ format, for example ``2012-11-15T16:19:00.094Z``.
+ """
+
+ class RescheduleType(proto.Enum):
+ r"""Reschedule options.
+
+ Values:
+ RESCHEDULE_TYPE_UNSPECIFIED (0):
+ Not set.
+ IMMEDIATE (1):
+ If the user wants to schedule the maintenance
+ to happen now.
+ NEXT_AVAILABLE_WINDOW (2):
+ If the user wants to use the existing
+ maintenance policy to find the next available
+ window.
+ SPECIFIC_TIME (3):
+ If the user wants to reschedule the
+ maintenance to a specific time.
+ """
+ RESCHEDULE_TYPE_UNSPECIFIED = 0
+ IMMEDIATE = 1
+ NEXT_AVAILABLE_WINDOW = 2
+ SPECIFIC_TIME = 3
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ reschedule_type: RescheduleType = proto.Field(
+ proto.ENUM,
+ number=2,
+ enum=RescheduleType,
+ )
+ schedule_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=3,
+ message=timestamp_pb2.Timestamp,
+ )
+
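The `schedule_time` field expects an RFC 3339 instant such as the docstring's `2012-11-15T16:19:00.094Z`. The field itself is a protobuf `Timestamp`; this sketch only illustrates producing that string form from a UTC datetime:

```python
from datetime import datetime, timezone


def to_rfc3339(dt: datetime) -> str:
    """Format an aware datetime as RFC 3339 in UTC with millisecond precision."""
    dt = dt.astimezone(timezone.utc)
    return dt.strftime("%Y-%m-%dT%H:%M:%S.") + f"{dt.microsecond // 1000:03d}Z"


print(to_rfc3339(datetime(2012, 11, 15, 16, 19, 0, 94000, tzinfo=timezone.utc)))
# 2012-11-15T16:19:00.094Z
```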
+
+class MaintenancePolicy(proto.Message):
+ r"""Maintenance policy for an instance.
+
+ Attributes:
+ create_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The time when the policy was
+ created.
+ update_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The time when the policy was
+ last updated.
+ description (str):
+ Optional. Description of what this policy is for.
+ Create/Update methods return INVALID_ARGUMENT if the length
+ is greater than 512.
+ weekly_maintenance_window (MutableSequence[google.cloud.redis_v1.types.WeeklyMaintenanceWindow]):
+ Optional. Maintenance window that is applied to resources
+ covered by this policy. Minimum 1. For the current version,
+ the maximum number of weekly_window is expected to be one.
+ """
+
+ create_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message=timestamp_pb2.Timestamp,
+ )
+ update_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=timestamp_pb2.Timestamp,
+ )
+ description: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ weekly_maintenance_window: MutableSequence[
+ "WeeklyMaintenanceWindow"
+ ] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=4,
+ message="WeeklyMaintenanceWindow",
+ )
+
+
+class WeeklyMaintenanceWindow(proto.Message):
+ r"""Time window in which disruptive maintenance updates occur.
+ Non-disruptive updates can occur inside or outside this window.
+
+ Attributes:
+ day (google.type.dayofweek_pb2.DayOfWeek):
+ Required. The day of week that maintenance
+ updates occur.
+ start_time (google.type.timeofday_pb2.TimeOfDay):
+ Required. Start time of the window in UTC
+ time.
+ duration (google.protobuf.duration_pb2.Duration):
+ Output only. Duration of the maintenance
+ window. The current window is fixed at 1 hour.
+ """
+
+ day: dayofweek_pb2.DayOfWeek = proto.Field(
+ proto.ENUM,
+ number=1,
+ enum=dayofweek_pb2.DayOfWeek,
+ )
+ start_time: timeofday_pb2.TimeOfDay = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=timeofday_pb2.TimeOfDay,
+ )
+ duration: duration_pb2.Duration = proto.Field(
+ proto.MESSAGE,
+ number=3,
+ message=duration_pb2.Duration,
+ )
+
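A `WeeklyMaintenanceWindow` is a recurring day-of-week plus UTC time-of-day. A sketch (an assumption for illustration, not client code) of resolving it to the next concrete start after a reference instant, using the `google.type.DayOfWeek` numbering where MONDAY is 1 through SUNDAY is 7:

```python
from datetime import datetime, timedelta


def next_window_start(day_of_week: int, hour: int, minute: int,
                      now: datetime) -> datetime:
    """Next occurrence of the weekly window at or after `now` (UTC, naive)."""
    # DayOfWeek: MONDAY == 1 ... SUNDAY == 7; Python weekday(): Monday == 0.
    days_ahead = (day_of_week - 1 - now.weekday()) % 7
    candidate = (now + timedelta(days=days_ahead)).replace(
        hour=hour, minute=minute, second=0, microsecond=0)
    if candidate < now:
        candidate += timedelta(days=7)  # today's window already passed
    return candidate


# Window: every Tuesday 03:00 UTC; reference: Wednesday 2024-01-03 12:00 UTC.
print(next_window_start(2, 3, 0, datetime(2024, 1, 3, 12, 0)))
# 2024-01-09 03:00:00
```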
+
+class MaintenanceSchedule(proto.Message):
+ r"""Upcoming maintenance schedule. If no maintenance is
+ scheduled, fields are not populated.
+
+ Attributes:
+ start_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The start time of any upcoming
+ scheduled maintenance for this instance.
+ end_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The end time of any upcoming
+ scheduled maintenance for this instance.
+ can_reschedule (bool):
+ If the scheduled maintenance can be
+ rescheduled, default is true.
+ schedule_deadline_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The deadline that the
+ maintenance schedule start time can not go
+ beyond, including reschedule.
+ """
+
+ start_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message=timestamp_pb2.Timestamp,
+ )
+ end_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=timestamp_pb2.Timestamp,
+ )
+ can_reschedule: bool = proto.Field(
+ proto.BOOL,
+ number=3,
+ )
+ schedule_deadline_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=5,
+ message=timestamp_pb2.Timestamp,
+ )
+
+
+class ListInstancesRequest(proto.Message):
+ r"""Request for
+ [ListInstances][google.cloud.redis.v1.CloudRedis.ListInstances].
+
+ Attributes:
+ parent (str):
+ Required. The resource name of the instance location using
+ the form: ``projects/{project_id}/locations/{location_id}``
+ where ``location_id`` refers to a GCP region.
+ page_size (int):
+ The maximum number of items to return.
+
+ If not specified, a default value of 1000 will be used by
+ the service. Regardless of the page_size value, the response
+ may include a partial list and a caller should only rely on
+ response's
+ [``next_page_token``][google.cloud.redis.v1.ListInstancesResponse.next_page_token]
+ to determine if there are more instances left to be queried.
+ page_token (str):
+ The ``next_page_token`` value returned from a previous
+ [ListInstances][google.cloud.redis.v1.CloudRedis.ListInstances]
+ request, if any.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=2,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+
+class ListInstancesResponse(proto.Message):
+ r"""Response for
+ [ListInstances][google.cloud.redis.v1.CloudRedis.ListInstances].
+
+ Attributes:
+ instances (MutableSequence[google.cloud.redis_v1.types.Instance]):
+ A list of Redis instances in the project in the specified
+ location, or across all locations.
+
+ If the ``location_id`` in the parent field of the request is
+ "-", all regions available to the project are queried, and
+ the results aggregated. If in such an aggregated query a
+ location is unavailable, a placeholder Redis entry is
+ included in the response with the ``name`` field set to a
+ value of the form
+ ``projects/{project_id}/locations/{location_id}/instances/``-
+ and the ``status`` field set to ERROR and ``status_message``
+ field set to "location not available for ListInstances".
+ next_page_token (str):
+ Token to retrieve the next page of results,
+ or empty if there are no more results in the
+ list.
+ unreachable (MutableSequence[str]):
+ Locations that could not be reached.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ instances: MutableSequence["Instance"] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message="Instance",
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ unreachable: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=3,
+ )
+
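The docstrings above describe the token-based pagination contract: follow `next_page_token` until it comes back empty. A sketch of that loop, where `fetch_page` is a hypothetical stand-in for the ListInstances RPC (the real generated client wraps this in an auto-paginating iterator):

```python
def list_all(fetch_page, parent: str):
    """Collect instances across pages by following next_page_token."""
    instances, token = [], ""
    while True:
        page = fetch_page(parent=parent, page_token=token)
        instances.extend(page["instances"])
        token = page.get("next_page_token", "")
        if not token:  # empty token means there are no more results
            return instances


# Fake two-page backend, keyed by page token, for illustration only.
pages = {"": {"instances": ["a", "b"], "next_page_token": "t1"},
         "t1": {"instances": ["c"], "next_page_token": ""}}
print(list_all(lambda parent, page_token: pages[page_token],
               "projects/p/locations/-"))
# ['a', 'b', 'c']
```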
+
+class GetInstanceRequest(proto.Message):
+ r"""Request for
+ [GetInstance][google.cloud.redis.v1.CloudRedis.GetInstance].
+
+ Attributes:
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class GetInstanceAuthStringRequest(proto.Message):
+ r"""Request for
+ [GetInstanceAuthString][google.cloud.redis.v1.CloudRedis.GetInstanceAuthString].
+
+ Attributes:
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class InstanceAuthString(proto.Message):
+ r"""Instance AUTH string details.
+
+ Attributes:
+ auth_string (str):
+ AUTH string set on the instance.
+ """
+
+ auth_string: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class CreateInstanceRequest(proto.Message):
+ r"""Request for
+ [CreateInstance][google.cloud.redis.v1.CloudRedis.CreateInstance].
+
+ Attributes:
+ parent (str):
+ Required. The resource name of the instance location using
+ the form: ``projects/{project_id}/locations/{location_id}``
+ where ``location_id`` refers to a GCP region.
+ instance_id (str):
+ Required. The logical name of the Redis instance in the
+ customer project with the following restrictions:
+
+ - Must contain only lowercase letters, numbers, and
+ hyphens.
+ - Must start with a letter.
+ - Must be between 1-40 characters.
+ - Must end with a number or a letter.
+ - Must be unique within the customer project / location
+ instance (google.cloud.redis_v1.types.Instance):
+ Required. A Redis [Instance] resource
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ instance_id: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ instance: "Instance" = proto.Field(
+ proto.MESSAGE,
+ number=3,
+ message="Instance",
+ )
+
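The `instance_id` restrictions listed above (lowercase letters, digits, and hyphens; starts with a letter; ends with a letter or digit; 1-40 characters) can be expressed as a single regular expression. A sketch of a client-side pre-check mirroring those rules (the service remains the authority, including the uniqueness requirement, which cannot be checked locally):

```python
import re

# First char: lowercase letter; optional middle (up to 38 chars of
# [a-z0-9-]) plus a final letter or digit => total length 1-40.
_INSTANCE_ID = re.compile(r"^[a-z]([a-z0-9-]{0,38}[a-z0-9])?$")


def valid_instance_id(instance_id: str) -> bool:
    return bool(_INSTANCE_ID.match(instance_id))


print(valid_instance_id("my-redis-1"))  # True
print(valid_instance_id("My-Redis"))    # False: uppercase not allowed
print(valid_instance_id("1redis"))      # False: must start with a letter
```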
+
+class UpdateInstanceRequest(proto.Message):
+ r"""Request for
+ [UpdateInstance][google.cloud.redis.v1.CloudRedis.UpdateInstance].
+
+ Attributes:
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Required. Mask of fields to update. At least one path must
+ be supplied in this field. The elements of the repeated
+ paths field may only include these fields from
+ [Instance][google.cloud.redis.v1.Instance]:
+
+ - ``displayName``
+ - ``labels``
+ - ``memorySizeGb``
+ - ``redisConfig``
+ - ``replica_count``
+ instance (google.cloud.redis_v1.types.Instance):
+ Required. Update description. Only fields specified in
+ update_mask are updated.
+ """
+
+ update_mask: field_mask_pb2.FieldMask = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message=field_mask_pb2.FieldMask,
+ )
+ instance: "Instance" = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message="Instance",
+ )
+
+
+class UpgradeInstanceRequest(proto.Message):
+ r"""Request for
+ [UpgradeInstance][google.cloud.redis.v1.CloudRedis.UpgradeInstance].
+
+ Attributes:
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+ redis_version (str):
+ Required. Specifies the target version of
+ Redis software to upgrade to.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ redis_version: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class DeleteInstanceRequest(proto.Message):
+ r"""Request for
+ [DeleteInstance][google.cloud.redis.v1.CloudRedis.DeleteInstance].
+
+ Attributes:
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class GcsSource(proto.Message):
+ r"""The Cloud Storage location for the input content
+
+ Attributes:
+ uri (str):
+ Required. Source data URI. (e.g.
+ 'gs://my_bucket/my_object').
+ """
+
+ uri: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class InputConfig(proto.Message):
+ r"""The input content
+
+ .. _oneof: https://proto-plus-python.readthedocs.io/en/stable/fields.html#oneofs-mutually-exclusive-fields
+
+ Attributes:
+ gcs_source (google.cloud.redis_v1.types.GcsSource):
+ Google Cloud Storage location where input
+ content is located.
+
+ This field is a member of `oneof`_ ``source``.
+ """
+
+ gcs_source: "GcsSource" = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ oneof="source",
+ message="GcsSource",
+ )
+
+
+class ImportInstanceRequest(proto.Message):
+ r"""Request for
+ [Import][google.cloud.redis.v1.CloudRedis.ImportInstance].
+
+ Attributes:
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+ input_config (google.cloud.redis_v1.types.InputConfig):
+ Required. Specify data to be imported.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ input_config: "InputConfig" = proto.Field(
+ proto.MESSAGE,
+ number=3,
+ message="InputConfig",
+ )
+
+
+class GcsDestination(proto.Message):
+ r"""The Cloud Storage location for the output content
+
+ Attributes:
+ uri (str):
+ Required. Data destination URI (e.g.
+ 'gs://my_bucket/my_object'). Existing files will be
+ overwritten.
+ """
+
+ uri: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class OutputConfig(proto.Message):
+ r"""The output content
+
+ .. _oneof: https://proto-plus-python.readthedocs.io/en/stable/fields.html#oneofs-mutually-exclusive-fields
+
+ Attributes:
+ gcs_destination (google.cloud.redis_v1.types.GcsDestination):
+ Google Cloud Storage destination for output
+ content.
+
+ This field is a member of `oneof`_ ``destination``.
+ """
+
+ gcs_destination: "GcsDestination" = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ oneof="destination",
+ message="GcsDestination",
+ )
+
+
+class ExportInstanceRequest(proto.Message):
+ r"""Request for
+ [Export][google.cloud.redis.v1.CloudRedis.ExportInstance].
+
+ Attributes:
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+ output_config (google.cloud.redis_v1.types.OutputConfig):
+ Required. Specify data to be exported.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ output_config: "OutputConfig" = proto.Field(
+ proto.MESSAGE,
+ number=3,
+ message="OutputConfig",
+ )
+
+
+class FailoverInstanceRequest(proto.Message):
+ r"""Request for
+ [Failover][google.cloud.redis.v1.CloudRedis.FailoverInstance].
+
+ Attributes:
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+ data_protection_mode (google.cloud.redis_v1.types.FailoverInstanceRequest.DataProtectionMode):
+ Optional. Available data protection modes that the user can
+ choose. If it's unspecified, data protection mode will be
+ LIMITED_DATA_LOSS by default.
+ """
+
+ class DataProtectionMode(proto.Enum):
+ r"""Specifies different modes of operation in relation to the
+ data retention.
+
+ Values:
+ DATA_PROTECTION_MODE_UNSPECIFIED (0):
+ Defaults to LIMITED_DATA_LOSS if a data protection mode is
+ not specified.
+ LIMITED_DATA_LOSS (1):
+ Instance failover will be protected with data
+ loss control. More specifically, the failover
+ will only be performed if the current
+ replication offset diff between primary and
+ replica is under a certain threshold.
+ FORCE_DATA_LOSS (2):
+ Instance failover will be performed without
+ data loss control.
+ """
+ DATA_PROTECTION_MODE_UNSPECIFIED = 0
+ LIMITED_DATA_LOSS = 1
+ FORCE_DATA_LOSS = 2
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ data_protection_mode: DataProtectionMode = proto.Field(
+ proto.ENUM,
+ number=2,
+ enum=DataProtectionMode,
+ )
+
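The `DataProtectionMode` docstrings describe a gate: LIMITED_DATA_LOSS fails over only when the primary/replica replication offset diff is under a threshold, while FORCE_DATA_LOSS skips the check. A sketch of that decision logic; the threshold value here is purely illustrative (the real threshold is internal to the service):

```python
def failover_allowed(mode: str, offset_diff: int, threshold: int = 1024) -> bool:
    """Illustrative gate mirroring the documented data protection modes."""
    if mode == "FORCE_DATA_LOSS":
        return True  # no data loss control
    if mode in ("LIMITED_DATA_LOSS", "DATA_PROTECTION_MODE_UNSPECIFIED"):
        # Unspecified defaults to LIMITED_DATA_LOSS per the docstring.
        return offset_diff < threshold
    raise ValueError(f"unknown mode: {mode}")


print(failover_allowed("LIMITED_DATA_LOSS", 100))   # True: diff under threshold
print(failover_allowed("LIMITED_DATA_LOSS", 4096))  # False: too far behind
print(failover_allowed("FORCE_DATA_LOSS", 4096))    # True: forced
```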
+
+class OperationMetadata(proto.Message):
+ r"""Represents the v1 metadata of the long-running operation.
+
+ Attributes:
+ create_time (google.protobuf.timestamp_pb2.Timestamp):
+ Creation timestamp.
+ end_time (google.protobuf.timestamp_pb2.Timestamp):
+ End timestamp.
+ target (str):
+ Operation target.
+ verb (str):
+ Operation verb.
+ status_detail (str):
+ Operation status details.
+ cancel_requested (bool):
+ Specifies if cancellation was requested for
+ the operation.
+ api_version (str):
+ API version.
+ """
+
+ create_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message=timestamp_pb2.Timestamp,
+ )
+ end_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=timestamp_pb2.Timestamp,
+ )
+ target: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ verb: str = proto.Field(
+ proto.STRING,
+ number=4,
+ )
+ status_detail: str = proto.Field(
+ proto.STRING,
+ number=5,
+ )
+ cancel_requested: bool = proto.Field(
+ proto.BOOL,
+ number=6,
+ )
+ api_version: str = proto.Field(
+ proto.STRING,
+ number=7,
+ )
+
+
+class LocationMetadata(proto.Message):
+ r"""This location metadata represents additional configuration options
+ for a given location where a Redis instance may be created. All
+ fields are output only. It is returned as content of the
+ ``google.cloud.location.Location.metadata`` field.
+
+ Attributes:
+ available_zones (MutableMapping[str, google.cloud.redis_v1.types.ZoneMetadata]):
+ Output only. The set of available zones in the location. The
+ map is keyed by the lowercase ID of each zone, as defined by
+ GCE. These keys can be specified in ``location_id`` or
+ ``alternative_location_id`` fields when creating a Redis
+ instance.
+ """
+
+ available_zones: MutableMapping[str, "ZoneMetadata"] = proto.MapField(
+ proto.STRING,
+ proto.MESSAGE,
+ number=1,
+ message="ZoneMetadata",
+ )
+
+
+class ZoneMetadata(proto.Message):
+ r"""Defines specific information for a particular zone. Currently
+ empty and reserved for future use only.
+
+ """
+
+
+class TlsCertificate(proto.Message):
+ r"""TlsCertificate Resource
+
+ Attributes:
+ serial_number (str):
+ Serial number, as extracted from the
+ certificate.
+ cert (str):
+ PEM representation.
+ create_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The time when the certificate was created in
+ `RFC 3339 <https://tools.ietf.org/html/rfc3339>`__ format,
+ for example ``2020-05-18T00:00:00.094Z``.
+ expire_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The time when the certificate expires in `RFC
+ 3339 <https://tools.ietf.org/html/rfc3339>`__ format, for
+ example ``2020-05-18T00:00:00.094Z``.
+ sha1_fingerprint (str):
+ Sha1 Fingerprint of the certificate.
+ """
+
+ serial_number: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ cert: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ create_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=3,
+ message=timestamp_pb2.Timestamp,
+ )
+ expire_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=4,
+ message=timestamp_pb2.Timestamp,
+ )
+ sha1_fingerprint: str = proto.Field(
+ proto.STRING,
+ number=5,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1beta1/__init__.py b/packages/google-cloud-redis/google/cloud/redis_v1beta1/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1beta1/__init__.py
@@ -0,0 +1,80 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from google.cloud.redis_v1beta1 import gapic_version as package_version
+
+__version__ = package_version.__version__
+
+
+from .services.cloud_redis import CloudRedisAsyncClient, CloudRedisClient
+from .types.cloud_redis import (
+ CreateInstanceRequest,
+ DeleteInstanceRequest,
+ ExportInstanceRequest,
+ FailoverInstanceRequest,
+ GcsDestination,
+ GcsSource,
+ GetInstanceAuthStringRequest,
+ GetInstanceRequest,
+ ImportInstanceRequest,
+ InputConfig,
+ Instance,
+ InstanceAuthString,
+ ListInstancesRequest,
+ ListInstancesResponse,
+ LocationMetadata,
+ MaintenancePolicy,
+ MaintenanceSchedule,
+ NodeInfo,
+ OutputConfig,
+ PersistenceConfig,
+ RescheduleMaintenanceRequest,
+ TlsCertificate,
+ UpdateInstanceRequest,
+ UpgradeInstanceRequest,
+ WeeklyMaintenanceWindow,
+ ZoneMetadata,
+)
+
+__all__ = (
+ "CloudRedisAsyncClient",
+ "CloudRedisClient",
+ "CreateInstanceRequest",
+ "DeleteInstanceRequest",
+ "ExportInstanceRequest",
+ "FailoverInstanceRequest",
+ "GcsDestination",
+ "GcsSource",
+ "GetInstanceAuthStringRequest",
+ "GetInstanceRequest",
+ "ImportInstanceRequest",
+ "InputConfig",
+ "Instance",
+ "InstanceAuthString",
+ "ListInstancesRequest",
+ "ListInstancesResponse",
+ "LocationMetadata",
+ "MaintenancePolicy",
+ "MaintenanceSchedule",
+ "NodeInfo",
+ "OutputConfig",
+ "PersistenceConfig",
+ "RescheduleMaintenanceRequest",
+ "TlsCertificate",
+ "UpdateInstanceRequest",
+ "UpgradeInstanceRequest",
+ "WeeklyMaintenanceWindow",
+ "ZoneMetadata",
+)
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1beta1/gapic_version.py b/packages/google-cloud-redis/google/cloud/redis_v1beta1/gapic_version.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1beta1/gapic_version.py
@@ -0,0 +1,16 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+__version__ = "2.13.0" # {x-release-please-version}
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/__init__.py b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/__init__.py
@@ -0,0 +1,15 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/__init__.py b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/__init__.py
@@ -0,0 +1,22 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .async_client import CloudRedisAsyncClient
+from .client import CloudRedisClient
+
+__all__ = (
+ "CloudRedisClient",
+ "CloudRedisAsyncClient",
+)
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/async_client.py b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/async_client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/async_client.py
@@ -0,0 +1,1701 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import functools
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+)
+
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.api_core.client_options import ClientOptions
+from google.auth import credentials as ga_credentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.redis_v1beta1 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.api_core import operation # type: ignore
+from google.api_core import operation_async # type: ignore
+from google.protobuf import any_pb2 # type: ignore
+from google.protobuf import empty_pb2 # type: ignore
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.redis_v1beta1.services.cloud_redis import pagers
+from google.cloud.redis_v1beta1.types import cloud_redis
+
+from .client import CloudRedisClient
+from .transports.base import DEFAULT_CLIENT_INFO, CloudRedisTransport
+from .transports.grpc_asyncio import CloudRedisGrpcAsyncIOTransport
+
+
+class CloudRedisAsyncClient:
+ """Configures and manages Cloud Memorystore for Redis instances
+
+ Google Cloud Memorystore for Redis v1beta1
+
+ The ``redis.googleapis.com`` service implements the Google Cloud
+ Memorystore for Redis API and defines the following resource model
+ for managing Redis instances:
+
+ - The service works with a collection of cloud projects, named:
+ ``/projects/*``
+ - Each project has a collection of available locations, named:
+ ``/locations/*``
+ - Each location has a collection of Redis instances, named:
+ ``/instances/*``
+ - As such, Redis instances are resources of the form:
+ ``/projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+
+ Note that location_id must be referring to a GCP ``region``; for
+ example:
+
+ - ``projects/redpepper-1290/locations/us-central1/instances/my-redis``
+ """
+
+ _client: CloudRedisClient
+
+ DEFAULT_ENDPOINT = CloudRedisClient.DEFAULT_ENDPOINT
+ DEFAULT_MTLS_ENDPOINT = CloudRedisClient.DEFAULT_MTLS_ENDPOINT
+
+ instance_path = staticmethod(CloudRedisClient.instance_path)
+ parse_instance_path = staticmethod(CloudRedisClient.parse_instance_path)
+ common_billing_account_path = staticmethod(
+ CloudRedisClient.common_billing_account_path
+ )
+ parse_common_billing_account_path = staticmethod(
+ CloudRedisClient.parse_common_billing_account_path
+ )
+ common_folder_path = staticmethod(CloudRedisClient.common_folder_path)
+ parse_common_folder_path = staticmethod(CloudRedisClient.parse_common_folder_path)
+ common_organization_path = staticmethod(CloudRedisClient.common_organization_path)
+ parse_common_organization_path = staticmethod(
+ CloudRedisClient.parse_common_organization_path
+ )
+ common_project_path = staticmethod(CloudRedisClient.common_project_path)
+ parse_common_project_path = staticmethod(CloudRedisClient.parse_common_project_path)
+ common_location_path = staticmethod(CloudRedisClient.common_location_path)
+ parse_common_location_path = staticmethod(
+ CloudRedisClient.parse_common_location_path
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ CloudRedisAsyncClient: The constructed client.
+ """
+ return CloudRedisClient.from_service_account_info.__func__(CloudRedisAsyncClient, info, *args, **kwargs) # type: ignore
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ CloudRedisAsyncClient: The constructed client.
+ """
+ return CloudRedisClient.from_service_account_file.__func__(CloudRedisAsyncClient, filename, *args, **kwargs) # type: ignore
+
+ from_service_account_json = from_service_account_file
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ return CloudRedisClient.get_mtls_endpoint_and_cert_source(client_options) # type: ignore
+
+ @property
+ def transport(self) -> CloudRedisTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ CloudRedisTransport: The transport used by the client instance.
+ """
+ return self._client.transport
+
+ get_transport_class = functools.partial(
+ type(CloudRedisClient).get_transport_class, type(CloudRedisClient)
+ )
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Union[str, CloudRedisTransport] = "grpc_asyncio",
+ client_options: Optional[ClientOptions] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the cloud redis client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, ~.CloudRedisTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (ClientOptions): Custom options for the client. It
+ won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+
+ Raises:
+ google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ self._client = CloudRedisClient(
+ credentials=credentials,
+ transport=transport,
+ client_options=client_options,
+ client_info=client_info,
+ )
+
+ async def list_instances(
+ self,
+ request: Optional[Union[cloud_redis.ListInstancesRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListInstancesAsyncPager:
+ r"""Lists all Redis instances owned by a project in either the
+ specified location (region) or all locations.
+
+ The location should have the following format:
+
+ - ``projects/{project_id}/locations/{location_id}``
+
+ If ``location_id`` is specified as ``-`` (wildcard), then all
+ regions available to the project are queried, and the results
+ are aggregated.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ async def sample_list_instances():
+ # Create a client
+ client = redis_v1beta1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ request = redis_v1beta1.ListInstancesRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_instances(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1beta1.types.ListInstancesRequest, dict]]):
+ The request object. Request for
+ [ListInstances][google.cloud.redis.v1beta1.CloudRedis.ListInstances].
+ parent (:class:`str`):
+ Required. The resource name of the instance location
+ using the form:
+ ``projects/{project_id}/locations/{location_id}`` where
+ ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.redis_v1beta1.services.cloud_redis.pagers.ListInstancesAsyncPager:
+ Response for
+ [ListInstances][google.cloud.redis.v1beta1.CloudRedis.ListInstances].
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.ListInstancesRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_instances,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListInstancesAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_instance(
+ self,
+ request: Optional[Union[cloud_redis.GetInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> cloud_redis.Instance:
+ r"""Gets the details of a specific Redis instance.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ async def sample_get_instance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ request = redis_v1beta1.GetInstanceRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_instance(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1beta1.types.GetInstanceRequest, dict]]):
+ The request object. Request for
+ [GetInstance][google.cloud.redis.v1beta1.CloudRedis.GetInstance].
+ name (:class:`str`):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.redis_v1beta1.types.Instance:
+ A Memorystore for Redis instance.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.GetInstanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_instance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_instance_auth_string(
+ self,
+ request: Optional[Union[cloud_redis.GetInstanceAuthStringRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> cloud_redis.InstanceAuthString:
+        r"""Gets the AUTH string for a Redis instance. If AUTH is
+        not enabled for the instance, the response will be empty.
+        This information is not included in the details returned
+        by GetInstance.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ async def sample_get_instance_auth_string():
+ # Create a client
+ client = redis_v1beta1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ request = redis_v1beta1.GetInstanceAuthStringRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_instance_auth_string(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1beta1.types.GetInstanceAuthStringRequest, dict]]):
+ The request object. Request for
+ [GetInstanceAuthString][google.cloud.redis.v1beta1.CloudRedis.GetInstanceAuthString].
+ name (:class:`str`):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.redis_v1beta1.types.InstanceAuthString:
+ Instance AUTH string details.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.GetInstanceAuthStringRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_instance_auth_string,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def create_instance(
+ self,
+ request: Optional[Union[cloud_redis.CreateInstanceRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ instance_id: Optional[str] = None,
+ instance: Optional[cloud_redis.Instance] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Creates a Redis instance based on the specified tier and memory
+ size.
+
+ By default, the instance is accessible from the project's
+ `default network <https://cloud.google.com/vpc/docs/vpc>`__.
+
+ The creation is executed asynchronously and callers may check
+ the returned operation to track its progress. Once the operation
+ is completed the Redis instance will be fully functional. The
+ completed longrunning.Operation will contain the new instance
+ object in the response field.
+
+ The returned operation is automatically deleted after a few
+ hours, so there is no need to call DeleteOperation.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ async def sample_create_instance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ instance = redis_v1beta1.Instance()
+ instance.name = "name_value"
+ instance.tier = "STANDARD_HA"
+ instance.memory_size_gb = 1499
+
+ request = redis_v1beta1.CreateInstanceRequest(
+ parent="parent_value",
+ instance_id="instance_id_value",
+ instance=instance,
+ )
+
+ # Make the request
+ operation = client.create_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1beta1.types.CreateInstanceRequest, dict]]):
+ The request object. Request for
+ [CreateInstance][google.cloud.redis.v1beta1.CloudRedis.CreateInstance].
+ parent (:class:`str`):
+ Required. The resource name of the instance location
+ using the form:
+ ``projects/{project_id}/locations/{location_id}`` where
+ ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ instance_id (:class:`str`):
+ Required. The logical name of the Redis instance in the
+ customer project with the following restrictions:
+
+ - Must contain only lowercase letters, numbers, and
+ hyphens.
+ - Must start with a letter.
+ - Must be between 1-40 characters.
+ - Must end with a number or a letter.
+ - Must be unique within the customer project / location
+
+ This corresponds to the ``instance_id`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ instance (:class:`google.cloud.redis_v1beta1.types.Instance`):
+ Required. A Redis [Instance] resource
+ This corresponds to the ``instance`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1beta1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent, instance_id, instance])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.CreateInstanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+ if instance_id is not None:
+ request.instance_id = instance_id
+ if instance is not None:
+ request.instance = instance
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.create_instance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=any_pb2.Any,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def update_instance(
+ self,
+ request: Optional[Union[cloud_redis.UpdateInstanceRequest, dict]] = None,
+ *,
+ update_mask: Optional[field_mask_pb2.FieldMask] = None,
+ instance: Optional[cloud_redis.Instance] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Updates the metadata and configuration of a specific
+ Redis instance.
+ Completed longrunning.Operation will contain the new
+ instance object in the response field. The returned
+ operation is automatically deleted after a few hours, so
+ there is no need to call DeleteOperation.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ async def sample_update_instance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ instance = redis_v1beta1.Instance()
+ instance.name = "name_value"
+ instance.tier = "STANDARD_HA"
+ instance.memory_size_gb = 1499
+
+ request = redis_v1beta1.UpdateInstanceRequest(
+ instance=instance,
+ )
+
+ # Make the request
+ operation = client.update_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1beta1.types.UpdateInstanceRequest, dict]]):
+ The request object. Request for
+ [UpdateInstance][google.cloud.redis.v1beta1.CloudRedis.UpdateInstance].
+ update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`):
+ Required. Mask of fields to update. At least one path
+ must be supplied in this field. The elements of the
+ repeated paths field may only include these fields from
+ [Instance][google.cloud.redis.v1beta1.Instance]:
+
+ - ``displayName``
+ - ``labels``
+ - ``memorySizeGb``
+ - ``redisConfig``
+ - ``replica_count``
+
+ This corresponds to the ``update_mask`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ instance (:class:`google.cloud.redis_v1beta1.types.Instance`):
+ Required. Update description. Only fields specified in
+ update_mask are updated.
+
+ This corresponds to the ``instance`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1beta1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([update_mask, instance])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.UpdateInstanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if update_mask is not None:
+ request.update_mask = update_mask
+ if instance is not None:
+ request.instance = instance
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.update_instance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("instance.name", request.instance.name),)
+ ),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=any_pb2.Any,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def upgrade_instance(
+ self,
+ request: Optional[Union[cloud_redis.UpgradeInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ redis_version: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Upgrades Redis instance to the newer Redis version
+ specified in the request.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ async def sample_upgrade_instance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ request = redis_v1beta1.UpgradeInstanceRequest(
+ name="name_value",
+ redis_version="redis_version_value",
+ )
+
+ # Make the request
+ operation = client.upgrade_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1beta1.types.UpgradeInstanceRequest, dict]]):
+ The request object. Request for
+ [UpgradeInstance][google.cloud.redis.v1beta1.CloudRedis.UpgradeInstance].
+ name (:class:`str`):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ redis_version (:class:`str`):
+ Required. Specifies the target
+ version of Redis software to upgrade to.
+
+ This corresponds to the ``redis_version`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1beta1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, redis_version])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.UpgradeInstanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if redis_version is not None:
+ request.redis_version = redis_version
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.upgrade_instance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=any_pb2.Any,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def import_instance(
+ self,
+ request: Optional[Union[cloud_redis.ImportInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ input_config: Optional[cloud_redis.InputConfig] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Import a Redis RDB snapshot file from Cloud Storage
+ into a Redis instance.
+ Redis may stop serving during this operation. Instance
+        state will be IMPORTING for the entire operation. When
+ complete, the instance will contain only data from the
+ imported file.
+
+ The returned operation is automatically deleted after a
+ few hours, so there is no need to call DeleteOperation.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ async def sample_import_instance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ input_config = redis_v1beta1.InputConfig()
+ input_config.gcs_source.uri = "uri_value"
+
+ request = redis_v1beta1.ImportInstanceRequest(
+ name="name_value",
+ input_config=input_config,
+ )
+
+ # Make the request
+ operation = client.import_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1beta1.types.ImportInstanceRequest, dict]]):
+ The request object. Request for
+ [Import][google.cloud.redis.v1beta1.CloudRedis.ImportInstance].
+ name (:class:`str`):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ input_config (:class:`google.cloud.redis_v1beta1.types.InputConfig`):
+ Required. Specify data to be
+ imported.
+
+ This corresponds to the ``input_config`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1beta1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, input_config])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.ImportInstanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if input_config is not None:
+ request.input_config = input_config
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.import_instance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=any_pb2.Any,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def export_instance(
+ self,
+ request: Optional[Union[cloud_redis.ExportInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ output_config: Optional[cloud_redis.OutputConfig] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Export Redis instance data into a Redis RDB format
+ file in Cloud Storage.
+ Redis will continue serving during this operation.
+ The returned operation is automatically deleted after a
+ few hours, so there is no need to call DeleteOperation.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ async def sample_export_instance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ output_config = redis_v1beta1.OutputConfig()
+ output_config.gcs_destination.uri = "uri_value"
+
+ request = redis_v1beta1.ExportInstanceRequest(
+ name="name_value",
+ output_config=output_config,
+ )
+
+ # Make the request
+ operation = client.export_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1beta1.types.ExportInstanceRequest, dict]]):
+ The request object. Request for
+ [Export][google.cloud.redis.v1beta1.CloudRedis.ExportInstance].
+ name (:class:`str`):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ output_config (:class:`google.cloud.redis_v1beta1.types.OutputConfig`):
+ Required. Specify data to be
+ exported.
+
+ This corresponds to the ``output_config`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1beta1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, output_config])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.ExportInstanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if output_config is not None:
+ request.output_config = output_config
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.export_instance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=any_pb2.Any,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def failover_instance(
+ self,
+ request: Optional[Union[cloud_redis.FailoverInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ data_protection_mode: Optional[
+ cloud_redis.FailoverInstanceRequest.DataProtectionMode
+ ] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Initiates a failover of the primary node to current
+ replica node for a specific STANDARD tier Cloud
+ Memorystore for Redis instance.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ async def sample_failover_instance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ request = redis_v1beta1.FailoverInstanceRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.failover_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1beta1.types.FailoverInstanceRequest, dict]]):
+ The request object. Request for
+ [Failover][google.cloud.redis.v1beta1.CloudRedis.FailoverInstance].
+ name (:class:`str`):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ data_protection_mode (:class:`google.cloud.redis_v1beta1.types.FailoverInstanceRequest.DataProtectionMode`):
+ Optional. Available data protection modes that the user
+ can choose. If it's unspecified, data protection mode
+ will be LIMITED_DATA_LOSS by default.
+
+ This corresponds to the ``data_protection_mode`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1beta1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, data_protection_mode])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.FailoverInstanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if data_protection_mode is not None:
+ request.data_protection_mode = data_protection_mode
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.failover_instance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=any_pb2.Any,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def delete_instance(
+ self,
+ request: Optional[Union[cloud_redis.DeleteInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Deletes a specific Redis instance. Instance stops
+ serving and data is deleted.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ async def sample_delete_instance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ request = redis_v1beta1.DeleteInstanceRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.delete_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1beta1.types.DeleteInstanceRequest, dict]]):
+ The request object. Request for
+ [DeleteInstance][google.cloud.redis.v1beta1.CloudRedis.DeleteInstance].
+ name (:class:`str`):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.protobuf.empty_pb2.Empty` A generic empty message that you can re-use to avoid defining duplicated
+ empty messages in your APIs. A typical example is to
+ use it as the request or the response type of an API
+ method. For instance:
+
+ service Foo {
+ rpc Bar(google.protobuf.Empty) returns
+ (google.protobuf.Empty);
+
+ }
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.DeleteInstanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.delete_instance,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ empty_pb2.Empty,
+ metadata_type=any_pb2.Any,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def reschedule_maintenance(
+ self,
+ request: Optional[Union[cloud_redis.RescheduleMaintenanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ reschedule_type: Optional[
+ cloud_redis.RescheduleMaintenanceRequest.RescheduleType
+ ] = None,
+ schedule_time: Optional[timestamp_pb2.Timestamp] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Reschedule maintenance for a given instance in a
+ given project and location.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ async def sample_reschedule_maintenance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisAsyncClient()
+
+ # Initialize request argument(s)
+ request = redis_v1beta1.RescheduleMaintenanceRequest(
+ name="name_value",
+ reschedule_type="SPECIFIC_TIME",
+ )
+
+ # Make the request
+ operation = client.reschedule_maintenance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.redis_v1beta1.types.RescheduleMaintenanceRequest, dict]]):
+ The request object. Request for
+ [RescheduleMaintenance][google.cloud.redis.v1beta1.CloudRedis.RescheduleMaintenance].
+ name (:class:`str`):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ reschedule_type (:class:`google.cloud.redis_v1beta1.types.RescheduleMaintenanceRequest.RescheduleType`):
+ Required. If reschedule type is SPECIFIC_TIME, must set
+ up schedule_time as well.
+
+ This corresponds to the ``reschedule_type`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ schedule_time (:class:`google.protobuf.timestamp_pb2.Timestamp`):
+ Optional. Timestamp when the maintenance shall be
+ rescheduled to if reschedule_type=SPECIFIC_TIME, in RFC
+ 3339 format, for example ``2012-11-15T16:19:00.094Z``.
+
+ This corresponds to the ``schedule_time`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1beta1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, reschedule_type, schedule_time])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = cloud_redis.RescheduleMaintenanceRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if reschedule_type is not None:
+ request.reschedule_type = reschedule_type
+ if schedule_time is not None:
+ request.schedule_time = schedule_time
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.reschedule_maintenance,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=any_pb2.Any,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def __aenter__(self):
+ return self
+
+ async def __aexit__(self, exc_type, exc, tb):
+ await self.transport.close()
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("CloudRedisAsyncClient",)
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/client.py b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/client.py
@@ -0,0 +1,1941 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import os
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+ cast,
+)
+
+from google.api_core import client_options as client_options_lib
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.exceptions import MutualTLSChannelError # type: ignore
+from google.auth.transport import mtls # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.redis_v1beta1 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.api_core import operation # type: ignore
+from google.api_core import operation_async # type: ignore
+from google.protobuf import any_pb2 # type: ignore
+from google.protobuf import empty_pb2 # type: ignore
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.redis_v1beta1.services.cloud_redis import pagers
+from google.cloud.redis_v1beta1.types import cloud_redis
+
+from .transports.base import DEFAULT_CLIENT_INFO, CloudRedisTransport
+from .transports.grpc import CloudRedisGrpcTransport
+from .transports.grpc_asyncio import CloudRedisGrpcAsyncIOTransport
+from .transports.rest import CloudRedisRestTransport
+
+
+class CloudRedisClientMeta(type):
+ """Metaclass for the CloudRedis client.
+
+ This provides class-level methods for building and retrieving
+ support objects (e.g. transport) without polluting the client instance
+ objects.
+ """
+
+ _transport_registry = OrderedDict() # type: Dict[str, Type[CloudRedisTransport]]
+ _transport_registry["grpc"] = CloudRedisGrpcTransport
+ _transport_registry["grpc_asyncio"] = CloudRedisGrpcAsyncIOTransport
+ _transport_registry["rest"] = CloudRedisRestTransport
+
+ def get_transport_class(
+ cls,
+ label: Optional[str] = None,
+ ) -> Type[CloudRedisTransport]:
+ """Returns an appropriate transport class.
+
+ Args:
+ label: The name of the desired transport. If none is
+ provided, then the first transport in the registry is used.
+
+ Returns:
+ The transport class to use.
+ """
+ # If a specific transport is requested, return that one.
+ if label:
+ return cls._transport_registry[label]
+
+ # No transport is requested; return the default (that is, the first one
+ # in the dictionary).
+ return next(iter(cls._transport_registry.values()))
+
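
The registry lookup above is easy to exercise in isolation. This is a minimal standalone sketch of the same selection rule (string stand-ins replace the real transport classes; `get_transport` is a hypothetical name, not part of the generated client):

```python
from collections import OrderedDict

# Insertion order matters: the first entry is the default transport.
registry = OrderedDict()
registry["grpc"] = "CloudRedisGrpcTransport"
registry["grpc_asyncio"] = "CloudRedisGrpcAsyncIOTransport"
registry["rest"] = "CloudRedisRestTransport"

def get_transport(label=None):
    # Mirrors CloudRedisClientMeta.get_transport_class: an explicit label
    # selects that transport; otherwise the first registered one wins.
    if label:
        return registry[label]
    return next(iter(registry.values()))
```

Because `OrderedDict` preserves insertion order, `get_transport()` with no label yields the gRPC transport, which is why it is the default for generated clients.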
+
+class CloudRedisClient(metaclass=CloudRedisClientMeta):
+ """Configures and manages Cloud Memorystore for Redis instances
+
+ Google Cloud Memorystore for Redis v1beta1
+
+ The ``redis.googleapis.com`` service implements the Google Cloud
+ Memorystore for Redis API and defines the following resource model
+ for managing Redis instances:
+
+ - The service works with a collection of cloud projects, named:
+ ``/projects/*``
+ - Each project has a collection of available locations, named:
+ ``/locations/*``
+ - Each location has a collection of Redis instances, named:
+ ``/instances/*``
+ - As such, Redis instances are resources of the form:
+ ``/projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+
+    Note that ``location_id`` must refer to a GCP ``region``; for
+    example:
+
+ - ``projects/redpepper-1290/locations/us-central1/instances/my-redis``
+ """
+
+ @staticmethod
+ def _get_default_mtls_endpoint(api_endpoint):
+ """Converts api endpoint to mTLS endpoint.
+
+ Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to
+ "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively.
+ Args:
+ api_endpoint (Optional[str]): the api endpoint to convert.
+ Returns:
+ str: converted mTLS api endpoint.
+ """
+ if not api_endpoint:
+ return api_endpoint
+
+ mtls_endpoint_re = re.compile(
+ r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?"
+ )
+
+ m = mtls_endpoint_re.match(api_endpoint)
+ name, mtls, sandbox, googledomain = m.groups()
+ if mtls or not googledomain:
+ return api_endpoint
+
+ if sandbox:
+ return api_endpoint.replace(
+ "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
+ )
+
+ return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")
+
+ DEFAULT_ENDPOINT = "redis.googleapis.com"
+ DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__( # type: ignore
+ DEFAULT_ENDPOINT
+ )
+
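
The endpoint conversion performed by `_get_default_mtls_endpoint` can be reproduced standalone. This sketch applies the same regex transformation (`to_mtls_endpoint` is a hypothetical free function, not part of the generated client):

```python
import re

def to_mtls_endpoint(api_endpoint):
    # Mirror of CloudRedisClient._get_default_mtls_endpoint:
    # "*.googleapis.com"         -> "*.mtls.googleapis.com"
    # "*.sandbox.googleapis.com" -> "*.mtls.sandbox.googleapis.com"
    if not api_endpoint:
        return api_endpoint
    m = re.match(
        r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?",
        api_endpoint,
    )
    name, mtls, sandbox, googledomain = m.groups()
    if mtls or not googledomain:
        # Already an mTLS endpoint, or not a *.googleapis.com host: leave as-is.
        return api_endpoint
    if sandbox:
        return api_endpoint.replace(
            "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
        )
    return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")

print(to_mtls_endpoint("redis.googleapis.com"))  # redis.mtls.googleapis.com
```

Note that non-Google hosts (e.g. a local emulator address) pass through unchanged, because the `googledomain` group fails to match.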
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ CloudRedisClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_info(info)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ CloudRedisClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_file(filename)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ from_service_account_json = from_service_account_file
+
+ @property
+ def transport(self) -> CloudRedisTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ CloudRedisTransport: The transport used by the client
+ instance.
+ """
+ return self._transport
+
+ @staticmethod
+ def instance_path(
+ project: str,
+ location: str,
+ instance: str,
+ ) -> str:
+ """Returns a fully-qualified instance string."""
+ return "projects/{project}/locations/{location}/instances/{instance}".format(
+ project=project,
+ location=location,
+ instance=instance,
+ )
+
+ @staticmethod
+ def parse_instance_path(path: str) -> Dict[str, str]:
+ """Parses a instance path into its component segments."""
+ m = re.match(
+ r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)/instances/(?P<instance>.+?)$",
+ path,
+ )
+ return m.groupdict() if m else {}
+
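
The path-template helpers above are inverses of each other. This standalone sketch reproduces the same format string and regex to show the round trip (the free functions here are stand-ins for the `CloudRedisClient` static methods):

```python
import re

def instance_path(project, location, instance):
    # Same template as CloudRedisClient.instance_path.
    return f"projects/{project}/locations/{location}/instances/{instance}"

def parse_instance_path(path):
    # Same pattern as CloudRedisClient.parse_instance_path; returns {} when
    # the path does not match the template.
    m = re.match(
        r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)/instances/(?P<instance>.+?)$",
        path,
    )
    return m.groupdict() if m else {}

path = instance_path("redpepper-1290", "us-central1", "my-redis")
print(parse_instance_path(path))
```

The non-greedy `.+?` groups stop at each `/` separator, so parsing a well-formed path always recovers the original three segments.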
+ @staticmethod
+ def common_billing_account_path(
+ billing_account: str,
+ ) -> str:
+ """Returns a fully-qualified billing_account string."""
+ return "billingAccounts/{billing_account}".format(
+ billing_account=billing_account,
+ )
+
+ @staticmethod
+ def parse_common_billing_account_path(path: str) -> Dict[str, str]:
+ """Parse a billing_account path into its component segments."""
+ m = re.match(r"^billingAccounts/(?P<billing_account>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_folder_path(
+ folder: str,
+ ) -> str:
+ """Returns a fully-qualified folder string."""
+ return "folders/{folder}".format(
+ folder=folder,
+ )
+
+ @staticmethod
+ def parse_common_folder_path(path: str) -> Dict[str, str]:
+ """Parse a folder path into its component segments."""
+ m = re.match(r"^folders/(?P<folder>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_organization_path(
+ organization: str,
+ ) -> str:
+ """Returns a fully-qualified organization string."""
+ return "organizations/{organization}".format(
+ organization=organization,
+ )
+
+ @staticmethod
+ def parse_common_organization_path(path: str) -> Dict[str, str]:
+ """Parse a organization path into its component segments."""
+ m = re.match(r"^organizations/(?P<organization>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_project_path(
+ project: str,
+ ) -> str:
+ """Returns a fully-qualified project string."""
+ return "projects/{project}".format(
+ project=project,
+ )
+
+ @staticmethod
+ def parse_common_project_path(path: str) -> Dict[str, str]:
+ """Parse a project path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_location_path(
+ project: str,
+ location: str,
+ ) -> str:
+ """Returns a fully-qualified location string."""
+ return "projects/{project}/locations/{location}".format(
+ project=project,
+ location=location,
+ )
+
+ @staticmethod
+ def parse_common_location_path(path: str) -> Dict[str, str]:
+ """Parse a location path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[client_options_lib.ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ use_client_cert = os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false")
+ use_mtls_endpoint = os.getenv("GOOGLE_API_USE_MTLS_ENDPOINT", "auto")
+ if use_client_cert not in ("true", "false"):
+ raise ValueError(
+ "Environment variable `GOOGLE_API_USE_CLIENT_CERTIFICATE` must be either `true` or `false`"
+ )
+ if use_mtls_endpoint not in ("auto", "never", "always"):
+ raise MutualTLSChannelError(
+ "Environment variable `GOOGLE_API_USE_MTLS_ENDPOINT` must be `never`, `auto` or `always`"
+ )
+
+ # Figure out the client cert source to use.
+ client_cert_source = None
+ if use_client_cert == "true":
+ if client_options.client_cert_source:
+ client_cert_source = client_options.client_cert_source
+ elif mtls.has_default_client_cert_source():
+ client_cert_source = mtls.default_client_cert_source()
+
+ # Figure out which api endpoint to use.
+ if client_options.api_endpoint is not None:
+ api_endpoint = client_options.api_endpoint
+ elif use_mtls_endpoint == "always" or (
+ use_mtls_endpoint == "auto" and client_cert_source
+ ):
+ api_endpoint = cls.DEFAULT_MTLS_ENDPOINT
+ else:
+ api_endpoint = cls.DEFAULT_ENDPOINT
+
+ return api_endpoint, client_cert_source
+
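
The endpoint-selection order documented above can be distilled into a pure function. This is a sketch of the same decision logic with the environment lookups replaced by parameters (`resolve_endpoint` is a hypothetical helper, not part of the generated client):

```python
DEFAULT_ENDPOINT = "redis.googleapis.com"
DEFAULT_MTLS_ENDPOINT = "redis.mtls.googleapis.com"

def resolve_endpoint(use_mtls_endpoint, explicit_endpoint=None, have_client_cert=False):
    # Decision order mirrored from get_mtls_endpoint_and_cert_source:
    # 1. an explicit client_options.api_endpoint always wins;
    # 2. GOOGLE_API_USE_MTLS_ENDPOINT == "always" forces the mTLS endpoint;
    # 3. "auto" picks the mTLS endpoint only when a client cert is available;
    # 4. otherwise the regular endpoint is used.
    if use_mtls_endpoint not in ("auto", "never", "always"):
        raise ValueError(
            "use_mtls_endpoint must be `never`, `auto` or `always`"
        )
    if explicit_endpoint is not None:
        return explicit_endpoint
    if use_mtls_endpoint == "always" or (
        use_mtls_endpoint == "auto" and have_client_cert
    ):
        return DEFAULT_MTLS_ENDPOINT
    return DEFAULT_ENDPOINT
```

With the default `GOOGLE_API_USE_MTLS_ENDPOINT=auto`, a client therefore only switches to the mTLS endpoint when a client certificate is actually configured.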
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Optional[Union[str, CloudRedisTransport]] = None,
+ client_options: Optional[Union[client_options_lib.ClientOptions, dict]] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the cloud redis client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, CloudRedisTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (Optional[Union[google.api_core.client_options.ClientOptions, dict]]): Custom options for the
+ client. It won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ if isinstance(client_options, dict):
+ client_options = client_options_lib.from_dict(client_options)
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ client_options = cast(client_options_lib.ClientOptions, client_options)
+
+ api_endpoint, client_cert_source_func = self.get_mtls_endpoint_and_cert_source(
+ client_options
+ )
+
+ api_key_value = getattr(client_options, "api_key", None)
+ if api_key_value and credentials:
+ raise ValueError(
+ "client_options.api_key and credentials are mutually exclusive"
+ )
+
+ # Save or instantiate the transport.
+ # Ordinarily, we provide the transport, but allowing a custom transport
+ # instance provides an extensibility point for unusual situations.
+ if isinstance(transport, CloudRedisTransport):
+ # transport is a CloudRedisTransport instance.
+ if credentials or client_options.credentials_file or api_key_value:
+ raise ValueError(
+ "When providing a transport instance, "
+ "provide its credentials directly."
+ )
+ if client_options.scopes:
+ raise ValueError(
+ "When providing a transport instance, provide its scopes "
+ "directly."
+ )
+ self._transport = transport
+ else:
+ import google.auth._default # type: ignore
+
+ if api_key_value and hasattr(
+ google.auth._default, "get_api_key_credentials"
+ ):
+ credentials = google.auth._default.get_api_key_credentials(
+ api_key_value
+ )
+
+ Transport = type(self).get_transport_class(transport)
+ self._transport = Transport(
+ credentials=credentials,
+ credentials_file=client_options.credentials_file,
+ host=api_endpoint,
+ scopes=client_options.scopes,
+ client_cert_source_for_mtls=client_cert_source_func,
+ quota_project_id=client_options.quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=True,
+ api_audience=client_options.api_audience,
+ )
+
+ def list_instances(
+ self,
+ request: Optional[Union[cloud_redis.ListInstancesRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListInstancesPager:
+ r"""Lists all Redis instances owned by a project in either the
+ specified location (region) or all locations.
+
+ The location should have the following format:
+
+ - ``projects/{project_id}/locations/{location_id}``
+
+ If ``location_id`` is specified as ``-`` (wildcard), then all
+ regions available to the project are queried, and the results
+ are aggregated.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ def sample_list_instances():
+ # Create a client
+ client = redis_v1beta1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ request = redis_v1beta1.ListInstancesRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_instances(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1beta1.types.ListInstancesRequest, dict]):
+ The request object. Request for
+ [ListInstances][google.cloud.redis.v1beta1.CloudRedis.ListInstances].
+ parent (str):
+ Required. The resource name of the instance location
+ using the form:
+ ``projects/{project_id}/locations/{location_id}`` where
+ ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.redis_v1beta1.services.cloud_redis.pagers.ListInstancesPager:
+ Response for
+ [ListInstances][google.cloud.redis.v1beta1.CloudRedis.ListInstances].
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.ListInstancesRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.ListInstancesRequest):
+ request = cloud_redis.ListInstancesRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_instances]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListInstancesPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
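
Every generated method opens with the same guard: a full `request` object and individual flattened keyword arguments are mutually exclusive. This standalone sketch isolates that check (`check_flattened` is a hypothetical helper illustrating the pattern, not an actual function in the client):

```python
def check_flattened(request, **flattened):
    # Mirrors the per-method guard: passing both a `request` object and any
    # flattened field (e.g. `name=` or `parent=`) raises ValueError, because
    # the client would not know which value to honor.
    if request is not None and any(v is not None for v in flattened.values()):
        raise ValueError(
            "If the `request` argument is set, then none of "
            "the individual field arguments should be set."
        )

# Either style is fine on its own:
check_flattened(None, name="projects/p/locations/l/instances/i")
check_flattened({"name": "projects/p/locations/l/instances/i"})
```

In other words, callers choose one calling convention per invocation: construct the request message themselves, or let the client build it from the flattened fields.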
+ def get_instance(
+ self,
+ request: Optional[Union[cloud_redis.GetInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> cloud_redis.Instance:
+ r"""Gets the details of a specific Redis instance.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ def sample_get_instance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ request = redis_v1beta1.GetInstanceRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_instance(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1beta1.types.GetInstanceRequest, dict]):
+ The request object. Request for
+ [GetInstance][google.cloud.redis.v1beta1.CloudRedis.GetInstance].
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.redis_v1beta1.types.Instance:
+ A Memorystore for Redis instance.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.GetInstanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.GetInstanceRequest):
+ request = cloud_redis.GetInstanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_instance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_instance_auth_string(
+ self,
+ request: Optional[Union[cloud_redis.GetInstanceAuthStringRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> cloud_redis.InstanceAuthString:
+ r"""Gets the AUTH string for a Redis instance. If AUTH is
+        not enabled for the instance, the response will be empty.
+ This information is not included in the details returned
+ to GetInstance.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ def sample_get_instance_auth_string():
+ # Create a client
+ client = redis_v1beta1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ request = redis_v1beta1.GetInstanceAuthStringRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_instance_auth_string(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1beta1.types.GetInstanceAuthStringRequest, dict]):
+ The request object. Request for
+ [GetInstanceAuthString][google.cloud.redis.v1beta1.CloudRedis.GetInstanceAuthString].
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.redis_v1beta1.types.InstanceAuthString:
+ Instance AUTH string details.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.GetInstanceAuthStringRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.GetInstanceAuthStringRequest):
+ request = cloud_redis.GetInstanceAuthStringRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_instance_auth_string]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def create_instance(
+ self,
+ request: Optional[Union[cloud_redis.CreateInstanceRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ instance_id: Optional[str] = None,
+ instance: Optional[cloud_redis.Instance] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Creates a Redis instance based on the specified tier and memory
+ size.
+
+ By default, the instance is accessible from the project's
+ `default network <https://cloud.google.com/vpc/docs/vpc>`__.
+
+ The creation is executed asynchronously and callers may check
+ the returned operation to track its progress. Once the operation
+ is completed the Redis instance will be fully functional. The
+ completed longrunning.Operation will contain the new instance
+ object in the response field.
+
+ The returned operation is automatically deleted after a few
+ hours, so there is no need to call DeleteOperation.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ def sample_create_instance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ instance = redis_v1beta1.Instance()
+ instance.name = "name_value"
+ instance.tier = "STANDARD_HA"
+ instance.memory_size_gb = 1499
+
+ request = redis_v1beta1.CreateInstanceRequest(
+ parent="parent_value",
+ instance_id="instance_id_value",
+ instance=instance,
+ )
+
+ # Make the request
+ operation = client.create_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1beta1.types.CreateInstanceRequest, dict]):
+ The request object. Request for
+ [CreateInstance][google.cloud.redis.v1beta1.CloudRedis.CreateInstance].
+ parent (str):
+ Required. The resource name of the instance location
+ using the form:
+ ``projects/{project_id}/locations/{location_id}`` where
+ ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ instance_id (str):
+ Required. The logical name of the Redis instance in the
+ customer project with the following restrictions:
+
+ - Must contain only lowercase letters, numbers, and
+ hyphens.
+ - Must start with a letter.
+              - Must be between 1 and 40 characters.
+ - Must end with a number or a letter.
+ - Must be unique within the customer project / location
+
+ This corresponds to the ``instance_id`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ instance (google.cloud.redis_v1beta1.types.Instance):
+                Required. A Redis [Instance] resource.
+ This corresponds to the ``instance`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1beta1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent, instance_id, instance])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.CreateInstanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.CreateInstanceRequest):
+ request = cloud_redis.CreateInstanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+ if instance_id is not None:
+ request.instance_id = instance_id
+ if instance is not None:
+ request.instance = instance
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.create_instance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=any_pb2.Any,
+ )
+
+ # Done; return the response.
+ return response
+
+ def update_instance(
+ self,
+ request: Optional[Union[cloud_redis.UpdateInstanceRequest, dict]] = None,
+ *,
+ update_mask: Optional[field_mask_pb2.FieldMask] = None,
+ instance: Optional[cloud_redis.Instance] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Updates the metadata and configuration of a specific
+ Redis instance.
+ Completed longrunning.Operation will contain the new
+ instance object in the response field. The returned
+ operation is automatically deleted after a few hours, so
+ there is no need to call DeleteOperation.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ def sample_update_instance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ instance = redis_v1beta1.Instance()
+ instance.name = "name_value"
+ instance.tier = "STANDARD_HA"
+ instance.memory_size_gb = 1499
+
+ request = redis_v1beta1.UpdateInstanceRequest(
+ instance=instance,
+ )
+
+ # Make the request
+ operation = client.update_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1beta1.types.UpdateInstanceRequest, dict]):
+ The request object. Request for
+ [UpdateInstance][google.cloud.redis.v1beta1.CloudRedis.UpdateInstance].
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Required. Mask of fields to update. At least one path
+ must be supplied in this field. The elements of the
+ repeated paths field may only include these fields from
+ [Instance][google.cloud.redis.v1beta1.Instance]:
+
+ - ``displayName``
+ - ``labels``
+ - ``memorySizeGb``
+ - ``redisConfig``
+ - ``replica_count``
+
+ This corresponds to the ``update_mask`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ instance (google.cloud.redis_v1beta1.types.Instance):
+ Required. Update description. Only fields specified in
+ update_mask are updated.
+
+ This corresponds to the ``instance`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1beta1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([update_mask, instance])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.UpdateInstanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.UpdateInstanceRequest):
+ request = cloud_redis.UpdateInstanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if update_mask is not None:
+ request.update_mask = update_mask
+ if instance is not None:
+ request.instance = instance
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.update_instance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("instance.name", request.instance.name),)
+ ),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=any_pb2.Any,
+ )
+
+ # Done; return the response.
+ return response
+
+ def upgrade_instance(
+ self,
+ request: Optional[Union[cloud_redis.UpgradeInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ redis_version: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Upgrades Redis instance to the newer Redis version
+ specified in the request.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ def sample_upgrade_instance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ request = redis_v1beta1.UpgradeInstanceRequest(
+ name="name_value",
+ redis_version="redis_version_value",
+ )
+
+ # Make the request
+ operation = client.upgrade_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1beta1.types.UpgradeInstanceRequest, dict]):
+ The request object. Request for
+ [UpgradeInstance][google.cloud.redis.v1beta1.CloudRedis.UpgradeInstance].
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ redis_version (str):
+ Required. Specifies the target
+ version of Redis software to upgrade to.
+
+ This corresponds to the ``redis_version`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1beta1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, redis_version])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.UpgradeInstanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.UpgradeInstanceRequest):
+ request = cloud_redis.UpgradeInstanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if redis_version is not None:
+ request.redis_version = redis_version
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.upgrade_instance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=any_pb2.Any,
+ )
+
+ # Done; return the response.
+ return response
+
+ def import_instance(
+ self,
+ request: Optional[Union[cloud_redis.ImportInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ input_config: Optional[cloud_redis.InputConfig] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Import a Redis RDB snapshot file from Cloud Storage
+ into a Redis instance.
+ Redis may stop serving during this operation. Instance
+        state will be IMPORTING for the entire operation. When
+ complete, the instance will contain only data from the
+ imported file.
+
+ The returned operation is automatically deleted after a
+ few hours, so there is no need to call DeleteOperation.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ def sample_import_instance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ input_config = redis_v1beta1.InputConfig()
+ input_config.gcs_source.uri = "uri_value"
+
+ request = redis_v1beta1.ImportInstanceRequest(
+ name="name_value",
+ input_config=input_config,
+ )
+
+ # Make the request
+ operation = client.import_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1beta1.types.ImportInstanceRequest, dict]):
+ The request object. Request for
+ [Import][google.cloud.redis.v1beta1.CloudRedis.ImportInstance].
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ input_config (google.cloud.redis_v1beta1.types.InputConfig):
+ Required. Specify data to be
+ imported.
+
+ This corresponds to the ``input_config`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1beta1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, input_config])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.ImportInstanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.ImportInstanceRequest):
+ request = cloud_redis.ImportInstanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if input_config is not None:
+ request.input_config = input_config
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.import_instance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=any_pb2.Any,
+ )
+
+ # Done; return the response.
+ return response
+
+ def export_instance(
+ self,
+ request: Optional[Union[cloud_redis.ExportInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ output_config: Optional[cloud_redis.OutputConfig] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Export Redis instance data into a Redis RDB format
+ file in Cloud Storage.
+ Redis will continue serving during this operation.
+ The returned operation is automatically deleted after a
+ few hours, so there is no need to call DeleteOperation.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ def sample_export_instance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ output_config = redis_v1beta1.OutputConfig()
+ output_config.gcs_destination.uri = "uri_value"
+
+ request = redis_v1beta1.ExportInstanceRequest(
+ name="name_value",
+ output_config=output_config,
+ )
+
+ # Make the request
+ operation = client.export_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1beta1.types.ExportInstanceRequest, dict]):
+ The request object. Request for
+ [Export][google.cloud.redis.v1beta1.CloudRedis.ExportInstance].
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ output_config (google.cloud.redis_v1beta1.types.OutputConfig):
+ Required. Specify data to be
+ exported.
+
+ This corresponds to the ``output_config`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1beta1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, output_config])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.ExportInstanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.ExportInstanceRequest):
+ request = cloud_redis.ExportInstanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if output_config is not None:
+ request.output_config = output_config
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.export_instance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=any_pb2.Any,
+ )
+
+ # Done; return the response.
+ return response
+
+ def failover_instance(
+ self,
+ request: Optional[Union[cloud_redis.FailoverInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ data_protection_mode: Optional[
+ cloud_redis.FailoverInstanceRequest.DataProtectionMode
+ ] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Initiates a failover of the primary node to current
+ replica node for a specific STANDARD tier Cloud
+ Memorystore for Redis instance.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ def sample_failover_instance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ request = redis_v1beta1.FailoverInstanceRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.failover_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1beta1.types.FailoverInstanceRequest, dict]):
+ The request object. Request for
+ [Failover][google.cloud.redis.v1beta1.CloudRedis.FailoverInstance].
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ data_protection_mode (google.cloud.redis_v1beta1.types.FailoverInstanceRequest.DataProtectionMode):
+ Optional. Available data protection modes that the user
+ can choose. If it's unspecified, data protection mode
+ will be LIMITED_DATA_LOSS by default.
+
+ This corresponds to the ``data_protection_mode`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1beta1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, data_protection_mode])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.FailoverInstanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.FailoverInstanceRequest):
+ request = cloud_redis.FailoverInstanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if data_protection_mode is not None:
+ request.data_protection_mode = data_protection_mode
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.failover_instance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=any_pb2.Any,
+ )
+
+ # Done; return the response.
+ return response
+
+ def delete_instance(
+ self,
+ request: Optional[Union[cloud_redis.DeleteInstanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Deletes a specific Redis instance. Instance stops
+ serving and data is deleted.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ def sample_delete_instance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ request = redis_v1beta1.DeleteInstanceRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.delete_instance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1beta1.types.DeleteInstanceRequest, dict]):
+ The request object. Request for
+ [DeleteInstance][google.cloud.redis.v1beta1.CloudRedis.DeleteInstance].
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.protobuf.empty_pb2.Empty` A generic empty message that you can re-use to avoid defining duplicated
+ empty messages in your APIs. A typical example is to
+ use it as the request or the response type of an API
+ method. For instance:
+
+ service Foo {
+ rpc Bar(google.protobuf.Empty) returns
+ (google.protobuf.Empty);
+
+ }
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.DeleteInstanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.DeleteInstanceRequest):
+ request = cloud_redis.DeleteInstanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.delete_instance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ empty_pb2.Empty,
+ metadata_type=any_pb2.Any,
+ )
+
+ # Done; return the response.
+ return response
+
+ def reschedule_maintenance(
+ self,
+ request: Optional[Union[cloud_redis.RescheduleMaintenanceRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ reschedule_type: Optional[
+ cloud_redis.RescheduleMaintenanceRequest.RescheduleType
+ ] = None,
+ schedule_time: Optional[timestamp_pb2.Timestamp] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Reschedule maintenance for a given instance in a
+ given project and location.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import redis_v1beta1
+
+ def sample_reschedule_maintenance():
+ # Create a client
+ client = redis_v1beta1.CloudRedisClient()
+
+ # Initialize request argument(s)
+ request = redis_v1beta1.RescheduleMaintenanceRequest(
+ name="name_value",
+ reschedule_type="SPECIFIC_TIME",
+ )
+
+ # Make the request
+ operation = client.reschedule_maintenance(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.redis_v1beta1.types.RescheduleMaintenanceRequest, dict]):
+ The request object. Request for
+ [RescheduleMaintenance][google.cloud.redis.v1beta1.CloudRedis.RescheduleMaintenance].
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ reschedule_type (google.cloud.redis_v1beta1.types.RescheduleMaintenanceRequest.RescheduleType):
+ Required. If reschedule type is SPECIFIC_TIME, must set
+ up schedule_time as well.
+
+ This corresponds to the ``reschedule_type`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ schedule_time (google.protobuf.timestamp_pb2.Timestamp):
+ Optional. Timestamp when the maintenance shall be
+ rescheduled to if reschedule_type=SPECIFIC_TIME, in RFC
+ 3339 format, for example ``2012-11-15T16:19:00.094Z``.
+
+ This corresponds to the ``schedule_time`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.redis_v1beta1.types.Instance` A
+ Memorystore for Redis instance.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, reschedule_type, schedule_time])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a cloud_redis.RescheduleMaintenanceRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, cloud_redis.RescheduleMaintenanceRequest):
+ request = cloud_redis.RescheduleMaintenanceRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if reschedule_type is not None:
+ request.reschedule_type = reschedule_type
+ if schedule_time is not None:
+ request.schedule_time = schedule_time
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.reschedule_maintenance]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ cloud_redis.Instance,
+ metadata_type=any_pb2.Any,
+ )
+
+ # Done; return the response.
+ return response
+
+ def __enter__(self) -> "CloudRedisClient":
+ return self
+
+ def __exit__(self, type, value, traceback):
+ """Releases underlying transport's resources.
+
+ .. warning::
+ ONLY use as a context manager if the transport is NOT shared
+ with other clients! Exiting the with block will CLOSE the transport
+ and may cause errors in other clients!
+ """
+ self.transport.close()
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("CloudRedisClient",)
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/pagers.py b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/pagers.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/pagers.py
@@ -0,0 +1,155 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import (
+ Any,
+ AsyncIterator,
+ Awaitable,
+ Callable,
+ Iterator,
+ Optional,
+ Sequence,
+ Tuple,
+)
+
+from google.cloud.redis_v1beta1.types import cloud_redis
+
+
+class ListInstancesPager:
+ """A pager for iterating through ``list_instances`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.redis_v1beta1.types.ListInstancesResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``instances`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListInstances`` requests and continue to iterate
+ through the ``instances`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.redis_v1beta1.types.ListInstancesResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., cloud_redis.ListInstancesResponse],
+ request: cloud_redis.ListInstancesRequest,
+ response: cloud_redis.ListInstancesResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.redis_v1beta1.types.ListInstancesRequest):
+ The initial request object.
+ response (google.cloud.redis_v1beta1.types.ListInstancesResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = cloud_redis.ListInstancesRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[cloud_redis.ListInstancesResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[cloud_redis.Instance]:
+ for page in self.pages:
+ yield from page.instances
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListInstancesAsyncPager:
+ """A pager for iterating through ``list_instances`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.redis_v1beta1.types.ListInstancesResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``instances`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListInstances`` requests and continue to iterate
+ through the ``instances`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.redis_v1beta1.types.ListInstancesResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[cloud_redis.ListInstancesResponse]],
+ request: cloud_redis.ListInstancesRequest,
+ response: cloud_redis.ListInstancesResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.redis_v1beta1.types.ListInstancesRequest):
+ The initial request object.
+ response (google.cloud.redis_v1beta1.types.ListInstancesResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = cloud_redis.ListInstancesRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[cloud_redis.ListInstancesResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[cloud_redis.Instance]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.instances:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/transports/__init__.py b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/transports/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/transports/__init__.py
@@ -0,0 +1,36 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+from typing import Dict, Type
+
+from .base import CloudRedisTransport
+from .grpc import CloudRedisGrpcTransport
+from .grpc_asyncio import CloudRedisGrpcAsyncIOTransport
+from .rest import CloudRedisRestInterceptor, CloudRedisRestTransport
+
+# Compile a registry of transports.
+_transport_registry = OrderedDict() # type: Dict[str, Type[CloudRedisTransport]]
+_transport_registry["grpc"] = CloudRedisGrpcTransport
+_transport_registry["grpc_asyncio"] = CloudRedisGrpcAsyncIOTransport
+_transport_registry["rest"] = CloudRedisRestTransport
+
+__all__ = (
+ "CloudRedisTransport",
+ "CloudRedisGrpcTransport",
+ "CloudRedisGrpcAsyncIOTransport",
+ "CloudRedisRestTransport",
+ "CloudRedisRestInterceptor",
+)
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/transports/base.py b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/transports/base.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/transports/base.py
@@ -0,0 +1,306 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import abc
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Union
+
+import google.api_core
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1, operations_v1
+from google.api_core import retry as retries
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.redis_v1beta1 import gapic_version as package_version
+from google.cloud.redis_v1beta1.types import cloud_redis
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+class CloudRedisTransport(abc.ABC):
+ """Abstract transport class for CloudRedis."""
+
+ AUTH_SCOPES = ("https://www.googleapis.com/auth/cloud-platform",)
+
+ DEFAULT_HOST: str = "redis.googleapis.com"
+
+ def __init__(
+ self,
+ *,
+ host: str = DEFAULT_HOST,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ **kwargs,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+ scopes (Optional[Sequence[str]]): A list of scopes.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ """
+
+ scopes_kwargs = {"scopes": scopes, "default_scopes": self.AUTH_SCOPES}
+
+ # Save the scopes.
+ self._scopes = scopes
+
+ # If no credentials are provided, then determine the appropriate
+ # defaults.
+ if credentials and credentials_file:
+ raise core_exceptions.DuplicateCredentialArgs(
+ "'credentials_file' and 'credentials' are mutually exclusive"
+ )
+
+ if credentials_file is not None:
+ credentials, _ = google.auth.load_credentials_from_file(
+ credentials_file, **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ elif credentials is None:
+ credentials, _ = google.auth.default(
+ **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ # Don't apply audience if the credentials file passed from user.
+ if hasattr(credentials, "with_gdch_audience"):
+ credentials = credentials.with_gdch_audience(
+ api_audience if api_audience else host
+ )
+
+ # If the credentials are service account credentials, then always try to use self signed JWT.
+ if (
+ always_use_jwt_access
+ and isinstance(credentials, service_account.Credentials)
+ and hasattr(service_account.Credentials, "with_always_use_jwt_access")
+ ):
+ credentials = credentials.with_always_use_jwt_access(True)
+
+ # Save the credentials.
+ self._credentials = credentials
+
+ # Save the hostname. Default to port 443 (HTTPS) if none is specified.
+ if ":" not in host:
+ host += ":443"
+ self._host = host
+
+ def _prep_wrapped_messages(self, client_info):
+ # Precompute the wrapped methods.
+ self._wrapped_methods = {
+ self.list_instances: gapic_v1.method.wrap_method(
+ self.list_instances,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.get_instance: gapic_v1.method.wrap_method(
+ self.get_instance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.get_instance_auth_string: gapic_v1.method.wrap_method(
+ self.get_instance_auth_string,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.create_instance: gapic_v1.method.wrap_method(
+ self.create_instance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.update_instance: gapic_v1.method.wrap_method(
+ self.update_instance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.upgrade_instance: gapic_v1.method.wrap_method(
+ self.upgrade_instance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.import_instance: gapic_v1.method.wrap_method(
+ self.import_instance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.export_instance: gapic_v1.method.wrap_method(
+ self.export_instance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.failover_instance: gapic_v1.method.wrap_method(
+ self.failover_instance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.delete_instance: gapic_v1.method.wrap_method(
+ self.delete_instance,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.reschedule_maintenance: gapic_v1.method.wrap_method(
+ self.reschedule_maintenance,
+ default_timeout=None,
+ client_info=client_info,
+ ),
+ }
+
+ def close(self):
+ """Closes resources associated with the transport.
+
+ .. warning::
+ Only call this method if the transport is NOT shared
+ with other clients - this may cause errors in other clients!
+ """
+ raise NotImplementedError()
+
+ @property
+ def operations_client(self):
+ """Return the client designed to process long-running operations."""
+ raise NotImplementedError()
+
+ @property
+ def list_instances(
+ self,
+ ) -> Callable[
+ [cloud_redis.ListInstancesRequest],
+ Union[
+ cloud_redis.ListInstancesResponse,
+ Awaitable[cloud_redis.ListInstancesResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.GetInstanceRequest],
+ Union[cloud_redis.Instance, Awaitable[cloud_redis.Instance]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_instance_auth_string(
+ self,
+ ) -> Callable[
+ [cloud_redis.GetInstanceAuthStringRequest],
+ Union[
+ cloud_redis.InstanceAuthString, Awaitable[cloud_redis.InstanceAuthString]
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def create_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.CreateInstanceRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def update_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.UpdateInstanceRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def upgrade_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.UpgradeInstanceRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def import_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.ImportInstanceRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def export_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.ExportInstanceRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def failover_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.FailoverInstanceRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def delete_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.DeleteInstanceRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def reschedule_maintenance(
+ self,
+ ) -> Callable[
+ [cloud_redis.RescheduleMaintenanceRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def kind(self) -> str:
+ raise NotImplementedError()
+
+
+__all__ = ("CloudRedisTransport",)
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/transports/grpc.py b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/transports/grpc.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/transports/grpc.py
@@ -0,0 +1,612 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers, operations_v1
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+import grpc # type: ignore
+
+from google.cloud.redis_v1beta1.types import cloud_redis
+
+from .base import DEFAULT_CLIENT_INFO, CloudRedisTransport
+
+
+class CloudRedisGrpcTransport(CloudRedisTransport):
+ """gRPC backend transport for CloudRedis.
+
+ Configures and manages Cloud Memorystore for Redis instances
+
+ Google Cloud Memorystore for Redis v1beta1
+
+ The ``redis.googleapis.com`` service implements the Google Cloud
+ Memorystore for Redis API and defines the following resource model
+ for managing Redis instances:
+
+ - The service works with a collection of cloud projects, named:
+ ``/projects/*``
+ - Each project has a collection of available locations, named:
+ ``/locations/*``
+ - Each location has a collection of Redis instances, named:
+ ``/instances/*``
+ - As such, Redis instances are resources of the form:
+ ``/projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+
+ Note that location_id must be referring to a GCP ``region``; for
+ example:
+
+ - ``projects/redpepper-1290/locations/us-central1/instances/my-redis``
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _stubs: Dict[str, Callable]
+
+ def __init__(
+ self,
+ *,
+ host: str = "redis.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[grpc.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+ scopes (Optional(Sequence[str])): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ channel (Optional[grpc.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+ self._operations_client: Optional[operations_v1.OperationsClient] = None
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "redis.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> grpc.Channel:
+ """Create and return a gRPC channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ grpc.Channel: A gRPC channel object.
+
+ Raises:
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+
+ return grpc_helpers.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ @property
+ def grpc_channel(self) -> grpc.Channel:
+ """Return the channel designed to connect to this service."""
+ return self._grpc_channel
+
+ @property
+ def operations_client(self) -> operations_v1.OperationsClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Quick check: Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ self._operations_client = operations_v1.OperationsClient(self.grpc_channel)
+
+ # Return the client from cache.
+ return self._operations_client
+
+ @property
+ def list_instances(
+ self,
+ ) -> Callable[
+ [cloud_redis.ListInstancesRequest], cloud_redis.ListInstancesResponse
+ ]:
+ r"""Return a callable for the list instances method over gRPC.
+
+ Lists all Redis instances owned by a project in either the
+ specified location (region) or all locations.
+
+ The location should have the following format:
+
+ - ``projects/{project_id}/locations/{location_id}``
+
+ If ``location_id`` is specified as ``-`` (wildcard), then all
+ regions available to the project are queried, and the results
+ are aggregated.
+
+ Returns:
+ Callable[[~.ListInstancesRequest],
+ ~.ListInstancesResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_instances" not in self._stubs:
+ self._stubs["list_instances"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/ListInstances",
+ request_serializer=cloud_redis.ListInstancesRequest.serialize,
+ response_deserializer=cloud_redis.ListInstancesResponse.deserialize,
+ )
+ return self._stubs["list_instances"]
+
+ @property
+ def get_instance(
+ self,
+ ) -> Callable[[cloud_redis.GetInstanceRequest], cloud_redis.Instance]:
+ r"""Return a callable for the get instance method over gRPC.
+
+ Gets the details of a specific Redis instance.
+
+ Returns:
+ Callable[[~.GetInstanceRequest],
+ ~.Instance]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_instance" not in self._stubs:
+ self._stubs["get_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/GetInstance",
+ request_serializer=cloud_redis.GetInstanceRequest.serialize,
+ response_deserializer=cloud_redis.Instance.deserialize,
+ )
+ return self._stubs["get_instance"]
+
+ @property
+ def get_instance_auth_string(
+ self,
+ ) -> Callable[
+ [cloud_redis.GetInstanceAuthStringRequest], cloud_redis.InstanceAuthString
+ ]:
+ r"""Return a callable for the get instance auth string method over gRPC.
+
+ Gets the AUTH string for a Redis instance. If AUTH is
+        not enabled for the instance, the response will be empty.
+ This information is not included in the details returned
+ to GetInstance.
+
+ Returns:
+ Callable[[~.GetInstanceAuthStringRequest],
+ ~.InstanceAuthString]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_instance_auth_string" not in self._stubs:
+ self._stubs["get_instance_auth_string"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/GetInstanceAuthString",
+ request_serializer=cloud_redis.GetInstanceAuthStringRequest.serialize,
+ response_deserializer=cloud_redis.InstanceAuthString.deserialize,
+ )
+ return self._stubs["get_instance_auth_string"]
+
+ @property
+ def create_instance(
+ self,
+ ) -> Callable[[cloud_redis.CreateInstanceRequest], operations_pb2.Operation]:
+ r"""Return a callable for the create instance method over gRPC.
+
+ Creates a Redis instance based on the specified tier and memory
+ size.
+
+ By default, the instance is accessible from the project's
+ `default network <https://cloud.google.com/vpc/docs/vpc>`__.
+
+ The creation is executed asynchronously and callers may check
+ the returned operation to track its progress. Once the operation
+        is completed, the Redis instance will be fully functional. The
+ completed longrunning.Operation will contain the new instance
+ object in the response field.
+
+ The returned operation is automatically deleted after a few
+ hours, so there is no need to call DeleteOperation.
+
+ Returns:
+ Callable[[~.CreateInstanceRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_instance" not in self._stubs:
+ self._stubs["create_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/CreateInstance",
+ request_serializer=cloud_redis.CreateInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["create_instance"]
+
+ @property
+ def update_instance(
+ self,
+ ) -> Callable[[cloud_redis.UpdateInstanceRequest], operations_pb2.Operation]:
+ r"""Return a callable for the update instance method over gRPC.
+
+ Updates the metadata and configuration of a specific
+ Redis instance.
+ Completed longrunning.Operation will contain the new
+ instance object in the response field. The returned
+ operation is automatically deleted after a few hours, so
+ there is no need to call DeleteOperation.
+
+ Returns:
+ Callable[[~.UpdateInstanceRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_instance" not in self._stubs:
+ self._stubs["update_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/UpdateInstance",
+ request_serializer=cloud_redis.UpdateInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["update_instance"]
+
+ @property
+ def upgrade_instance(
+ self,
+ ) -> Callable[[cloud_redis.UpgradeInstanceRequest], operations_pb2.Operation]:
+ r"""Return a callable for the upgrade instance method over gRPC.
+
+ Upgrades Redis instance to the newer Redis version
+ specified in the request.
+
+ Returns:
+ Callable[[~.UpgradeInstanceRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "upgrade_instance" not in self._stubs:
+ self._stubs["upgrade_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/UpgradeInstance",
+ request_serializer=cloud_redis.UpgradeInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["upgrade_instance"]
+
+ @property
+ def import_instance(
+ self,
+ ) -> Callable[[cloud_redis.ImportInstanceRequest], operations_pb2.Operation]:
+ r"""Return a callable for the import instance method over gRPC.
+
+ Import a Redis RDB snapshot file from Cloud Storage
+ into a Redis instance.
+ Redis may stop serving during this operation. Instance
+        state will be IMPORTING for the entire operation. When
+ complete, the instance will contain only data from the
+ imported file.
+
+ The returned operation is automatically deleted after a
+ few hours, so there is no need to call DeleteOperation.
+
+ Returns:
+ Callable[[~.ImportInstanceRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "import_instance" not in self._stubs:
+ self._stubs["import_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/ImportInstance",
+ request_serializer=cloud_redis.ImportInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["import_instance"]
+
+ @property
+ def export_instance(
+ self,
+ ) -> Callable[[cloud_redis.ExportInstanceRequest], operations_pb2.Operation]:
+ r"""Return a callable for the export instance method over gRPC.
+
+ Export Redis instance data into a Redis RDB format
+ file in Cloud Storage.
+ Redis will continue serving during this operation.
+ The returned operation is automatically deleted after a
+ few hours, so there is no need to call DeleteOperation.
+
+ Returns:
+ Callable[[~.ExportInstanceRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "export_instance" not in self._stubs:
+ self._stubs["export_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/ExportInstance",
+ request_serializer=cloud_redis.ExportInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["export_instance"]
+
+ @property
+ def failover_instance(
+ self,
+ ) -> Callable[[cloud_redis.FailoverInstanceRequest], operations_pb2.Operation]:
+ r"""Return a callable for the failover instance method over gRPC.
+
+        Initiates a failover of the primary node to the current
+ replica node for a specific STANDARD tier Cloud
+ Memorystore for Redis instance.
+
+ Returns:
+ Callable[[~.FailoverInstanceRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "failover_instance" not in self._stubs:
+ self._stubs["failover_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/FailoverInstance",
+ request_serializer=cloud_redis.FailoverInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["failover_instance"]
+
+ @property
+ def delete_instance(
+ self,
+ ) -> Callable[[cloud_redis.DeleteInstanceRequest], operations_pb2.Operation]:
+ r"""Return a callable for the delete instance method over gRPC.
+
+ Deletes a specific Redis instance. Instance stops
+ serving and data is deleted.
+
+ Returns:
+ Callable[[~.DeleteInstanceRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_instance" not in self._stubs:
+ self._stubs["delete_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/DeleteInstance",
+ request_serializer=cloud_redis.DeleteInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["delete_instance"]
+
+ @property
+ def reschedule_maintenance(
+ self,
+ ) -> Callable[[cloud_redis.RescheduleMaintenanceRequest], operations_pb2.Operation]:
+ r"""Return a callable for the reschedule maintenance method over gRPC.
+
+ Reschedule maintenance for a given instance in a
+ given project and location.
+
+ Returns:
+ Callable[[~.RescheduleMaintenanceRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "reschedule_maintenance" not in self._stubs:
+ self._stubs["reschedule_maintenance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/RescheduleMaintenance",
+ request_serializer=cloud_redis.RescheduleMaintenanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["reschedule_maintenance"]
+
+ def close(self):
+ self.grpc_channel.close()
+
+ @property
+ def kind(self) -> str:
+ return "grpc"
+
+
+__all__ = ("CloudRedisGrpcTransport",)
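Every per-RPC property in the transport above follows the same lazy-caching idiom: on first access it creates a stub via `grpc_channel.unary_unary(...)` and stores it in `self._stubs`, so repeated accesses return the same callable without rebuilding it. A minimal self-contained sketch of that idiom, with a stand-in channel instead of a real gRPC connection (`FakeChannel` and the method path handling here are illustrative, not part of the library):

```python
class FakeChannel:
    """Stand-in for a gRPC channel; counts how many stubs it builds."""

    def __init__(self):
        self.created = 0

    def unary_unary(self, method_path):
        # A real grpc.Channel would return a serializing RPC callable here.
        self.created += 1
        return lambda request: f"called {method_path} with {request!r}"


class Transport:
    def __init__(self, channel):
        self.grpc_channel = channel
        self._stubs = {}

    @property
    def list_instances(self):
        # Create the stub on first access, then serve it from the cache,
        # mirroring the generated CloudRedis transport properties above.
        if "list_instances" not in self._stubs:
            self._stubs["list_instances"] = self.grpc_channel.unary_unary(
                "/google.cloud.redis.v1beta1.CloudRedis/ListInstances"
            )
        return self._stubs["list_instances"]


channel = FakeChannel()
transport = Transport(channel)
first = transport.list_instances
second = transport.list_instances
assert first is second        # cached: the same callable every time
assert channel.created == 1   # the channel only built one stub
```

The `operations_client` and `grpc_channel` properties cache the same way, just with a single instance attribute instead of a dictionary keyed by method name.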
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/transports/grpc_asyncio.py b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/transports/grpc_asyncio.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/transports/grpc_asyncio.py
@@ -0,0 +1,630 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers_async, operations_v1
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+import grpc # type: ignore
+from grpc.experimental import aio # type: ignore
+
+from google.cloud.redis_v1beta1.types import cloud_redis
+
+from .base import DEFAULT_CLIENT_INFO, CloudRedisTransport
+from .grpc import CloudRedisGrpcTransport
+
+
+class CloudRedisGrpcAsyncIOTransport(CloudRedisTransport):
+ """gRPC AsyncIO backend transport for CloudRedis.
+
+ Configures and manages Cloud Memorystore for Redis instances
+
+ Google Cloud Memorystore for Redis v1beta1
+
+ The ``redis.googleapis.com`` service implements the Google Cloud
+ Memorystore for Redis API and defines the following resource model
+ for managing Redis instances:
+
+ - The service works with a collection of cloud projects, named:
+ ``/projects/*``
+ - Each project has a collection of available locations, named:
+ ``/locations/*``
+ - Each location has a collection of Redis instances, named:
+ ``/instances/*``
+ - As such, Redis instances are resources of the form:
+ ``/projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+
+ Note that location_id must be referring to a GCP ``region``; for
+ example:
+
+ - ``projects/redpepper-1290/locations/us-central1/instances/my-redis``
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _grpc_channel: aio.Channel
+ _stubs: Dict[str, Callable] = {}
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "redis.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> aio.Channel:
+ """Create and return a gRPC AsyncIO channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ aio.Channel: A gRPC AsyncIO channel object.
+ """
+
+ return grpc_helpers_async.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ def __init__(
+ self,
+ *,
+ host: str = "redis.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[aio.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ channel (Optional[aio.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+ self._operations_client: Optional[operations_v1.OperationsAsyncClient] = None
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def grpc_channel(self) -> aio.Channel:
+        """Return the channel designed to connect to this service.
+
+ This property caches on the instance; repeated calls return
+ the same channel.
+ """
+ # Return the channel from cache.
+ return self._grpc_channel
+
+ @property
+ def operations_client(self) -> operations_v1.OperationsAsyncClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Quick check: Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ self._operations_client = operations_v1.OperationsAsyncClient(
+ self.grpc_channel
+ )
+
+ # Return the client from cache.
+ return self._operations_client
+
+ @property
+ def list_instances(
+ self,
+ ) -> Callable[
+ [cloud_redis.ListInstancesRequest], Awaitable[cloud_redis.ListInstancesResponse]
+ ]:
+ r"""Return a callable for the list instances method over gRPC.
+
+ Lists all Redis instances owned by a project in either the
+ specified location (region) or all locations.
+
+ The location should have the following format:
+
+ - ``projects/{project_id}/locations/{location_id}``
+
+ If ``location_id`` is specified as ``-`` (wildcard), then all
+ regions available to the project are queried, and the results
+ are aggregated.
+
+ Returns:
+ Callable[[~.ListInstancesRequest],
+ Awaitable[~.ListInstancesResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_instances" not in self._stubs:
+ self._stubs["list_instances"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/ListInstances",
+ request_serializer=cloud_redis.ListInstancesRequest.serialize,
+ response_deserializer=cloud_redis.ListInstancesResponse.deserialize,
+ )
+ return self._stubs["list_instances"]
+
+ @property
+ def get_instance(
+ self,
+ ) -> Callable[[cloud_redis.GetInstanceRequest], Awaitable[cloud_redis.Instance]]:
+ r"""Return a callable for the get instance method over gRPC.
+
+ Gets the details of a specific Redis instance.
+
+ Returns:
+ Callable[[~.GetInstanceRequest],
+ Awaitable[~.Instance]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_instance" not in self._stubs:
+ self._stubs["get_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/GetInstance",
+ request_serializer=cloud_redis.GetInstanceRequest.serialize,
+ response_deserializer=cloud_redis.Instance.deserialize,
+ )
+ return self._stubs["get_instance"]
+
+ @property
+ def get_instance_auth_string(
+ self,
+ ) -> Callable[
+ [cloud_redis.GetInstanceAuthStringRequest],
+ Awaitable[cloud_redis.InstanceAuthString],
+ ]:
+ r"""Return a callable for the get instance auth string method over gRPC.
+
+ Gets the AUTH string for a Redis instance. If AUTH is
+        not enabled for the instance, the response will be empty.
+ This information is not included in the details returned
+ to GetInstance.
+
+ Returns:
+ Callable[[~.GetInstanceAuthStringRequest],
+ Awaitable[~.InstanceAuthString]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_instance_auth_string" not in self._stubs:
+ self._stubs["get_instance_auth_string"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/GetInstanceAuthString",
+ request_serializer=cloud_redis.GetInstanceAuthStringRequest.serialize,
+ response_deserializer=cloud_redis.InstanceAuthString.deserialize,
+ )
+ return self._stubs["get_instance_auth_string"]
+
+ @property
+ def create_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.CreateInstanceRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the create instance method over gRPC.
+
+ Creates a Redis instance based on the specified tier and memory
+ size.
+
+ By default, the instance is accessible from the project's
+ `default network <https://cloud.google.com/vpc/docs/vpc>`__.
+
+ The creation is executed asynchronously and callers may check
+ the returned operation to track its progress. Once the operation
+        is completed, the Redis instance will be fully functional. The
+ completed longrunning.Operation will contain the new instance
+ object in the response field.
+
+ The returned operation is automatically deleted after a few
+ hours, so there is no need to call DeleteOperation.
+
+ Returns:
+ Callable[[~.CreateInstanceRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_instance" not in self._stubs:
+ self._stubs["create_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/CreateInstance",
+ request_serializer=cloud_redis.CreateInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["create_instance"]
+
+ @property
+ def update_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.UpdateInstanceRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the update instance method over gRPC.
+
+ Updates the metadata and configuration of a specific
+ Redis instance.
+ Completed longrunning.Operation will contain the new
+ instance object in the response field. The returned
+ operation is automatically deleted after a few hours, so
+ there is no need to call DeleteOperation.
+
+ Returns:
+ Callable[[~.UpdateInstanceRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_instance" not in self._stubs:
+ self._stubs["update_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/UpdateInstance",
+ request_serializer=cloud_redis.UpdateInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["update_instance"]
+
+ @property
+ def upgrade_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.UpgradeInstanceRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the upgrade instance method over gRPC.
+
+ Upgrades Redis instance to the newer Redis version
+ specified in the request.
+
+ Returns:
+ Callable[[~.UpgradeInstanceRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "upgrade_instance" not in self._stubs:
+ self._stubs["upgrade_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/UpgradeInstance",
+ request_serializer=cloud_redis.UpgradeInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["upgrade_instance"]
+
+ @property
+ def import_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.ImportInstanceRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the import instance method over gRPC.
+
+ Import a Redis RDB snapshot file from Cloud Storage
+ into a Redis instance.
+ Redis may stop serving during this operation. Instance
+        state will be IMPORTING for the entire operation. When
+ complete, the instance will contain only data from the
+ imported file.
+
+ The returned operation is automatically deleted after a
+ few hours, so there is no need to call DeleteOperation.
+
+ Returns:
+ Callable[[~.ImportInstanceRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "import_instance" not in self._stubs:
+ self._stubs["import_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/ImportInstance",
+ request_serializer=cloud_redis.ImportInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["import_instance"]
+
+ @property
+ def export_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.ExportInstanceRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the export instance method over gRPC.
+
+ Export Redis instance data into a Redis RDB format
+ file in Cloud Storage.
+ Redis will continue serving during this operation.
+ The returned operation is automatically deleted after a
+ few hours, so there is no need to call DeleteOperation.
+
+ Returns:
+ Callable[[~.ExportInstanceRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "export_instance" not in self._stubs:
+ self._stubs["export_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/ExportInstance",
+ request_serializer=cloud_redis.ExportInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["export_instance"]
+
+ @property
+ def failover_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.FailoverInstanceRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the failover instance method over gRPC.
+
+        Initiates a failover of the primary node to the current
+ replica node for a specific STANDARD tier Cloud
+ Memorystore for Redis instance.
+
+ Returns:
+ Callable[[~.FailoverInstanceRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "failover_instance" not in self._stubs:
+ self._stubs["failover_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/FailoverInstance",
+ request_serializer=cloud_redis.FailoverInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["failover_instance"]
+
+ @property
+ def delete_instance(
+ self,
+ ) -> Callable[
+ [cloud_redis.DeleteInstanceRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the delete instance method over gRPC.
+
+ Deletes a specific Redis instance. Instance stops
+ serving and data is deleted.
+
+ Returns:
+ Callable[[~.DeleteInstanceRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_instance" not in self._stubs:
+ self._stubs["delete_instance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/DeleteInstance",
+ request_serializer=cloud_redis.DeleteInstanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["delete_instance"]
+
+ @property
+ def reschedule_maintenance(
+ self,
+ ) -> Callable[
+ [cloud_redis.RescheduleMaintenanceRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the reschedule maintenance method over gRPC.
+
+ Reschedule maintenance for a given instance in a
+ given project and location.
+
+ Returns:
+ Callable[[~.RescheduleMaintenanceRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "reschedule_maintenance" not in self._stubs:
+ self._stubs["reschedule_maintenance"] = self.grpc_channel.unary_unary(
+ "/google.cloud.redis.v1beta1.CloudRedis/RescheduleMaintenance",
+ request_serializer=cloud_redis.RescheduleMaintenanceRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["reschedule_maintenance"]
+
+ def close(self):
+ return self.grpc_channel.close()
+
+
+__all__ = ("CloudRedisGrpcAsyncIOTransport",)
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/transports/rest.py b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/transports/rest.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1beta1/services/cloud_redis/transports/rest.py
@@ -0,0 +1,1731 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import dataclasses
+import json # type: ignore
+import re
+from typing import Any, Callable, Dict, List, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import (
+ gapic_v1,
+ operations_v1,
+ path_template,
+ rest_helpers,
+ rest_streaming,
+)
+from google.api_core import exceptions as core_exceptions
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.auth.transport.requests import AuthorizedSession # type: ignore
+from google.protobuf import json_format
+import grpc # type: ignore
+from requests import __version__ as requests_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+
+from google.longrunning import operations_pb2 # type: ignore
+
+from google.cloud.redis_v1beta1.types import cloud_redis
+
+from .base import CloudRedisTransport
+from .base import DEFAULT_CLIENT_INFO as BASE_DEFAULT_CLIENT_INFO
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=BASE_DEFAULT_CLIENT_INFO.gapic_version,
+ grpc_version=None,
+ rest_version=requests_version,
+)
+
+
+class CloudRedisRestInterceptor:
+ """Interceptor for CloudRedis.
+
+ Interceptors are used to manipulate requests, request metadata, and responses
+ in arbitrary ways.
+ Example use cases include:
+ * Logging
+ * Verifying requests according to service or custom semantics
+ * Stripping extraneous information from responses
+
+ These use cases and more can be enabled by injecting an
+ instance of a custom subclass when constructing the CloudRedisRestTransport.
+
+ .. code-block:: python
+ class MyCustomCloudRedisInterceptor(CloudRedisRestInterceptor):
+ def pre_create_instance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_create_instance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_delete_instance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_delete_instance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_export_instance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_export_instance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_failover_instance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_failover_instance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_instance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_instance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_instance_auth_string(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_instance_auth_string(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_import_instance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_import_instance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_instances(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_instances(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_reschedule_maintenance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_reschedule_maintenance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_update_instance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_update_instance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_upgrade_instance(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_upgrade_instance(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ transport = CloudRedisRestTransport(interceptor=MyCustomCloudRedisInterceptor())
+ client = CloudRedisClient(transport=transport)
+
+
+ """
+
+ def pre_create_instance(
+ self,
+ request: cloud_redis.CreateInstanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.CreateInstanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for create_instance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_create_instance(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for create_instance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_delete_instance(
+ self,
+ request: cloud_redis.DeleteInstanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.DeleteInstanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for delete_instance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_delete_instance(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for delete_instance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_export_instance(
+ self,
+ request: cloud_redis.ExportInstanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.ExportInstanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for export_instance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_export_instance(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for export_instance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_failover_instance(
+ self,
+ request: cloud_redis.FailoverInstanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.FailoverInstanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for failover_instance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_failover_instance(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for failover_instance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_instance(
+ self,
+ request: cloud_redis.GetInstanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.GetInstanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_instance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_get_instance(self, response: cloud_redis.Instance) -> cloud_redis.Instance:
+ """Post-rpc interceptor for get_instance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_instance_auth_string(
+ self,
+ request: cloud_redis.GetInstanceAuthStringRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.GetInstanceAuthStringRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_instance_auth_string
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_get_instance_auth_string(
+ self, response: cloud_redis.InstanceAuthString
+ ) -> cloud_redis.InstanceAuthString:
+ """Post-rpc interceptor for get_instance_auth_string
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_import_instance(
+ self,
+ request: cloud_redis.ImportInstanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.ImportInstanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for import_instance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_import_instance(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for import_instance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_instances(
+ self,
+ request: cloud_redis.ListInstancesRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.ListInstancesRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_instances
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_list_instances(
+ self, response: cloud_redis.ListInstancesResponse
+ ) -> cloud_redis.ListInstancesResponse:
+ """Post-rpc interceptor for list_instances
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_reschedule_maintenance(
+ self,
+ request: cloud_redis.RescheduleMaintenanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.RescheduleMaintenanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for reschedule_maintenance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_reschedule_maintenance(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for reschedule_maintenance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_update_instance(
+ self,
+ request: cloud_redis.UpdateInstanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.UpdateInstanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for update_instance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_update_instance(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for update_instance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_upgrade_instance(
+ self,
+ request: cloud_redis.UpgradeInstanceRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[cloud_redis.UpgradeInstanceRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for upgrade_instance
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the CloudRedis server.
+ """
+ return request, metadata
+
+ def post_upgrade_instance(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for upgrade_instance
+
+ Override in a subclass to manipulate the response
+ after it is returned by the CloudRedis server but before
+ it is returned to user code.
+ """
+ return response
+
+
+@dataclasses.dataclass
+class CloudRedisRestStub:
+ _session: AuthorizedSession
+ _host: str
+ _interceptor: CloudRedisRestInterceptor
+
+
+class CloudRedisRestTransport(CloudRedisTransport):
+ """REST backend transport for CloudRedis.
+
+ Configures and manages Cloud Memorystore for Redis instances
+
+ Google Cloud Memorystore for Redis v1beta1
+
+ The ``redis.googleapis.com`` service implements the Google Cloud
+ Memorystore for Redis API and defines the following resource model
+ for managing Redis instances:
+
+ - The service works with a collection of cloud projects, named:
+ ``/projects/*``
+ - Each project has a collection of available locations, named:
+ ``/locations/*``
+ - Each location has a collection of Redis instances, named:
+ ``/instances/*``
+ - As such, Redis instances are resources of the form:
+ ``/projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+
+ Note that location_id must be referring to a GCP ``region``; for
+ example:
+
+ - ``projects/redpepper-1290/locations/us-central1/instances/my-redis``
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends JSON representations of protocol buffers over HTTP/1.1
+
+ """
+
+ def __init__(
+ self,
+ *,
+ host: str = "redis.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ url_scheme: str = "https",
+ interceptor: Optional[CloudRedisRestInterceptor] = None,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+ scopes (Optional(Sequence[str])): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Callable[[], Tuple[bytes, bytes]]): Client
+ certificate to configure mutual TLS HTTP channel. It is ignored
+ if ``channel`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you are developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ url_scheme: the protocol scheme for the API endpoint. Normally
+ "https", but for testing or local servers,
+ "http" can be specified.
+ """
+ # Run the base constructor
+ # TODO(yon-mg): resolve other ctor params i.e. scopes, quota, etc.
+ # TODO: When custom host (api_endpoint) is set, `scopes` must *also* be set on the
+ # credentials object
+ maybe_url_match = re.match("^(?P<scheme>http(?:s)?://)?(?P<host>.*)$", host)
+ if maybe_url_match is None:
+ raise ValueError(
+ f"Unexpected hostname structure: {host}"
+ ) # pragma: NO COVER
+
+ url_match_items = maybe_url_match.groupdict()
+
+ host = f"{url_scheme}://{host}" if not url_match_items["scheme"] else host
+
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+ self._session = AuthorizedSession(
+ self._credentials, default_host=self.DEFAULT_HOST
+ )
+ self._operations_client: Optional[operations_v1.AbstractOperationsClient] = None
+ if client_cert_source_for_mtls:
+ self._session.configure_mtls_channel(client_cert_source_for_mtls)
+ self._interceptor = interceptor or CloudRedisRestInterceptor()
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def operations_client(self) -> operations_v1.AbstractOperationsClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ http_options: Dict[str, List[Dict[str, str]]] = {
+ "google.longrunning.Operations.CancelOperation": [
+ {
+ "method": "post",
+ "uri": "/v1beta1/{name=projects/*/locations/*/operations/*}:cancel",
+ },
+ ],
+ "google.longrunning.Operations.DeleteOperation": [
+ {
+ "method": "delete",
+ "uri": "/v1beta1/{name=projects/*/locations/*/operations/*}",
+ },
+ ],
+ "google.longrunning.Operations.GetOperation": [
+ {
+ "method": "get",
+ "uri": "/v1beta1/{name=projects/*/locations/*/operations/*}",
+ },
+ ],
+ "google.longrunning.Operations.ListOperations": [
+ {
+ "method": "get",
+ "uri": "/v1beta1/{name=projects/*/locations/*}/operations",
+ },
+ ],
+ }
+
+ rest_transport = operations_v1.OperationsRestTransport(
+ host=self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ scopes=self._scopes,
+ http_options=http_options,
+ path_prefix="v1beta1",
+ )
+
+ self._operations_client = operations_v1.AbstractOperationsClient(
+ transport=rest_transport
+ )
+
+ # Return the client from cache.
+ return self._operations_client
+
+ class _CreateInstance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("CreateInstance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "instanceId": "",
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.CreateInstanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the create instance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.CreateInstanceRequest):
+ The request object. Request for
+ [CreateInstance][google.cloud.redis.v1beta1.CloudRedis.CreateInstance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1beta1/{parent=projects/*/locations/*}/instances",
+ "body": "instance",
+ },
+ ]
+ request, metadata = self._interceptor.pre_create_instance(request, metadata)
+ pb_request = cloud_redis.CreateInstanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_create_instance(resp)
+ return resp
+
+ class _DeleteInstance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("DeleteInstance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.DeleteInstanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the delete instance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.DeleteInstanceRequest):
+ The request object. Request for
+ [DeleteInstance][google.cloud.redis.v1beta1.CloudRedis.DeleteInstance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "delete",
+ "uri": "/v1beta1/{name=projects/*/locations/*/instances/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_delete_instance(request, metadata)
+ pb_request = cloud_redis.DeleteInstanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_delete_instance(resp)
+ return resp
+
+ class _ExportInstance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("ExportInstance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.ExportInstanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the export instance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.ExportInstanceRequest):
+ The request object. Request for
+ [Export][google.cloud.redis.v1beta1.CloudRedis.ExportInstance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1beta1/{name=projects/*/locations/*/instances/*}:export",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_export_instance(request, metadata)
+ pb_request = cloud_redis.ExportInstanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_export_instance(resp)
+ return resp
+
+ class _FailoverInstance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("FailoverInstance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.FailoverInstanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the failover instance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.FailoverInstanceRequest):
+ The request object. Request for
+ [Failover][google.cloud.redis.v1beta1.CloudRedis.FailoverInstance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1beta1/{name=projects/*/locations/*/instances/*}:failover",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_failover_instance(
+ request, metadata
+ )
+ pb_request = cloud_redis.FailoverInstanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_failover_instance(resp)
+ return resp
+
+ class _GetInstance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("GetInstance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.GetInstanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> cloud_redis.Instance:
+ r"""Call the get instance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.GetInstanceRequest):
+ The request object. Request for
+ [GetInstance][google.cloud.redis.v1beta1.CloudRedis.GetInstance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.cloud_redis.Instance:
+ A Memorystore for Redis instance.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1beta1/{name=projects/*/locations/*/instances/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_instance(request, metadata)
+ pb_request = cloud_redis.GetInstanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = cloud_redis.Instance()
+ pb_resp = cloud_redis.Instance.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_instance(resp)
+ return resp
+
+ class _GetInstanceAuthString(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("GetInstanceAuthString")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.GetInstanceAuthStringRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> cloud_redis.InstanceAuthString:
+ r"""Call the get instance auth string method over HTTP.
+
+ Args:
+ request (~.cloud_redis.GetInstanceAuthStringRequest):
+ The request object. Request for
+ [GetInstanceAuthString][google.cloud.redis.v1beta1.CloudRedis.GetInstanceAuthString].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.cloud_redis.InstanceAuthString:
+ Instance AUTH string details.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1beta1/{name=projects/*/locations/*/instances/*}/authString",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_instance_auth_string(
+ request, metadata
+ )
+ pb_request = cloud_redis.GetInstanceAuthStringRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = cloud_redis.InstanceAuthString()
+ pb_resp = cloud_redis.InstanceAuthString.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_instance_auth_string(resp)
+ return resp
+
+ class _ImportInstance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("ImportInstance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.ImportInstanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the import instance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.ImportInstanceRequest):
+ The request object. Request for
+ [Import][google.cloud.redis.v1beta1.CloudRedis.ImportInstance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1beta1/{name=projects/*/locations/*/instances/*}:import",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_import_instance(request, metadata)
+ pb_request = cloud_redis.ImportInstanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_import_instance(resp)
+ return resp
+
+ class _ListInstances(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("ListInstances")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.ListInstancesRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> cloud_redis.ListInstancesResponse:
+ r"""Call the list instances method over HTTP.
+
+ Args:
+ request (~.cloud_redis.ListInstancesRequest):
+ The request object. Request for
+ [ListInstances][google.cloud.redis.v1beta1.CloudRedis.ListInstances].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.cloud_redis.ListInstancesResponse:
+ Response for
+ [ListInstances][google.cloud.redis.v1beta1.CloudRedis.ListInstances].
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1beta1/{parent=projects/*/locations/*}/instances",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_instances(request, metadata)
+ pb_request = cloud_redis.ListInstancesRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = cloud_redis.ListInstancesResponse()
+ pb_resp = cloud_redis.ListInstancesResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_instances(resp)
+ return resp
+
+ class _RescheduleMaintenance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("RescheduleMaintenance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.RescheduleMaintenanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the reschedule maintenance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.RescheduleMaintenanceRequest):
+ The request object. Request for
+ [RescheduleMaintenance][google.cloud.redis.v1beta1.CloudRedis.RescheduleMaintenance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1beta1/{name=projects/*/locations/*/instances/*}:rescheduleMaintenance",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_reschedule_maintenance(
+ request, metadata
+ )
+ pb_request = cloud_redis.RescheduleMaintenanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_reschedule_maintenance(resp)
+ return resp
+
+ class _UpdateInstance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("UpdateInstance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "updateMask": {},
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.UpdateInstanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the update instance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.UpdateInstanceRequest):
+ The request object. Request for
+ [UpdateInstance][google.cloud.redis.v1beta1.CloudRedis.UpdateInstance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "patch",
+ "uri": "/v1beta1/{instance.name=projects/*/locations/*/instances/*}",
+ "body": "instance",
+ },
+ ]
+ request, metadata = self._interceptor.pre_update_instance(request, metadata)
+ pb_request = cloud_redis.UpdateInstanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_update_instance(resp)
+ return resp
+
+ class _UpgradeInstance(CloudRedisRestStub):
+ def __hash__(self):
+ return hash("UpgradeInstance")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: cloud_redis.UpgradeInstanceRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the upgrade instance method over HTTP.
+
+ Args:
+ request (~.cloud_redis.UpgradeInstanceRequest):
+ The request object. Request for
+ [UpgradeInstance][google.cloud.redis.v1beta1.CloudRedis.UpgradeInstance].
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1beta1/{name=projects/*/locations/*/instances/*}:upgrade",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_upgrade_instance(
+ request, metadata
+ )
+ pb_request = cloud_redis.UpgradeInstanceRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_upgrade_instance(resp)
+ return resp
+
+ @property
+ def create_instance(
+ self,
+ ) -> Callable[[cloud_redis.CreateInstanceRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._CreateInstance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def delete_instance(
+ self,
+ ) -> Callable[[cloud_redis.DeleteInstanceRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._DeleteInstance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def export_instance(
+ self,
+ ) -> Callable[[cloud_redis.ExportInstanceRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ExportInstance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def failover_instance(
+ self,
+ ) -> Callable[[cloud_redis.FailoverInstanceRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._FailoverInstance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_instance(
+ self,
+ ) -> Callable[[cloud_redis.GetInstanceRequest], cloud_redis.Instance]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetInstance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_instance_auth_string(
+ self,
+ ) -> Callable[
+ [cloud_redis.GetInstanceAuthStringRequest], cloud_redis.InstanceAuthString
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetInstanceAuthString(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def import_instance(
+ self,
+ ) -> Callable[[cloud_redis.ImportInstanceRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ImportInstance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_instances(
+ self,
+ ) -> Callable[
+ [cloud_redis.ListInstancesRequest], cloud_redis.ListInstancesResponse
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListInstances(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def reschedule_maintenance(
+ self,
+ ) -> Callable[[cloud_redis.RescheduleMaintenanceRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._RescheduleMaintenance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def update_instance(
+ self,
+ ) -> Callable[[cloud_redis.UpdateInstanceRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._UpdateInstance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def upgrade_instance(
+ self,
+ ) -> Callable[[cloud_redis.UpgradeInstanceRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._UpgradeInstance(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def kind(self) -> str:
+ return "rest"
+
+ def close(self):
+ self._session.close()
+
+
+__all__ = ("CloudRedisRestTransport",)
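Every stub in the transport above follows the same request-preparation recipe: transcode the request against `http_options`, merge in defaults for required-but-unset query params via `_get_unset_required_fields`, then pin `$alt` to JSON-with-integer-enums. A minimal standalone sketch of that merging step (not the generated `CloudRedisRestStub` class itself; the `updateMask` default mirrors the one declared by `_UpdateInstance`):

```python
# Defaults for required query params, as declared per-stub in the transport.
REQUIRED_FIELDS_DEFAULT_VALUES = {"updateMask": {}}


def get_unset_required_fields(message_dict):
    """Return defaults only for required params the caller left unset."""
    return {
        k: v
        for k, v in REQUIRED_FIELDS_DEFAULT_VALUES.items()
        if k not in message_dict
    }


# Query params as transcoded from the request message; updateMask not provided.
query_params = {"instance.name": "projects/p/locations/l/instances/i"}
query_params.update(get_unset_required_fields(query_params))
query_params["$alt"] = "json;enum-encoding=int"
```

Because the comprehension skips keys already present, a caller-supplied `updateMask` is never clobbered by the default.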
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1beta1/types/__init__.py b/packages/google-cloud-redis/google/cloud/redis_v1beta1/types/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1beta1/types/__init__.py
@@ -0,0 +1,72 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .cloud_redis import (
+ CreateInstanceRequest,
+ DeleteInstanceRequest,
+ ExportInstanceRequest,
+ FailoverInstanceRequest,
+ GcsDestination,
+ GcsSource,
+ GetInstanceAuthStringRequest,
+ GetInstanceRequest,
+ ImportInstanceRequest,
+ InputConfig,
+ Instance,
+ InstanceAuthString,
+ ListInstancesRequest,
+ ListInstancesResponse,
+ LocationMetadata,
+ MaintenancePolicy,
+ MaintenanceSchedule,
+ NodeInfo,
+ OutputConfig,
+ PersistenceConfig,
+ RescheduleMaintenanceRequest,
+ TlsCertificate,
+ UpdateInstanceRequest,
+ UpgradeInstanceRequest,
+ WeeklyMaintenanceWindow,
+ ZoneMetadata,
+)
+
+__all__ = (
+ "CreateInstanceRequest",
+ "DeleteInstanceRequest",
+ "ExportInstanceRequest",
+ "FailoverInstanceRequest",
+ "GcsDestination",
+ "GcsSource",
+ "GetInstanceAuthStringRequest",
+ "GetInstanceRequest",
+ "ImportInstanceRequest",
+ "InputConfig",
+ "Instance",
+ "InstanceAuthString",
+ "ListInstancesRequest",
+ "ListInstancesResponse",
+ "LocationMetadata",
+ "MaintenancePolicy",
+ "MaintenanceSchedule",
+ "NodeInfo",
+ "OutputConfig",
+ "PersistenceConfig",
+ "RescheduleMaintenanceRequest",
+ "TlsCertificate",
+ "UpdateInstanceRequest",
+ "UpgradeInstanceRequest",
+ "WeeklyMaintenanceWindow",
+ "ZoneMetadata",
+)
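The `types/__init__.py` above re-exports every message from the `cloud_redis` submodule and pins the public surface with `__all__`, so wildcard imports expose only the curated names. A hypothetical stand-in module (names here are illustrative, not part of the library) showing how `__all__` gates `import *`:

```python
import sys
import types

# Build a throwaway module that mimics the re-export pattern.
mod = types.ModuleType("redis_types_demo")


class Instance:  # stand-in for cloud_redis.Instance
    pass


class NodeInfo:  # stand-in for cloud_redis.NodeInfo
    pass


mod.Instance = Instance
mod.NodeInfo = NodeInfo
mod._private_helper = object()  # present in the module but not exported
mod.__all__ = ("Instance", "NodeInfo")
sys.modules["redis_types_demo"] = mod

ns = {}
exec("from redis_types_demo import *", ns)
exported = {name for name in ns if not name.startswith("__")}
```

Only the two names listed in `__all__` land in `exported`; `_private_helper` stays internal.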
diff --git a/packages/google-cloud-redis/google/cloud/redis_v1beta1/types/cloud_redis.py b/packages/google-cloud-redis/google/cloud/redis_v1beta1/types/cloud_redis.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/google/cloud/redis_v1beta1/types/cloud_redis.py
@@ -0,0 +1,1249 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import duration_pb2 # type: ignore
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+from google.type import dayofweek_pb2 # type: ignore
+from google.type import timeofday_pb2 # type: ignore
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.redis.v1beta1",
+ manifest={
+ "NodeInfo",
+ "Instance",
+ "PersistenceConfig",
+ "RescheduleMaintenanceRequest",
+ "MaintenancePolicy",
+ "WeeklyMaintenanceWindow",
+ "MaintenanceSchedule",
+ "ListInstancesRequest",
+ "ListInstancesResponse",
+ "GetInstanceRequest",
+ "GetInstanceAuthStringRequest",
+ "InstanceAuthString",
+ "CreateInstanceRequest",
+ "UpdateInstanceRequest",
+ "UpgradeInstanceRequest",
+ "DeleteInstanceRequest",
+ "GcsSource",
+ "InputConfig",
+ "ImportInstanceRequest",
+ "GcsDestination",
+ "OutputConfig",
+ "ExportInstanceRequest",
+ "FailoverInstanceRequest",
+ "LocationMetadata",
+ "ZoneMetadata",
+ "TlsCertificate",
+ },
+)
+
+
+class NodeInfo(proto.Message):
+ r"""Node specific properties.
+
+ Attributes:
+ id (str):
+ Output only. Node identifying string. e.g.
+ 'node-0', 'node-1'
+ zone (str):
+ Output only. Location of the node.
+ """
+
+ id: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ zone: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class Instance(proto.Message):
+ r"""A Memorystore for Redis instance.
+
+ Attributes:
+ name (str):
+ Required. Unique name of the resource in this scope
+ including project and location using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+
+ Note: Redis instances are managed and addressed at regional
+ level so location_id here refers to a GCP region; however,
+ users may choose which specific zone (or collection of zones
+ for cross-zone instances) an instance should be provisioned
+ in. Refer to
+ [location_id][google.cloud.redis.v1beta1.Instance.location_id]
+ and
+ [alternative_location_id][google.cloud.redis.v1beta1.Instance.alternative_location_id]
+ fields for more details.
+ display_name (str):
+ An arbitrary and optional user-provided name
+ for the instance.
+ labels (MutableMapping[str, str]):
+ Resource labels to represent user provided
+ metadata
+ location_id (str):
+ Optional. The zone where the instance will be
+ provisioned. If not provided, the service will
+ choose a zone from the specified region for the
+ instance. For standard tier, additional nodes
+ will be added across multiple zones for
+ protection against zonal failures. If specified,
+ at least one node will be provisioned in this
+ zone.
+ alternative_location_id (str):
+ Optional. If specified, at least one node will be
+ provisioned in this zone in addition to the zone specified
+ in location_id. Only applicable to standard tier. If
+ provided, it must be a different zone from the one provided
+ in [location_id]. Additional nodes beyond the first 2 will
+ be placed in zones selected by the service.
+ redis_version (str):
+ Optional. The version of Redis software. If not provided,
+ latest supported version will be used. Currently, the
+ supported values are:
+
+ - ``REDIS_3_2`` for Redis 3.2 compatibility
+ - ``REDIS_4_0`` for Redis 4.0 compatibility (default)
+ - ``REDIS_5_0`` for Redis 5.0 compatibility
+ - ``REDIS_6_X`` for Redis 6.x compatibility
+ reserved_ip_range (str):
+ Optional. For DIRECT_PEERING mode, the CIDR range of
+ internal addresses that are reserved for this instance.
+ Range must be unique and non-overlapping with existing
+ subnets in an authorized network. For PRIVATE_SERVICE_ACCESS
+ mode, the name of one allocated IP address ranges associated
+ with this private service access connection. If not
+ provided, the service will choose an unused /29 block, for
+ example, 10.0.0.0/29 or 192.168.0.0/29. For
+ READ_REPLICAS_ENABLED the default block size is /28.
+ secondary_ip_range (str):
+ Optional. Additional IP range for node placement. Required
+ when enabling read replicas on an existing instance. For
+ DIRECT_PEERING mode value must be a CIDR range of size /28,
+ or "auto". For PRIVATE_SERVICE_ACCESS mode value must be the
+ name of an allocated address range associated with the
+ private service access connection, or "auto".
+ host (str):
+ Output only. Hostname or IP address of the
+ exposed Redis endpoint used by clients to
+ connect to the service.
+ port (int):
+ Output only. The port number of the exposed
+ Redis endpoint.
+ current_location_id (str):
+ Output only. The current zone where the Redis primary node
+ is located. In basic tier, this will always be the same as
+ [location_id]. In standard tier, this can be the zone of any
+ node in the instance.
+ create_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The time the instance was
+ created.
+ state (google.cloud.redis_v1beta1.types.Instance.State):
+ Output only. The current state of this
+ instance.
+ status_message (str):
+ Output only. Additional information about the
+ current status of this instance, if available.
+ redis_configs (MutableMapping[str, str]):
+ Optional. Redis configuration parameters, according to
+ http://redis.io/topics/config. Currently, the only supported
+ parameters are:
+
+ Redis version 3.2 and newer:
+
+ - maxmemory-policy
+ - notify-keyspace-events
+
+ Redis version 4.0 and newer:
+
+ - activedefrag
+ - lfu-decay-time
+ - lfu-log-factor
+ - maxmemory-gb
+
+ Redis version 5.0 and newer:
+
+ - stream-node-max-bytes
+ - stream-node-max-entries
+ tier (google.cloud.redis_v1beta1.types.Instance.Tier):
+ Required. The service tier of the instance.
+ memory_size_gb (int):
+ Required. Redis memory size in GiB.
+ authorized_network (str):
+ Optional. The full name of the Google Compute Engine
+ `network <https://cloud.google.com/vpc/docs/vpc>`__ to which
+ the instance is connected. If left unspecified, the
+ ``default`` network will be used.
+ persistence_iam_identity (str):
+ Output only. Cloud IAM identity used by import / export
+ operations to transfer data to/from Cloud Storage. Format is
+ "serviceAccount:<service_account_email>". The value may
+ change over time for a given instance so should be checked
+ before each import/export operation.
+ connect_mode (google.cloud.redis_v1beta1.types.Instance.ConnectMode):
+ Optional. The network connect mode of the Redis instance. If
+ not provided, the connect mode defaults to DIRECT_PEERING.
+ auth_enabled (bool):
+ Optional. Indicates whether OSS Redis AUTH is
+ enabled for the instance. If set to "true" AUTH
+ is enabled on the instance. Default value is
+ "false" meaning AUTH is disabled.
+ server_ca_certs (MutableSequence[google.cloud.redis_v1beta1.types.TlsCertificate]):
+ Output only. List of server CA certificates
+ for the instance.
+ transit_encryption_mode (google.cloud.redis_v1beta1.types.Instance.TransitEncryptionMode):
+ Optional. The TLS mode of the Redis instance.
+ If not provided, TLS is disabled for the
+ instance.
+ maintenance_policy (google.cloud.redis_v1beta1.types.MaintenancePolicy):
+ Optional. The maintenance policy for the
+ instance. If not provided, maintenance events
+ can be performed at any time.
+ maintenance_schedule (google.cloud.redis_v1beta1.types.MaintenanceSchedule):
+ Output only. Date and time of upcoming
+ maintenance events which have been scheduled.
+ replica_count (int):
+ Optional. The number of replica nodes. The valid range for
+ the Standard Tier with read replicas enabled is [1-5] and
+ defaults to 2. If read replicas are not enabled for a
+ Standard Tier instance, the only valid value is 1 and the
+ default is 1. The valid value for basic tier is 0 and the
+ default is also 0.
+ nodes (MutableSequence[google.cloud.redis_v1beta1.types.NodeInfo]):
+ Output only. Info per node.
+ read_endpoint (str):
+ Output only. Hostname or IP address of the
+ exposed readonly Redis endpoint. Standard tier
+ only. Targets all healthy replica nodes in
+ instance. Replication is asynchronous and
+ replica nodes will exhibit some lag behind the
+ primary. Write requests must target 'host'.
+ read_endpoint_port (int):
+ Output only. The port number of the exposed
+ readonly redis endpoint. Standard tier only.
+ Write requests should target 'port'.
+ read_replicas_mode (google.cloud.redis_v1beta1.types.Instance.ReadReplicasMode):
+ Optional. Read replicas mode for the instance. Defaults to
+ READ_REPLICAS_DISABLED.
+ persistence_config (google.cloud.redis_v1beta1.types.PersistenceConfig):
+ Optional. Persistence configuration
+ parameters
+ """
+
+ class State(proto.Enum):
+ r"""Represents the different states of a Redis instance.
+
+ Values:
+ STATE_UNSPECIFIED (0):
+ Not set.
+ CREATING (1):
+ Redis instance is being created.
+ READY (2):
+ Redis instance has been created and is fully
+ usable.
+ UPDATING (3):
+ Redis instance configuration is being
+ updated. Certain kinds of updates may cause the
+ instance to become unusable while the update is
+ in progress.
+ DELETING (4):
+ Redis instance is being deleted.
+ REPAIRING (5):
+ Redis instance is being repaired and may be
+ unusable.
+ MAINTENANCE (6):
+ Maintenance is being performed on this Redis
+ instance.
+ IMPORTING (8):
+ Redis instance is importing data
+ (availability may be affected).
+ FAILING_OVER (10):
+ Redis instance is failing over (availability
+ may be affected).
+ """
+ STATE_UNSPECIFIED = 0
+ CREATING = 1
+ READY = 2
+ UPDATING = 3
+ DELETING = 4
+ REPAIRING = 5
+ MAINTENANCE = 6
+ IMPORTING = 8
+ FAILING_OVER = 10
+
+ class Tier(proto.Enum):
+ r"""Available service tiers to choose from
+
+ Values:
+ TIER_UNSPECIFIED (0):
+ Not set.
+ BASIC (1):
+ BASIC tier: standalone instance
+ STANDARD_HA (3):
+ STANDARD_HA tier: highly available primary/replica instances
+ """
+ TIER_UNSPECIFIED = 0
+ BASIC = 1
+ STANDARD_HA = 3
+
+ class ConnectMode(proto.Enum):
+ r"""Available connection modes.
+
+ Values:
+ CONNECT_MODE_UNSPECIFIED (0):
+ Not set.
+ DIRECT_PEERING (1):
+ Connect via direct peering to the Memorystore
+ for Redis hosted service.
+ PRIVATE_SERVICE_ACCESS (2):
+ Connect your Memorystore for Redis instance
+ using Private Service Access. Private services
+ access provides an IP address range for multiple
+ Google Cloud services, including Memorystore.
+ """
+ CONNECT_MODE_UNSPECIFIED = 0
+ DIRECT_PEERING = 1
+ PRIVATE_SERVICE_ACCESS = 2
+
+ class TransitEncryptionMode(proto.Enum):
+ r"""Available TLS modes.
+
+ Values:
+ TRANSIT_ENCRYPTION_MODE_UNSPECIFIED (0):
+ Not set.
+ SERVER_AUTHENTICATION (1):
+ Client to Server traffic encryption enabled
+ with server authentication.
+ DISABLED (2):
+ TLS is disabled for the instance.
+ """
+ TRANSIT_ENCRYPTION_MODE_UNSPECIFIED = 0
+ SERVER_AUTHENTICATION = 1
+ DISABLED = 2
+
+ class ReadReplicasMode(proto.Enum):
+ r"""Read replicas mode.
+
+ Values:
+ READ_REPLICAS_MODE_UNSPECIFIED (0):
+ If not set, Memorystore Redis backend will default to
+ READ_REPLICAS_DISABLED.
+ READ_REPLICAS_DISABLED (1):
+ If disabled, read endpoint will not be
+ provided and the instance cannot scale up or
+ down the number of replicas.
+ READ_REPLICAS_ENABLED (2):
+ If enabled, read endpoint will be provided
+ and the instance can scale up and down the
+ number of replicas. Not valid for basic tier.
+ """
+ READ_REPLICAS_MODE_UNSPECIFIED = 0
+ READ_REPLICAS_DISABLED = 1
+ READ_REPLICAS_ENABLED = 2
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ display_name: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ labels: MutableMapping[str, str] = proto.MapField(
+ proto.STRING,
+ proto.STRING,
+ number=3,
+ )
+ location_id: str = proto.Field(
+ proto.STRING,
+ number=4,
+ )
+ alternative_location_id: str = proto.Field(
+ proto.STRING,
+ number=5,
+ )
+ redis_version: str = proto.Field(
+ proto.STRING,
+ number=7,
+ )
+ reserved_ip_range: str = proto.Field(
+ proto.STRING,
+ number=9,
+ )
+ secondary_ip_range: str = proto.Field(
+ proto.STRING,
+ number=30,
+ )
+ host: str = proto.Field(
+ proto.STRING,
+ number=10,
+ )
+ port: int = proto.Field(
+ proto.INT32,
+ number=11,
+ )
+ current_location_id: str = proto.Field(
+ proto.STRING,
+ number=12,
+ )
+ create_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=13,
+ message=timestamp_pb2.Timestamp,
+ )
+ state: State = proto.Field(
+ proto.ENUM,
+ number=14,
+ enum=State,
+ )
+ status_message: str = proto.Field(
+ proto.STRING,
+ number=15,
+ )
+ redis_configs: MutableMapping[str, str] = proto.MapField(
+ proto.STRING,
+ proto.STRING,
+ number=16,
+ )
+ tier: Tier = proto.Field(
+ proto.ENUM,
+ number=17,
+ enum=Tier,
+ )
+ memory_size_gb: int = proto.Field(
+ proto.INT32,
+ number=18,
+ )
+ authorized_network: str = proto.Field(
+ proto.STRING,
+ number=20,
+ )
+ persistence_iam_identity: str = proto.Field(
+ proto.STRING,
+ number=21,
+ )
+ connect_mode: ConnectMode = proto.Field(
+ proto.ENUM,
+ number=22,
+ enum=ConnectMode,
+ )
+ auth_enabled: bool = proto.Field(
+ proto.BOOL,
+ number=23,
+ )
+ server_ca_certs: MutableSequence["TlsCertificate"] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=25,
+ message="TlsCertificate",
+ )
+ transit_encryption_mode: TransitEncryptionMode = proto.Field(
+ proto.ENUM,
+ number=26,
+ enum=TransitEncryptionMode,
+ )
+ maintenance_policy: "MaintenancePolicy" = proto.Field(
+ proto.MESSAGE,
+ number=27,
+ message="MaintenancePolicy",
+ )
+ maintenance_schedule: "MaintenanceSchedule" = proto.Field(
+ proto.MESSAGE,
+ number=28,
+ message="MaintenanceSchedule",
+ )
+ replica_count: int = proto.Field(
+ proto.INT32,
+ number=31,
+ )
+ nodes: MutableSequence["NodeInfo"] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=32,
+ message="NodeInfo",
+ )
+ read_endpoint: str = proto.Field(
+ proto.STRING,
+ number=33,
+ )
+ read_endpoint_port: int = proto.Field(
+ proto.INT32,
+ number=34,
+ )
+ read_replicas_mode: ReadReplicasMode = proto.Field(
+ proto.ENUM,
+ number=35,
+ enum=ReadReplicasMode,
+ )
+ persistence_config: "PersistenceConfig" = proto.Field(
+ proto.MESSAGE,
+ number=37,
+ message="PersistenceConfig",
+ )
+
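The ``redis_configs`` docstring above enumerates which parameters each Redis version accepts, cumulatively from 3.2 upward. As an illustration only (this helper is not part of the generated client; the function name and the version ordering are assumptions), the allowed key set per version could be computed like this:

```python
# Illustrative helper, NOT part of the generated library: cumulative set of
# redis_configs keys permitted at each Redis version, per the docstring above.
_CONFIGS_BY_VERSION = [
    ("REDIS_3_2", {"maxmemory-policy", "notify-keyspace-events"}),
    ("REDIS_4_0", {"activedefrag", "lfu-decay-time", "lfu-log-factor", "maxmemory-gb"}),
    ("REDIS_5_0", {"stream-node-max-bytes", "stream-node-max-entries"}),
    ("REDIS_6_X", set()),
]

def allowed_redis_configs(redis_version: str) -> set:
    """Return every config key supported at or below the given version."""
    allowed = set()
    for version, keys in _CONFIGS_BY_VERSION:
        allowed |= keys
        if version == redis_version:
            return allowed
    raise ValueError("unknown Redis version: %s" % redis_version)
```

For example, ``allowed_redis_configs("REDIS_4_0")`` includes ``maxmemory-gb`` but not the stream parameters, which only appear from 5.0 on.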
+
+class PersistenceConfig(proto.Message):
+ r"""Configuration of the persistence functionality.
+
+ Attributes:
+ persistence_mode (google.cloud.redis_v1beta1.types.PersistenceConfig.PersistenceMode):
+ Optional. Controls whether Persistence
+ features are enabled. If not provided, the
+ existing value will be used.
+ rdb_snapshot_period (google.cloud.redis_v1beta1.types.PersistenceConfig.SnapshotPeriod):
+ Optional. Period between RDB snapshots. Snapshots will be
+ attempted every period starting from the provided snapshot
+ start time. For example, a start time of 01/01/2033 06:45
+ and SIX_HOURS snapshot period will do nothing until
+ 01/01/2033, and then trigger snapshots every day at 06:45,
+ 12:45, 18:45, and 00:45 the next day, and so on. If not
+ provided, TWENTY_FOUR_HOURS will be used as default.
+ rdb_next_snapshot_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The next time that a snapshot
+ attempt is scheduled to occur.
+ rdb_snapshot_start_time (google.protobuf.timestamp_pb2.Timestamp):
+ Optional. Date and time that the first
+ snapshot was/will be attempted, and to which
+ future snapshots will be aligned. If not
+ provided, the current time will be used.
+ """
+
+ class PersistenceMode(proto.Enum):
+ r"""Available Persistence modes.
+
+ Values:
+ PERSISTENCE_MODE_UNSPECIFIED (0):
+ Not set.
+ DISABLED (1):
+ Persistence is disabled for the instance,
+ and any existing snapshots are deleted.
+ RDB (2):
+ RDB based Persistence is enabled.
+ """
+ PERSISTENCE_MODE_UNSPECIFIED = 0
+ DISABLED = 1
+ RDB = 2
+
+ class SnapshotPeriod(proto.Enum):
+ r"""Available snapshot periods for scheduling.
+
+ Values:
+ SNAPSHOT_PERIOD_UNSPECIFIED (0):
+ Not set.
+ ONE_HOUR (3):
+ Snapshot every 1 hour.
+ SIX_HOURS (4):
+ Snapshot every 6 hours.
+ TWELVE_HOURS (5):
+ Snapshot every 12 hours.
+ TWENTY_FOUR_HOURS (6):
+ Snapshot every 24 hours.
+ """
+ SNAPSHOT_PERIOD_UNSPECIFIED = 0
+ ONE_HOUR = 3
+ SIX_HOURS = 4
+ TWELVE_HOURS = 5
+ TWENTY_FOUR_HOURS = 6
+
+ persistence_mode: PersistenceMode = proto.Field(
+ proto.ENUM,
+ number=1,
+ enum=PersistenceMode,
+ )
+ rdb_snapshot_period: SnapshotPeriod = proto.Field(
+ proto.ENUM,
+ number=2,
+ enum=SnapshotPeriod,
+ )
+ rdb_next_snapshot_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=4,
+ message=timestamp_pb2.Timestamp,
+ )
+ rdb_snapshot_start_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=5,
+ message=timestamp_pb2.Timestamp,
+ )
+
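The ``rdb_snapshot_period`` semantics above (snapshot attempts aligned to a start time and repeated every period, e.g. a 06:45 start with ``SIX_HOURS`` yields 06:45, 12:45, 18:45, 00:45) can be sketched with the stdlib ``datetime`` module. This is an illustrative calculation under those documented semantics, not code from the library:

```python
from datetime import datetime, timedelta

def next_snapshot_time(start: datetime, period_hours: int, now: datetime) -> datetime:
    """Next snapshot attempt after `now`, aligned to `start` (illustrative)."""
    period = timedelta(hours=period_hours)
    if now <= start:
        return start
    # Count whole periods elapsed since the start time, then step one forward.
    elapsed_periods = (now - start) // period + 1
    return start + elapsed_periods * period
```

With the docstring's example, a 2033-01-01 06:45 start, a six-hour period, and "now" at 13:00 the same day gives 18:45 as the next attempt.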
+
+class RescheduleMaintenanceRequest(proto.Message):
+ r"""Request for
+ [RescheduleMaintenance][google.cloud.redis.v1beta1.CloudRedis.RescheduleMaintenance].
+
+ Attributes:
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+ reschedule_type (google.cloud.redis_v1beta1.types.RescheduleMaintenanceRequest.RescheduleType):
+ Required. If reschedule type is SPECIFIC_TIME, must set up
+ schedule_time as well.
+ schedule_time (google.protobuf.timestamp_pb2.Timestamp):
+ Optional. Timestamp when the maintenance shall be
+ rescheduled to if reschedule_type=SPECIFIC_TIME, in RFC 3339
+ format, for example ``2012-11-15T16:19:00.094Z``.
+ """
+
+ class RescheduleType(proto.Enum):
+ r"""Reschedule options.
+
+ Values:
+ RESCHEDULE_TYPE_UNSPECIFIED (0):
+ Not set.
+ IMMEDIATE (1):
+ If the user wants to schedule the maintenance
+ to happen now.
+ NEXT_AVAILABLE_WINDOW (2):
+ If the user wants to use the existing
+ maintenance policy to find the next available
+ window.
+ SPECIFIC_TIME (3):
+ If the user wants to reschedule the
+ maintenance to a specific time.
+ """
+ RESCHEDULE_TYPE_UNSPECIFIED = 0
+ IMMEDIATE = 1
+ NEXT_AVAILABLE_WINDOW = 2
+ SPECIFIC_TIME = 3
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ reschedule_type: RescheduleType = proto.Field(
+ proto.ENUM,
+ number=2,
+ enum=RescheduleType,
+ )
+ schedule_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=3,
+ message=timestamp_pb2.Timestamp,
+ )
+
+
+class MaintenancePolicy(proto.Message):
+ r"""Maintenance policy for an instance.
+
+ Attributes:
+ create_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The time when the policy was
+ created.
+ update_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The time when the policy was
+ last updated.
+ description (str):
+ Optional. Description of what this policy is for.
+ Create/Update methods return INVALID_ARGUMENT if the length
+ is greater than 512.
+ weekly_maintenance_window (MutableSequence[google.cloud.redis_v1beta1.types.WeeklyMaintenanceWindow]):
+ Optional. Maintenance window that is applied to resources
+ covered by this policy. Minimum 1. For the current version,
+ the maximum number of weekly_window is expected to be one.
+ """
+
+ create_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message=timestamp_pb2.Timestamp,
+ )
+ update_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=timestamp_pb2.Timestamp,
+ )
+ description: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ weekly_maintenance_window: MutableSequence[
+ "WeeklyMaintenanceWindow"
+ ] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=4,
+ message="WeeklyMaintenanceWindow",
+ )
+
+
+class WeeklyMaintenanceWindow(proto.Message):
+ r"""Time window in which disruptive maintenance updates occur.
+ Non-disruptive updates can occur inside or outside this window.
+
+ Attributes:
+ day (google.type.dayofweek_pb2.DayOfWeek):
+ Required. The day of week that maintenance
+ updates occur.
+ start_time (google.type.timeofday_pb2.TimeOfDay):
+ Required. Start time of the window in UTC
+ time.
+ duration (google.protobuf.duration_pb2.Duration):
+ Output only. Duration of the maintenance
+ window. The current window is fixed at 1 hour.
+ """
+
+ day: dayofweek_pb2.DayOfWeek = proto.Field(
+ proto.ENUM,
+ number=1,
+ enum=dayofweek_pb2.DayOfWeek,
+ )
+ start_time: timeofday_pb2.TimeOfDay = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=timeofday_pb2.TimeOfDay,
+ )
+ duration: duration_pb2.Duration = proto.Field(
+ proto.MESSAGE,
+ number=3,
+ message=duration_pb2.Duration,
+ )
+
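A ``WeeklyMaintenanceWindow`` is just a day of week plus a start time (the duration is fixed at one hour). Computing the next occurrence of such a window is a small ``datetime`` exercise; note this sketch uses Python's ``weekday()`` convention (0=Monday..6=Sunday), whereas the proto ``DayOfWeek`` enum counts 1=MONDAY..7=SUNDAY, so a real caller would need to convert:

```python
from datetime import datetime, timedelta

def next_window_start(weekday: int, hour: int, minute: int, now: datetime) -> datetime:
    """Next start of a weekly window (illustrative).

    `weekday` follows Python's 0=Monday..6=Sunday convention, which differs
    from the proto DayOfWeek enum (1=MONDAY..7=SUNDAY).
    """
    candidate = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    candidate += timedelta(days=(weekday - now.weekday()) % 7)
    if candidate <= now:
        candidate += timedelta(days=7)
    return candidate
```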
+
+class MaintenanceSchedule(proto.Message):
+ r"""Upcoming maintenance schedule. If no maintenance is
+ scheduled, fields are not populated.
+
+ Attributes:
+ start_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The start time of any upcoming
+ scheduled maintenance for this instance.
+ end_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The end time of any upcoming
+ scheduled maintenance for this instance.
+ can_reschedule (bool):
+ If the scheduled maintenance can be
+ rescheduled, default is true.
+ schedule_deadline_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The deadline that the
+ maintenance schedule start time can not go
+ beyond, including reschedule.
+ """
+
+ start_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message=timestamp_pb2.Timestamp,
+ )
+ end_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=timestamp_pb2.Timestamp,
+ )
+ can_reschedule: bool = proto.Field(
+ proto.BOOL,
+ number=3,
+ )
+ schedule_deadline_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=5,
+ message=timestamp_pb2.Timestamp,
+ )
+
+
+class ListInstancesRequest(proto.Message):
+ r"""Request for
+ [ListInstances][google.cloud.redis.v1beta1.CloudRedis.ListInstances].
+
+ Attributes:
+ parent (str):
+ Required. The resource name of the instance location using
+ the form: ``projects/{project_id}/locations/{location_id}``
+ where ``location_id`` refers to a GCP region.
+ page_size (int):
+ The maximum number of items to return.
+
+ If not specified, a default value of 1000 will be used by
+ the service. Regardless of the page_size value, the response
+ may include a partial list and a caller should only rely on
+ response's
+ [``next_page_token``][google.cloud.redis.v1beta1.ListInstancesResponse.next_page_token]
+ to determine if there are more instances left to be queried.
+ page_token (str):
+ The ``next_page_token`` value returned from a previous
+ [ListInstances][google.cloud.redis.v1beta1.CloudRedis.ListInstances]
+ request, if any.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=2,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+
+class ListInstancesResponse(proto.Message):
+ r"""Response for
+ [ListInstances][google.cloud.redis.v1beta1.CloudRedis.ListInstances].
+
+ Attributes:
+ instances (MutableSequence[google.cloud.redis_v1beta1.types.Instance]):
+ A list of Redis instances in the project in the specified
+ location, or across all locations.
+
+ If the ``location_id`` in the parent field of the request is
+ "-", all regions available to the project are queried, and
+ the results aggregated. If in such an aggregated query a
+ location is unavailable, a placeholder Redis entry is
+ included in the response with the ``name`` field set to a
+ value of the form
+ ``projects/{project_id}/locations/{location_id}/instances/``-
+ and the ``status`` field set to ERROR and ``status_message``
+ field set to "location not available for ListInstances".
+ next_page_token (str):
+ Token to retrieve the next page of results,
+ or empty if there are no more results in the
+ list.
+ unreachable (MutableSequence[str]):
+ Locations that could not be reached.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ instances: MutableSequence["Instance"] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message="Instance",
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ unreachable: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=3,
+ )
+
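The page-token contract described above (keep requesting pages with the returned ``next_page_token`` until it comes back empty, rather than relying on ``page_size``) is the standard drain loop. A generic sketch, where ``fetch`` is a hypothetical caller-supplied callable standing in for the ListInstances RPC and returning plain dicts:

```python
def iter_all_instances(fetch, page_size=1000):
    """Yield every instance by following next_page_token until it is empty.

    `fetch` is any callable accepting page_size/page_token keyword arguments
    and returning a dict with "instances" and "next_page_token" keys
    (an illustrative stand-in for the ListInstances RPC, not a client method).
    """
    token = ""
    while True:
        page = fetch(page_size=page_size, page_token=token)
        yield from page["instances"]
        token = page.get("next_page_token", "")
        if not token:
            break
```

In practice the generated client wraps this loop for you via its pager objects; the sketch only makes the token handshake explicit.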
+
+class GetInstanceRequest(proto.Message):
+ r"""Request for
+ [GetInstance][google.cloud.redis.v1beta1.CloudRedis.GetInstance].
+
+ Attributes:
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class GetInstanceAuthStringRequest(proto.Message):
+ r"""Request for
+ [GetInstanceAuthString][google.cloud.redis.v1beta1.CloudRedis.GetInstanceAuthString].
+
+ Attributes:
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class InstanceAuthString(proto.Message):
+ r"""Instance AUTH string details.
+
+ Attributes:
+ auth_string (str):
+ AUTH string set on the instance.
+ """
+
+ auth_string: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class CreateInstanceRequest(proto.Message):
+ r"""Request for
+ [CreateInstance][google.cloud.redis.v1beta1.CloudRedis.CreateInstance].
+
+ Attributes:
+ parent (str):
+ Required. The resource name of the instance location using
+ the form: ``projects/{project_id}/locations/{location_id}``
+ where ``location_id`` refers to a GCP region.
+ instance_id (str):
+ Required. The logical name of the Redis instance in the
+ customer project with the following restrictions:
+
+ - Must contain only lowercase letters, numbers, and
+ hyphens.
+ - Must start with a letter.
+ - Must be between 1-40 characters.
+ - Must end with a number or a letter.
+ - Must be unique within the customer project / location
+ instance (google.cloud.redis_v1beta1.types.Instance):
+ Required. A Redis [Instance] resource
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ instance_id: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ instance: "Instance" = proto.Field(
+ proto.MESSAGE,
+ number=3,
+ message="Instance",
+ )
+
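The ``instance_id`` restrictions listed above (lowercase letters, digits, and hyphens; starts with a letter; ends with a letter or digit; 1-40 characters) map directly onto a regular expression. A hypothetical client-side validator, not provided by the library, might look like this; the uniqueness requirement can of course only be checked by the service:

```python
import re

# Per the CreateInstanceRequest docs: lowercase letters, digits, and hyphens;
# starts with a letter; ends with a letter or digit; 1-40 characters total.
# (Uniqueness within the project/location cannot be checked locally.)
_INSTANCE_ID_RE = re.compile(r"^[a-z](?:[a-z0-9-]{0,38}[a-z0-9])?$")

def is_valid_instance_id(instance_id: str) -> bool:
    return _INSTANCE_ID_RE.match(instance_id) is not None
```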
+
+class UpdateInstanceRequest(proto.Message):
+ r"""Request for
+ [UpdateInstance][google.cloud.redis.v1beta1.CloudRedis.UpdateInstance].
+
+ Attributes:
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Required. Mask of fields to update. At least one path must
+ be supplied in this field. The elements of the repeated
+ paths field may only include these fields from
+ [Instance][google.cloud.redis.v1beta1.Instance]:
+
+ - ``displayName``
+ - ``labels``
+ - ``memorySizeGb``
+ - ``redisConfig``
+ - ``replica_count``
+ instance (google.cloud.redis_v1beta1.types.Instance):
+ Required. Update description. Only fields specified in
+ update_mask are updated.
+ """
+
+ update_mask: field_mask_pb2.FieldMask = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message=field_mask_pb2.FieldMask,
+ )
+ instance: "Instance" = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message="Instance",
+ )
+
+
+class UpgradeInstanceRequest(proto.Message):
+ r"""Request for
+ [UpgradeInstance][google.cloud.redis.v1beta1.CloudRedis.UpgradeInstance].
+
+ Attributes:
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+ redis_version (str):
+ Required. Specifies the target version of
+ Redis software to upgrade to.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ redis_version: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class DeleteInstanceRequest(proto.Message):
+ r"""Request for
+ [DeleteInstance][google.cloud.redis.v1beta1.CloudRedis.DeleteInstance].
+
+ Attributes:
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class GcsSource(proto.Message):
+ r"""The Cloud Storage location for the input content
+
+ Attributes:
+ uri (str):
+ Required. Source data URI. (e.g.
+ 'gs://my_bucket/my_object').
+ """
+
+ uri: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class InputConfig(proto.Message):
+ r"""The input content
+
+ .. _oneof: https://proto-plus-python.readthedocs.io/en/stable/fields.html#oneofs-mutually-exclusive-fields
+
+ Attributes:
+ gcs_source (google.cloud.redis_v1beta1.types.GcsSource):
+ Google Cloud Storage location where input
+ content is located.
+
+ This field is a member of `oneof`_ ``source``.
+ """
+
+ gcs_source: "GcsSource" = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ oneof="source",
+ message="GcsSource",
+ )
+
+
+class ImportInstanceRequest(proto.Message):
+ r"""Request for
+ [Import][google.cloud.redis.v1beta1.CloudRedis.ImportInstance].
+
+ Attributes:
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+ input_config (google.cloud.redis_v1beta1.types.InputConfig):
+ Required. Specify data to be imported.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ input_config: "InputConfig" = proto.Field(
+ proto.MESSAGE,
+ number=3,
+ message="InputConfig",
+ )
+
+
+class GcsDestination(proto.Message):
+ r"""The Cloud Storage location for the output content
+
+ Attributes:
+ uri (str):
+ Required. Data destination URI (e.g.
+ 'gs://my_bucket/my_object'). Existing files will be
+ overwritten.
+ """
+
+ uri: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class OutputConfig(proto.Message):
+ r"""The output content
+
+ .. _oneof: https://proto-plus-python.readthedocs.io/en/stable/fields.html#oneofs-mutually-exclusive-fields
+
+ Attributes:
+ gcs_destination (google.cloud.redis_v1beta1.types.GcsDestination):
+ Google Cloud Storage destination for output
+ content.
+
+ This field is a member of `oneof`_ ``destination``.
+ """
+
+ gcs_destination: "GcsDestination" = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ oneof="destination",
+ message="GcsDestination",
+ )
+
+
+class ExportInstanceRequest(proto.Message):
+ r"""Request for
+ [Export][google.cloud.redis.v1beta1.CloudRedis.ExportInstance].
+
+ Attributes:
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+ output_config (google.cloud.redis_v1beta1.types.OutputConfig):
+ Required. Specify data to be exported.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ output_config: "OutputConfig" = proto.Field(
+ proto.MESSAGE,
+ number=3,
+ message="OutputConfig",
+ )
+
+
+class FailoverInstanceRequest(proto.Message):
+ r"""Request for
+ [Failover][google.cloud.redis.v1beta1.CloudRedis.FailoverInstance].
+
+ Attributes:
+ name (str):
+ Required. Redis instance resource name using the form:
+ ``projects/{project_id}/locations/{location_id}/instances/{instance_id}``
+ where ``location_id`` refers to a GCP region.
+ data_protection_mode (google.cloud.redis_v1beta1.types.FailoverInstanceRequest.DataProtectionMode):
+ Optional. Available data protection modes that the user can
+ choose. If it's unspecified, data protection mode will be
+ LIMITED_DATA_LOSS by default.
+ """
+
+ class DataProtectionMode(proto.Enum):
+ r"""Specifies different modes of operation in relation to the
+ data retention.
+
+ Values:
+ DATA_PROTECTION_MODE_UNSPECIFIED (0):
+ Defaults to LIMITED_DATA_LOSS if a data protection mode is
+ not specified.
+ LIMITED_DATA_LOSS (1):
+ Instance failover will be protected with data
+ loss control. More specifically, the failover
+ will only be performed if the current
+ replication offset diff between primary and
+ replica is under a certain threshold.
+ FORCE_DATA_LOSS (2):
+ Instance failover will be performed without
+ data loss control.
+ """
+ DATA_PROTECTION_MODE_UNSPECIFIED = 0
+ LIMITED_DATA_LOSS = 1
+ FORCE_DATA_LOSS = 2
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ data_protection_mode: DataProtectionMode = proto.Field(
+ proto.ENUM,
+ number=2,
+ enum=DataProtectionMode,
+ )
+
+
+class LocationMetadata(proto.Message):
+ r"""This location metadata represents additional configuration options
+ for a given location where a Redis instance may be created. All
+ fields are output only. It is returned as content of the
+ ``google.cloud.location.Location.metadata`` field.
+
+ Attributes:
+ available_zones (MutableMapping[str, google.cloud.redis_v1beta1.types.ZoneMetadata]):
+ Output only. The set of available zones in the location. The
+ map is keyed by the lowercase ID of each zone, as defined by
+ GCE. These keys can be specified in ``location_id`` or
+ ``alternative_location_id`` fields when creating a Redis
+ instance.
+ """
+
+ available_zones: MutableMapping[str, "ZoneMetadata"] = proto.MapField(
+ proto.STRING,
+ proto.MESSAGE,
+ number=1,
+ message="ZoneMetadata",
+ )
+
+
+class ZoneMetadata(proto.Message):
+ r"""Defines specific information for a particular zone. Currently
+ empty and reserved for future use only.
+
+ """
+
+
+class TlsCertificate(proto.Message):
+ r"""TlsCertificate Resource
+
+ Attributes:
+ serial_number (str):
+ Serial number, as extracted from the
+ certificate.
+ cert (str):
+ PEM representation.
+ create_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The time when the certificate was created in
+ `RFC 3339 <https://tools.ietf.org/html/rfc3339>`__ format,
+ for example ``2020-05-18T00:00:00.094Z``.
+ expire_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The time when the certificate expires in `RFC
+ 3339 <https://tools.ietf.org/html/rfc3339>`__ format, for
+ example ``2020-05-18T00:00:00.094Z``.
+ sha1_fingerprint (str):
+ Sha1 Fingerprint of the certificate.
+ """
+
+ serial_number: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ cert: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ create_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=3,
+ message=timestamp_pb2.Timestamp,
+ )
+ expire_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=4,
+ message=timestamp_pb2.Timestamp,
+ )
+ sha1_fingerprint: str = proto.Field(
+ proto.STRING,
+ number=5,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-redis/noxfile.py b/packages/google-cloud-redis/noxfile.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/noxfile.py
@@ -0,0 +1,428 @@
+# -*- coding: utf-8 -*-
+#
+# Copyright 2018 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# Generated by synthtool. DO NOT EDIT!
+
+from __future__ import absolute_import
+
+import os
+import pathlib
+import re
+import shutil
+import warnings
+
+import nox
+
+BLACK_VERSION = "black==22.3.0"
+ISORT_VERSION = "isort==5.10.1"
+LINT_PATHS = ["docs", "google", "tests", "noxfile.py", "setup.py"]
+
+DEFAULT_PYTHON_VERSION = "3.9"
+
+UNIT_TEST_PYTHON_VERSIONS = ["3.7", "3.8", "3.9", "3.10", "3.11"]
+UNIT_TEST_STANDARD_DEPENDENCIES = [
+ "mock",
+ "asyncmock",
+ "pytest",
+ "pytest-cov",
+ "pytest-asyncio",
+]
+UNIT_TEST_EXTERNAL_DEPENDENCIES = []
+UNIT_TEST_LOCAL_DEPENDENCIES = []
+UNIT_TEST_DEPENDENCIES = []
+UNIT_TEST_EXTRAS = []
+UNIT_TEST_EXTRAS_BY_PYTHON = {}
+
+SYSTEM_TEST_PYTHON_VERSIONS = []
+SYSTEM_TEST_STANDARD_DEPENDENCIES = [
+ "mock",
+ "pytest",
+ "google-cloud-testutils",
+]
+SYSTEM_TEST_EXTERNAL_DEPENDENCIES = []
+SYSTEM_TEST_LOCAL_DEPENDENCIES = []
+SYSTEM_TEST_DEPENDENCIES = []
+SYSTEM_TEST_EXTRAS = []
+SYSTEM_TEST_EXTRAS_BY_PYTHON = {}
+
+CURRENT_DIRECTORY = pathlib.Path(__file__).parent.absolute()
+
+# 'docfx' is excluded since it only needs to run in 'docs-presubmit'
+nox.options.sessions = [
+ "unit",
+ "system",
+ "cover",
+ "lint",
+ "lint_setup_py",
+ "blacken",
+ "docs",
+]
+
+# Error if a python version is missing
+nox.options.error_on_missing_interpreters = True
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def lint(session):
+ """Run linters.
+
+ Returns a failure if the linters find linting errors or sufficiently
+ serious code quality issues.
+ """
+ session.install("flake8", BLACK_VERSION)
+ session.run(
+ "black",
+ "--check",
+ *LINT_PATHS,
+ )
+ session.run("flake8", "google", "tests")
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def blacken(session):
+ """Run black. Format code to uniform standard."""
+ session.install(BLACK_VERSION)
+ session.run(
+ "black",
+ *LINT_PATHS,
+ )
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def format(session):
+ """
+ Run isort to sort imports. Then run black
+ to format code to uniform standard.
+ """
+ session.install(BLACK_VERSION, ISORT_VERSION)
+ # Use the --fss option to sort imports using strict alphabetical order.
+ # See https://pycqa.github.io/isort/docs/configuration/options.html#force-sort-within-sections
+ session.run(
+ "isort",
+ "--fss",
+ *LINT_PATHS,
+ )
+ session.run(
+ "black",
+ *LINT_PATHS,
+ )
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def lint_setup_py(session):
+ """Verify that setup.py is valid (including RST check)."""
+ session.install("docutils", "pygments")
+ session.run("python", "setup.py", "check", "--restructuredtext", "--strict")
+
+
+def install_unittest_dependencies(session, *constraints):
+ standard_deps = UNIT_TEST_STANDARD_DEPENDENCIES + UNIT_TEST_DEPENDENCIES
+ session.install(*standard_deps, *constraints)
+
+ if UNIT_TEST_EXTERNAL_DEPENDENCIES:
+ warnings.warn(
+ "'unit_test_external_dependencies' is deprecated. Instead, please "
+ "use 'unit_test_dependencies' or 'unit_test_local_dependencies'.",
+ DeprecationWarning,
+ )
+ session.install(*UNIT_TEST_EXTERNAL_DEPENDENCIES, *constraints)
+
+ if UNIT_TEST_LOCAL_DEPENDENCIES:
+ session.install(*UNIT_TEST_LOCAL_DEPENDENCIES, *constraints)
+
+ if UNIT_TEST_EXTRAS_BY_PYTHON:
+ extras = UNIT_TEST_EXTRAS_BY_PYTHON.get(session.python, [])
+ elif UNIT_TEST_EXTRAS:
+ extras = UNIT_TEST_EXTRAS
+ else:
+ extras = []
+
+ if extras:
+ session.install("-e", f".[{','.join(extras)}]", *constraints)
+ else:
+ session.install("-e", ".", *constraints)
+
+
+def default(session):
+ # Install all test dependencies, then install this package in-place.
+
+ constraints_path = str(
+ CURRENT_DIRECTORY / "testing" / f"constraints-{session.python}.txt"
+ )
+ install_unittest_dependencies(session, "-c", constraints_path)
+
+ # Run py.test against the unit tests.
+ session.run(
+ "py.test",
+ "--quiet",
+ f"--junitxml=unit_{session.python}_sponge_log.xml",
+ "--cov=google",
+ "--cov=tests/unit",
+ "--cov-append",
+ "--cov-config=.coveragerc",
+ "--cov-report=",
+ "--cov-fail-under=0",
+ os.path.join("tests", "unit"),
+ *session.posargs,
+ )
+
+
+@nox.session(python=UNIT_TEST_PYTHON_VERSIONS)
+def unit(session):
+ """Run the unit test suite."""
+ default(session)
+
+
+def install_systemtest_dependencies(session, *constraints):
+
+ # Use pre-release gRPC for system tests.
+ # Exclude version 1.52.0rc1 which has a known issue.
+ # See https://github.com/grpc/grpc/issues/32163
+ session.install("--pre", "grpcio!=1.52.0rc1")
+
+ session.install(*SYSTEM_TEST_STANDARD_DEPENDENCIES, *constraints)
+
+ if SYSTEM_TEST_EXTERNAL_DEPENDENCIES:
+ session.install(*SYSTEM_TEST_EXTERNAL_DEPENDENCIES, *constraints)
+
+ if SYSTEM_TEST_LOCAL_DEPENDENCIES:
+ session.install("-e", *SYSTEM_TEST_LOCAL_DEPENDENCIES, *constraints)
+
+ if SYSTEM_TEST_DEPENDENCIES:
+ session.install("-e", *SYSTEM_TEST_DEPENDENCIES, *constraints)
+
+ if SYSTEM_TEST_EXTRAS_BY_PYTHON:
+ extras = SYSTEM_TEST_EXTRAS_BY_PYTHON.get(session.python, [])
+ elif SYSTEM_TEST_EXTRAS:
+ extras = SYSTEM_TEST_EXTRAS
+ else:
+ extras = []
+
+ if extras:
+ session.install("-e", f".[{','.join(extras)}]", *constraints)
+ else:
+ session.install("-e", ".", *constraints)
+
+
+@nox.session(python=SYSTEM_TEST_PYTHON_VERSIONS)
+def system(session):
+ """Run the system test suite."""
+ constraints_path = str(
+ CURRENT_DIRECTORY / "testing" / f"constraints-{session.python}.txt"
+ )
+ system_test_path = os.path.join("tests", "system.py")
+ system_test_folder_path = os.path.join("tests", "system")
+
+ # Check the value of `RUN_SYSTEM_TESTS` env var. It defaults to true.
+ if os.environ.get("RUN_SYSTEM_TESTS", "true") == "false":
+ session.skip("RUN_SYSTEM_TESTS is set to false, skipping")
+ # Install pyopenssl for mTLS testing.
+ if os.environ.get("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false") == "true":
+ session.install("pyopenssl")
+
+ system_test_exists = os.path.exists(system_test_path)
+ system_test_folder_exists = os.path.exists(system_test_folder_path)
+ # Sanity check: only run tests if found.
+ if not system_test_exists and not system_test_folder_exists:
+ session.skip("System tests were not found")
+
+ install_systemtest_dependencies(session, "-c", constraints_path)
+
+ # Run py.test against the system tests.
+ if system_test_exists:
+ session.run(
+ "py.test",
+ "--quiet",
+ f"--junitxml=system_{session.python}_sponge_log.xml",
+ system_test_path,
+ *session.posargs,
+ )
+ if system_test_folder_exists:
+ session.run(
+ "py.test",
+ "--quiet",
+ f"--junitxml=system_{session.python}_sponge_log.xml",
+ system_test_folder_path,
+ *session.posargs,
+ )
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def cover(session):
+ """Run the final coverage report.
+
+ This outputs the coverage report aggregating coverage from the unit
+ test runs (not system test runs), and then erases coverage data.
+ """
+ session.install("coverage", "pytest-cov")
+ session.run("coverage", "report", "--show-missing", "--fail-under=100")
+
+ session.run("coverage", "erase")
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def docs(session):
+ """Build the docs for this library."""
+
+ session.install("-e", ".")
+ session.install(
+ "sphinx==4.0.1",
+ "alabaster",
+ "recommonmark",
+ )
+
+ shutil.rmtree(os.path.join("docs", "_build"), ignore_errors=True)
+ session.run(
+ "sphinx-build",
+ "-W", # warnings as errors
+ "-T", # show full traceback on exception
+ "-N", # no colors
+ "-b",
+ "html",
+ "-d",
+ os.path.join("docs", "_build", "doctrees", ""),
+ os.path.join("docs", ""),
+ os.path.join("docs", "_build", "html", ""),
+ )
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def docfx(session):
+ """Build the docfx yaml files for this library."""
+
+ session.install("-e", ".")
+ session.install(
+ "sphinx==4.0.1",
+ "alabaster",
+ "recommonmark",
+ "gcp-sphinx-docfx-yaml",
+ )
+
+ shutil.rmtree(os.path.join("docs", "_build"), ignore_errors=True)
+ session.run(
+ "sphinx-build",
+ "-T", # show full traceback on exception
+ "-N", # no colors
+ "-D",
+ (
+ "extensions=sphinx.ext.autodoc,"
+ "sphinx.ext.autosummary,"
+ "docfx_yaml.extension,"
+ "sphinx.ext.intersphinx,"
+ "sphinx.ext.coverage,"
+ "sphinx.ext.napoleon,"
+ "sphinx.ext.todo,"
+ "sphinx.ext.viewcode,"
+ "recommonmark"
+ ),
+ "-b",
+ "html",
+ "-d",
+ os.path.join("docs", "_build", "doctrees", ""),
+ os.path.join("docs", ""),
+ os.path.join("docs", "_build", "html", ""),
+ )
+
+
+@nox.session(python="3.11")
+def prerelease_deps(session):
+ """Run all tests with prerelease versions of dependencies installed."""
+
+ # Install all dependencies
+ session.install("-e", ".[all, tests, tracing]")
+ unit_deps_all = UNIT_TEST_STANDARD_DEPENDENCIES + UNIT_TEST_EXTERNAL_DEPENDENCIES
+ session.install(*unit_deps_all)
+ system_deps_all = (
+ SYSTEM_TEST_STANDARD_DEPENDENCIES
+ + SYSTEM_TEST_EXTERNAL_DEPENDENCIES
+ + SYSTEM_TEST_EXTRAS
+ )
+ session.install(*system_deps_all)
+
+ # Because we test minimum dependency versions on the minimum Python
+ # version, the first version we test with in the unit tests sessions has a
+ # constraints file containing all dependencies and extras.
+ with open(
+ CURRENT_DIRECTORY
+ / "testing"
+ / f"constraints-{UNIT_TEST_PYTHON_VERSIONS[0]}.txt",
+ encoding="utf-8",
+ ) as constraints_file:
+ constraints_text = constraints_file.read()
+
+ # Ignore leading whitespace and comment lines.
+ constraints_deps = [
+ match.group(1)
+ for match in re.finditer(
+ r"^\s*(\S+)(?===\S+)", constraints_text, flags=re.MULTILINE
+ )
+ ]
+
+ session.install(*constraints_deps)
+
+ prerel_deps = [
+ "protobuf",
+ # dependency of grpc
+ "six",
+ "googleapis-common-protos",
+ # Exclude version 1.52.0rc1 which has a known issue. See https://github.com/grpc/grpc/issues/32163
+ "grpcio!=1.52.0rc1",
+ "grpcio-status",
+ "google-api-core",
+ "proto-plus",
+ "google-cloud-testutils",
+ # dependencies of google-cloud-testutils"
+ "click",
+ ]
+
+ for dep in prerel_deps:
+ session.install("--pre", "--no-deps", "--upgrade", dep)
+
+ # Remaining dependencies
+ other_deps = [
+ "requests",
+ "google-auth",
+ ]
+ session.install(*other_deps)
+
+ # Print out prerelease package versions
+ session.run(
+ "python", "-c", "import google.protobuf; print(google.protobuf.__version__)"
+ )
+ session.run("python", "-c", "import grpc; print(grpc.__version__)")
+
+ session.run("py.test", "tests/unit")
+
+ system_test_path = os.path.join("tests", "system.py")
+ system_test_folder_path = os.path.join("tests", "system")
+
+ # Only run system tests if found.
+ if os.path.exists(system_test_path):
+ session.run(
+ "py.test",
+ "--verbose",
+ f"--junitxml=system_{session.python}_sponge_log.xml",
+ system_test_path,
+ *session.posargs,
+ )
+ if os.path.exists(system_test_folder_path):
+ session.run(
+ "py.test",
+ "--verbose",
+ f"--junitxml=system_{session.python}_sponge_log.xml",
+ system_test_folder_path,
+ *session.posargs,
+ )
diff --git a/packages/google-cloud-redis/scripts/fixup_redis_v1_keywords.py b/packages/google-cloud-redis/scripts/fixup_redis_v1_keywords.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/scripts/fixup_redis_v1_keywords.py
@@ -0,0 +1,186 @@
+#! /usr/bin/env python3
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import argparse
+import os
+import libcst as cst
+import pathlib
+import sys
+from typing import (Any, Callable, Dict, List, Sequence, Tuple)
+
+
+def partition(
+ predicate: Callable[[Any], bool],
+ iterator: Sequence[Any]
+) -> Tuple[List[Any], List[Any]]:
+ """A stable, out-of-place partition."""
+ results = ([], [])
+
+ for i in iterator:
+ results[int(predicate(i))].append(i)
+
+ # Returns trueList, falseList
+ return results[1], results[0]
+
+
+class redisCallTransformer(cst.CSTTransformer):
+ CTRL_PARAMS: Tuple[str] = ('retry', 'timeout', 'metadata')
+ METHOD_TO_PARAMS: Dict[str, Tuple[str]] = {
+ 'create_instance': ('parent', 'instance_id', 'instance', ),
+ 'delete_instance': ('name', ),
+ 'export_instance': ('name', 'output_config', ),
+ 'failover_instance': ('name', 'data_protection_mode', ),
+ 'get_instance': ('name', ),
+ 'get_instance_auth_string': ('name', ),
+ 'import_instance': ('name', 'input_config', ),
+ 'list_instances': ('parent', 'page_size', 'page_token', ),
+ 'reschedule_maintenance': ('name', 'reschedule_type', 'schedule_time', ),
+ 'update_instance': ('update_mask', 'instance', ),
+ 'upgrade_instance': ('name', 'redis_version', ),
+ }
+
+ def leave_Call(self, original: cst.Call, updated: cst.Call) -> cst.CSTNode:
+ try:
+ key = original.func.attr.value
+ kword_params = self.METHOD_TO_PARAMS[key]
+ except (AttributeError, KeyError):
+ # Either not a method from the API or too convoluted to be sure.
+ return updated
+
+ # If the existing code is valid, keyword args come after positional args.
+ # Therefore, all positional args must map to the first parameters.
+ args, kwargs = partition(lambda a: not bool(a.keyword), updated.args)
+ if any(k.keyword.value == "request" for k in kwargs):
+ # We've already fixed this file, don't fix it again.
+ return updated
+
+ kwargs, ctrl_kwargs = partition(
+ lambda a: a.keyword.value not in self.CTRL_PARAMS,
+ kwargs
+ )
+
+ args, ctrl_args = args[:len(kword_params)], args[len(kword_params):]
+ ctrl_kwargs.extend(cst.Arg(value=a.value, keyword=cst.Name(value=ctrl))
+ for a, ctrl in zip(ctrl_args, self.CTRL_PARAMS))
+
+ request_arg = cst.Arg(
+ value=cst.Dict([
+ cst.DictElement(
+ cst.SimpleString("'{}'".format(name)),
+                        cst.Element(value=arg.value)
+ )
+ # Note: the args + kwargs looks silly, but keep in mind that
+ # the control parameters had to be stripped out, and that
+ # those could have been passed positionally or by keyword.
+ for name, arg in zip(kword_params, args + kwargs)]),
+ keyword=cst.Name("request")
+ )
+
+ return updated.with_changes(
+ args=[request_arg] + ctrl_kwargs
+ )
+
+
+def fix_files(
+ in_dir: pathlib.Path,
+ out_dir: pathlib.Path,
+ *,
+ transformer=redisCallTransformer(),
+):
+ """Duplicate the input dir to the output dir, fixing file method calls.
+
+ Preconditions:
+ * in_dir is a real directory
+ * out_dir is a real, empty directory
+ """
+ pyfile_gen = (
+ pathlib.Path(os.path.join(root, f))
+ for root, _, files in os.walk(in_dir)
+ for f in files if os.path.splitext(f)[1] == ".py"
+ )
+
+ for fpath in pyfile_gen:
+ with open(fpath, 'r') as f:
+ src = f.read()
+
+ # Parse the code and insert method call fixes.
+ tree = cst.parse_module(src)
+ updated = tree.visit(transformer)
+
+ # Create the path and directory structure for the new file.
+ updated_path = out_dir.joinpath(fpath.relative_to(in_dir))
+ updated_path.parent.mkdir(parents=True, exist_ok=True)
+
+ # Generate the updated source file at the corresponding path.
+ with open(updated_path, 'w') as f:
+ f.write(updated.code)
+
+
+if __name__ == '__main__':
+ parser = argparse.ArgumentParser(
+ description="""Fix up source that uses the redis client library.
+
+The existing sources are NOT overwritten but are copied to output_dir with changes made.
+
+Note: This tool operates at a best-effort level at converting positional
+ parameters in client method calls to keyword based parameters.
+ Cases where it WILL FAIL include
+ A) * or ** expansion in a method call.
+ B) Calls via function or method alias (includes free function calls)
+ C) Indirect or dispatched calls (e.g. the method is looked up dynamically)
+
+ These all constitute false negatives. The tool will also detect false
+ positives when an API method shares a name with another method.
+""")
+ parser.add_argument(
+ '-d',
+ '--input-directory',
+ required=True,
+ dest='input_dir',
+ help='the input directory to walk for python files to fix up',
+ )
+ parser.add_argument(
+ '-o',
+ '--output-directory',
+ required=True,
+ dest='output_dir',
+ help='the directory to output files fixed via un-flattening',
+ )
+ args = parser.parse_args()
+ input_dir = pathlib.Path(args.input_dir)
+ output_dir = pathlib.Path(args.output_dir)
+ if not input_dir.is_dir():
+ print(
+ f"input directory '{input_dir}' does not exist or is not a directory",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ if not output_dir.is_dir():
+ print(
+ f"output directory '{output_dir}' does not exist or is not a directory",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ if os.listdir(output_dir):
+ print(
+ f"output directory '{output_dir}' is not empty",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ fix_files(input_dir, output_dir)
diff --git a/packages/google-cloud-redis/scripts/fixup_redis_v1beta1_keywords.py b/packages/google-cloud-redis/scripts/fixup_redis_v1beta1_keywords.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/scripts/fixup_redis_v1beta1_keywords.py
@@ -0,0 +1,186 @@
+#! /usr/bin/env python3
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import argparse
+import os
+import libcst as cst
+import pathlib
+import sys
+from typing import (Any, Callable, Dict, List, Sequence, Tuple)
+
+
+def partition(
+ predicate: Callable[[Any], bool],
+ iterator: Sequence[Any]
+) -> Tuple[List[Any], List[Any]]:
+ """A stable, out-of-place partition."""
+ results = ([], [])
+
+ for i in iterator:
+ results[int(predicate(i))].append(i)
+
+ # Returns trueList, falseList
+ return results[1], results[0]
+
+
+class redisCallTransformer(cst.CSTTransformer):
+ CTRL_PARAMS: Tuple[str] = ('retry', 'timeout', 'metadata')
+ METHOD_TO_PARAMS: Dict[str, Tuple[str]] = {
+ 'create_instance': ('parent', 'instance_id', 'instance', ),
+ 'delete_instance': ('name', ),
+ 'export_instance': ('name', 'output_config', ),
+ 'failover_instance': ('name', 'data_protection_mode', ),
+ 'get_instance': ('name', ),
+ 'get_instance_auth_string': ('name', ),
+ 'import_instance': ('name', 'input_config', ),
+ 'list_instances': ('parent', 'page_size', 'page_token', ),
+ 'reschedule_maintenance': ('name', 'reschedule_type', 'schedule_time', ),
+ 'update_instance': ('update_mask', 'instance', ),
+ 'upgrade_instance': ('name', 'redis_version', ),
+ }
+
+ def leave_Call(self, original: cst.Call, updated: cst.Call) -> cst.CSTNode:
+ try:
+ key = original.func.attr.value
+ kword_params = self.METHOD_TO_PARAMS[key]
+ except (AttributeError, KeyError):
+ # Either not a method from the API or too convoluted to be sure.
+ return updated
+
+ # If the existing code is valid, keyword args come after positional args.
+ # Therefore, all positional args must map to the first parameters.
+ args, kwargs = partition(lambda a: not bool(a.keyword), updated.args)
+ if any(k.keyword.value == "request" for k in kwargs):
+ # We've already fixed this file, don't fix it again.
+ return updated
+
+ kwargs, ctrl_kwargs = partition(
+ lambda a: a.keyword.value not in self.CTRL_PARAMS,
+ kwargs
+ )
+
+ args, ctrl_args = args[:len(kword_params)], args[len(kword_params):]
+ ctrl_kwargs.extend(cst.Arg(value=a.value, keyword=cst.Name(value=ctrl))
+ for a, ctrl in zip(ctrl_args, self.CTRL_PARAMS))
+
+ request_arg = cst.Arg(
+ value=cst.Dict([
+ cst.DictElement(
+ cst.SimpleString("'{}'".format(name)),
+                        cst.Element(value=arg.value)
+ )
+ # Note: the args + kwargs looks silly, but keep in mind that
+ # the control parameters had to be stripped out, and that
+ # those could have been passed positionally or by keyword.
+ for name, arg in zip(kword_params, args + kwargs)]),
+ keyword=cst.Name("request")
+ )
+
+ return updated.with_changes(
+ args=[request_arg] + ctrl_kwargs
+ )
+
+
+def fix_files(
+ in_dir: pathlib.Path,
+ out_dir: pathlib.Path,
+ *,
+ transformer=redisCallTransformer(),
+):
+ """Duplicate the input dir to the output dir, fixing file method calls.
+
+ Preconditions:
+ * in_dir is a real directory
+ * out_dir is a real, empty directory
+ """
+ pyfile_gen = (
+ pathlib.Path(os.path.join(root, f))
+ for root, _, files in os.walk(in_dir)
+ for f in files if os.path.splitext(f)[1] == ".py"
+ )
+
+ for fpath in pyfile_gen:
+ with open(fpath, 'r') as f:
+ src = f.read()
+
+ # Parse the code and insert method call fixes.
+ tree = cst.parse_module(src)
+ updated = tree.visit(transformer)
+
+ # Create the path and directory structure for the new file.
+ updated_path = out_dir.joinpath(fpath.relative_to(in_dir))
+ updated_path.parent.mkdir(parents=True, exist_ok=True)
+
+ # Generate the updated source file at the corresponding path.
+ with open(updated_path, 'w') as f:
+ f.write(updated.code)
+
+
+if __name__ == '__main__':
+ parser = argparse.ArgumentParser(
+ description="""Fix up source that uses the redis client library.
+
+The existing sources are NOT overwritten but are copied to output_dir with changes made.
+
+Note: This tool operates at a best-effort level at converting positional
+ parameters in client method calls to keyword based parameters.
+ Cases where it WILL FAIL include
+ A) * or ** expansion in a method call.
+ B) Calls via function or method alias (includes free function calls)
+ C) Indirect or dispatched calls (e.g. the method is looked up dynamically)
+
+ These all constitute false negatives. The tool will also detect false
+ positives when an API method shares a name with another method.
+""")
+ parser.add_argument(
+ '-d',
+ '--input-directory',
+ required=True,
+ dest='input_dir',
+ help='the input directory to walk for python files to fix up',
+ )
+ parser.add_argument(
+ '-o',
+ '--output-directory',
+ required=True,
+ dest='output_dir',
+ help='the directory to output files fixed via un-flattening',
+ )
+ args = parser.parse_args()
+ input_dir = pathlib.Path(args.input_dir)
+ output_dir = pathlib.Path(args.output_dir)
+ if not input_dir.is_dir():
+ print(
+ f"input directory '{input_dir}' does not exist or is not a directory",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ if not output_dir.is_dir():
+ print(
+ f"output directory '{output_dir}' does not exist or is not a directory",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ if os.listdir(output_dir):
+ print(
+ f"output directory '{output_dir}' is not empty",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ fix_files(input_dir, output_dir)
diff --git a/packages/google-cloud-redis/scripts/readme-gen/readme_gen.py b/packages/google-cloud-redis/scripts/readme-gen/readme_gen.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/scripts/readme-gen/readme_gen.py
@@ -0,0 +1,69 @@
+#!/usr/bin/env python
+
+# Copyright 2016 Google Inc
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Generates READMEs using configuration defined in yaml."""
+
+import argparse
+import io
+import os
+import subprocess
+
+import jinja2
+import yaml
+
+
+jinja_env = jinja2.Environment(
+ trim_blocks=True,
+ loader=jinja2.FileSystemLoader(
+ os.path.abspath(os.path.join(os.path.dirname(__file__), "templates"))
+ ),
+ autoescape=True,
+)
+
+README_TMPL = jinja_env.get_template('README.tmpl.rst')
+
+
+def get_help(file):
+ return subprocess.check_output(['python', file, '--help']).decode()
+
+
+def main():
+ parser = argparse.ArgumentParser()
+ parser.add_argument('source')
+ parser.add_argument('--destination', default='README.rst')
+
+ args = parser.parse_args()
+
+ source = os.path.abspath(args.source)
+ root = os.path.dirname(source)
+ destination = os.path.join(root, args.destination)
+
+ jinja_env.globals['get_help'] = get_help
+
+ with io.open(source, 'r') as f:
+    config = yaml.safe_load(f)
+
+ # This allows get_help to execute in the right directory.
+ os.chdir(root)
+
+ output = README_TMPL.render(config)
+
+ with io.open(destination, 'w') as f:
+ f.write(output)
+
+
+if __name__ == '__main__':
+ main()
diff --git a/packages/google-cloud-redis/setup.py b/packages/google-cloud-redis/setup.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-redis/setup.py
@@ -0,0 +1,90 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import io
+import os
+
+import setuptools # type: ignore
+
+package_root = os.path.abspath(os.path.dirname(__file__))
+
+name = "google-cloud-redis"
+
+
+description = "Google Cloud Redis API client library"
+
+version = {}
+with open(os.path.join(package_root, "google/cloud/redis/gapic_version.py")) as fp:
+ exec(fp.read(), version)
+version = version["__version__"]
+
+if version[0] == "0":
+ release_status = "Development Status :: 4 - Beta"
+else:
+ release_status = "Development Status :: 5 - Production/Stable"
+
+dependencies = [
+ "google-api-core[grpc] >= 1.34.0, <3.0.0dev,!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,!=2.10.*",
+ "proto-plus >= 1.22.0, <2.0.0dev",
+ "proto-plus >= 1.22.2, <2.0.0dev; python_version>='3.11'",
+ "protobuf>=3.19.5,<5.0.0dev,!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5",
+]
+url = "https://github.com/googleapis/google-cloud-python"
+
+package_root = os.path.abspath(os.path.dirname(__file__))
+
+readme_filename = os.path.join(package_root, "README.rst")
+with io.open(readme_filename, encoding="utf-8") as readme_file:
+ readme = readme_file.read()
+
+packages = [
+ package
+ for package in setuptools.PEP420PackageFinder.find()
+ if package.startswith("google")
+]
+
+namespaces = ["google", "google.cloud"]
+
+setuptools.setup(
+ name=name,
+ version=version,
+ description=description,
+ long_description=readme,
+ author="Google LLC",
+ author_email="googleapis-packages@google.com",
+ license="Apache 2.0",
+ url=url,
+ classifiers=[
+ release_status,
+ "Intended Audience :: Developers",
+ "License :: OSI Approved :: Apache Software License",
+ "Programming Language :: Python",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3.7",
+ "Programming Language :: Python :: 3.8",
+ "Programming Language :: Python :: 3.9",
+ "Programming Language :: Python :: 3.10",
+ "Programming Language :: Python :: 3.11",
+ "Operating System :: OS Independent",
+ "Topic :: Internet",
+ ],
+ platforms="Posix; MacOS X; Windows",
+ packages=packages,
+ python_requires=">=3.7",
+ namespace_packages=namespaces,
+ install_requires=dependencies,
+ include_package_data=True,
+ zip_safe=False,
+)
| AttributeError: url on Storage Exception when key not found
When attempting to get a key that does not exist, the exception handler for `NotFoundError` tries to reference `response.url`, which does not exist on the `httplib2.Response` object.
``` py
Traceback (most recent call last):
[...]
file_key = self.bucket.get_key(path)
File "gcloud/storage/bucket.py", line 83, in get_key
response = self.connection.api_request(method='GET', path=key.path)
File "gcloud/storage/connection.py", line 212, in api_request
raise exceptions.NotFoundError(response, content)
File "gcloud/storage/exceptions.py", line 17, in __init__
self.message = 'GET %s returned a 404.' % (response.url)
File "httplib2/__init__.py", line 1680, in __getattr__
raise AttributeError, name
AttributeError: url
```
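The failure mode in the traceback above can be sketched without httplib2 at all. This stand-in class is illustrative (an assumption about httplib2's behavior, not its actual code): `httplib2.Response` subclasses `dict`, and attribute lookup for names that were never populated falls through to a `__getattr__` that raises `AttributeError` — which is exactly what happens to `response.url` here.

```python
# Illustrative stand-in for httplib2.Response (assumption: the real class is
# a dict subclass whose __getattr__ raises AttributeError for unknown names).
class FakeResponse(dict):
    def __init__(self, info):
        super().__init__(info)
        # httplib2 populates a few real attributes, e.g. the status code.
        self.status = int(info.get("status", 200))

    def __getattr__(self, name):
        # Only called when normal attribute lookup fails.
        raise AttributeError(name)


resp = FakeResponse({"status": "404", "content-type": "application/json"})
print(resp.status)    # real instance attribute, set in __init__
print("url" in resp)  # plain dict membership is always safe
try:
    resp.url          # attribute never populated -> AttributeError, as in the issue
except AttributeError as exc:
    print("AttributeError:", exc)
```

Accessing the response as a mapping (`resp.get(...)`, `"url" in resp`) never raises, which is why the fix discussed below moves away from attribute access.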
| Thanks for this report. I'm having a look.
| It is actually an httplib2.Response instance, which is a dict with some dynamically looked-up attributes; we can't guarantee any given attribute name is present. Likely we should print the whole dict instead of trying to look up the `url`.
| 2023-06-05T16:32:52Z | [] | [] |
Traceback (most recent call last):
[...]
file_key = self.bucket.get_key(path)
File "gcloud/storage/bucket.py", line 83, in get_key
response = self.connection.api_request(method='GET', path=key.path)
File "gcloud/storage/connection.py", line 212, in api_request
raise exceptions.NotFoundError(response, content)
File "gcloud/storage/exceptions.py", line 17, in __init__
self.message = 'GET %s returned a 404.' % (response.url)
File "httplib2/__init__.py", line 1680, in __getattr__
raise AttributeError, name
AttributeError: url
| 5,724 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-11340 | 3a6894e831350094a2b4bf12bdca63b484b3da15 | diff --git a/packages/google-cloud-resource-manager/docs/conf.py b/packages/google-cloud-resource-manager/docs/conf.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/docs/conf.py
@@ -0,0 +1,384 @@
+# -*- coding: utf-8 -*-
+# Copyright 2021 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+# google-cloud-resource-manager documentation build configuration file
+#
+# This file is execfile()d with the current directory set to its
+# containing dir.
+#
+# Note that not all possible configuration values are present in this
+# autogenerated file.
+#
+# All configuration values have a default; values that are commented out
+# serve to show the default.
+
+import os
+import shlex
+import sys
+
+# If extensions (or modules to document with autodoc) are in another directory,
+# add these directories to sys.path here. If the directory is relative to the
+# documentation root, use os.path.abspath to make it absolute, like shown here.
+sys.path.insert(0, os.path.abspath(".."))
+
+# For plugins that can not read conf.py.
+# See also: https://github.com/docascode/sphinx-docfx-yaml/issues/85
+sys.path.insert(0, os.path.abspath("."))
+
+__version__ = ""
+
+# -- General configuration ------------------------------------------------
+
+# If your documentation needs a minimal Sphinx version, state it here.
+needs_sphinx = "1.5.5"
+
+# Add any Sphinx extension module names here, as strings. They can be
+# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
+# ones.
+extensions = [
+ "sphinx.ext.autodoc",
+ "sphinx.ext.autosummary",
+ "sphinx.ext.intersphinx",
+ "sphinx.ext.coverage",
+ "sphinx.ext.doctest",
+ "sphinx.ext.napoleon",
+ "sphinx.ext.todo",
+ "sphinx.ext.viewcode",
+ "recommonmark",
+]
+
+# autodoc/autosummary flags
+autoclass_content = "both"
+autodoc_default_options = {"members": True}
+autosummary_generate = True
+
+
+# Add any paths that contain templates here, relative to this directory.
+templates_path = ["_templates"]
+
+# The suffix(es) of source filenames.
+# You can specify multiple suffix as a list of string:
+# source_suffix = ['.rst', '.md']
+source_suffix = [".rst", ".md"]
+
+# The encoding of source files.
+# source_encoding = 'utf-8-sig'
+
+# The root toctree document.
+root_doc = "index"
+
+# General information about the project.
+project = "google-cloud-resource-manager"
+copyright = "2019, Google"
+author = "Google APIs"
+
+# The version info for the project you're documenting, acts as replacement for
+# |version| and |release|, also used in various other places throughout the
+# built documents.
+#
+# The full version, including alpha/beta/rc tags.
+release = __version__
+# The short X.Y version.
+version = ".".join(release.split(".")[0:2])
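+The short X.Y version is sliced out of the full release string; a standalone sketch (the release value here is illustrative, since `__version__` is empty at this point in `conf.py`):
+
+```python
+# Derive the short X.Y version from a full release string, as conf.py does.
+release = "1.10.1"  # illustrative value; conf.py reads this from __version__
+version = ".".join(release.split(".")[0:2])
+print(version)  # 1.10
+```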
+
+# The language for content autogenerated by Sphinx. Refer to documentation
+# for a list of supported languages.
+#
+# This is also used if you do content translation via gettext catalogs.
+# Usually you set "language" from the command line for these cases.
+language = None
+
+# There are two options for replacing |today|: either, you set today to some
+# non-false value, then it is used:
+# today = ''
+# Else, today_fmt is used as the format for a strftime call.
+# today_fmt = '%B %d, %Y'
+
+# List of patterns, relative to source directory, that match files and
+# directories to ignore when looking for source files.
+exclude_patterns = [
+ "_build",
+ "**/.nox/**/*",
+ "samples/AUTHORING_GUIDE.md",
+ "samples/CONTRIBUTING.md",
+ "samples/snippets/README.rst",
+]
+
+# The reST default role (used for this markup: `text`) to use for all
+# documents.
+# default_role = None
+
+# If true, '()' will be appended to :func: etc. cross-reference text.
+# add_function_parentheses = True
+
+# If true, the current module name will be prepended to all description
+# unit titles (such as .. function::).
+# add_module_names = True
+
+# If true, sectionauthor and moduleauthor directives will be shown in the
+# output. They are ignored by default.
+# show_authors = False
+
+# The name of the Pygments (syntax highlighting) style to use.
+pygments_style = "sphinx"
+
+# A list of ignored prefixes for module index sorting.
+# modindex_common_prefix = []
+
+# If true, keep warnings as "system message" paragraphs in the built documents.
+# keep_warnings = False
+
+# If true, `todo` and `todoList` produce output, else they produce nothing.
+todo_include_todos = True
+
+
+# -- Options for HTML output ----------------------------------------------
+
+# The theme to use for HTML and HTML Help pages. See the documentation for
+# a list of builtin themes.
+html_theme = "alabaster"
+
+# Theme options are theme-specific and customize the look and feel of a theme
+# further. For a list of options available for each theme, see the
+# documentation.
+html_theme_options = {
+ "description": "Google Cloud Client Libraries for google-cloud-resource-manager",
+ "github_user": "googleapis",
+ "github_repo": "python-resource-manager",
+ "github_banner": True,
+ "font_family": "'Roboto', Georgia, sans",
+ "head_font_family": "'Roboto', Georgia, serif",
+ "code_font_family": "'Roboto Mono', 'Consolas', monospace",
+}
+
+# Add any paths that contain custom themes here, relative to this directory.
+# html_theme_path = []
+
+# The name for this set of Sphinx documents. If None, it defaults to
+# "<project> v<release> documentation".
+# html_title = None
+
+# A shorter title for the navigation bar. Default is the same as html_title.
+# html_short_title = None
+
+# The name of an image file (relative to this directory) to place at the top
+# of the sidebar.
+# html_logo = None
+
+# The name of an image file (within the static path) to use as favicon of the
+# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
+# pixels large.
+# html_favicon = None
+
+# Add any paths that contain custom static files (such as style sheets) here,
+# relative to this directory. They are copied after the builtin static files,
+# so a file named "default.css" will overwrite the builtin "default.css".
+html_static_path = ["_static"]
+
+# Add any extra paths that contain custom files (such as robots.txt or
+# .htaccess) here, relative to this directory. These files are copied
+# directly to the root of the documentation.
+# html_extra_path = []
+
+# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
+# using the given strftime format.
+# html_last_updated_fmt = '%b %d, %Y'
+
+# If true, SmartyPants will be used to convert quotes and dashes to
+# typographically correct entities.
+# html_use_smartypants = True
+
+# Custom sidebar templates, maps document names to template names.
+# html_sidebars = {}
+
+# Additional templates that should be rendered to pages, maps page names to
+# template names.
+# html_additional_pages = {}
+
+# If false, no module index is generated.
+# html_domain_indices = True
+
+# If false, no index is generated.
+# html_use_index = True
+
+# If true, the index is split into individual pages for each letter.
+# html_split_index = False
+
+# If true, links to the reST sources are added to the pages.
+# html_show_sourcelink = True
+
+# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
+# html_show_sphinx = True
+
+# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
+# html_show_copyright = True
+
+# If true, an OpenSearch description file will be output, and all pages will
+# contain a <link> tag referring to it. The value of this option must be the
+# base URL from which the finished HTML is served.
+# html_use_opensearch = ''
+
+# This is the file name suffix for HTML files (e.g. ".xhtml").
+# html_file_suffix = None
+
+# Language to be used for generating the HTML full-text search index.
+# Sphinx supports the following languages:
+# 'da', 'de', 'en', 'es', 'fi', 'fr', 'hu', 'it', 'ja'
+# 'nl', 'no', 'pt', 'ro', 'ru', 'sv', 'tr'
+# html_search_language = 'en'
+
+# A dictionary with options for the search language support, empty by default.
+# Now only 'ja' uses this config value
+# html_search_options = {'type': 'default'}
+
+# The name of a javascript file (relative to the configuration directory) that
+# implements a search results scorer. If empty, the default will be used.
+# html_search_scorer = 'scorer.js'
+
+# Output file base name for HTML help builder.
+htmlhelp_basename = "google-cloud-resource-manager-doc"
+
+# -- Options for warnings ------------------------------------------------------
+
+
+suppress_warnings = [
+ # Temporarily suppress this to avoid "more than one target found for
+ # cross-reference" warning, which are intractable for us to avoid while in
+ # a mono-repo.
+ # See https://github.com/sphinx-doc/sphinx/blob
+ # /2a65ffeef5c107c19084fabdd706cdff3f52d93c/sphinx/domains/python.py#L843
+ "ref.python"
+]
+
+# -- Options for LaTeX output ---------------------------------------------
+
+latex_elements = {
+ # The paper size ('letterpaper' or 'a4paper').
+ #'papersize': 'letterpaper',
+ # The font size ('10pt', '11pt' or '12pt').
+ #'pointsize': '10pt',
+ # Additional stuff for the LaTeX preamble.
+ #'preamble': '',
+ # Latex figure (float) alignment
+ #'figure_align': 'htbp',
+}
+
+# Grouping the document tree into LaTeX files. List of tuples
+# (source start file, target name, title,
+# author, documentclass [howto, manual, or own class]).
+latex_documents = [
+ (
+ root_doc,
+ "google-cloud-resource-manager.tex",
+ "google-cloud-resource-manager Documentation",
+ author,
+ "manual",
+ )
+]
+
+# The name of an image file (relative to this directory) to place at the top of
+# the title page.
+# latex_logo = None
+
+# For "manual" documents, if this is true, then toplevel headings are parts,
+# not chapters.
+# latex_use_parts = False
+
+# If true, show page references after internal links.
+# latex_show_pagerefs = False
+
+# If true, show URL addresses after external links.
+# latex_show_urls = False
+
+# Documents to append as an appendix to all manuals.
+# latex_appendices = []
+
+# If false, no module index is generated.
+# latex_domain_indices = True
+
+
+# -- Options for manual page output ---------------------------------------
+
+# One entry per manual page. List of tuples
+# (source start file, name, description, authors, manual section).
+man_pages = [
+ (
+ root_doc,
+ "google-cloud-resource-manager",
+ "google-cloud-resource-manager Documentation",
+ [author],
+ 1,
+ )
+]
+
+# If true, show URL addresses after external links.
+# man_show_urls = False
+
+
+# -- Options for Texinfo output -------------------------------------------
+
+# Grouping the document tree into Texinfo files. List of tuples
+# (source start file, target name, title, author,
+# dir menu entry, description, category)
+texinfo_documents = [
+ (
+ root_doc,
+ "google-cloud-resource-manager",
+ "google-cloud-resource-manager Documentation",
+ author,
+ "google-cloud-resource-manager",
+ "google-cloud-resource-manager Library",
+ "APIs",
+ )
+]
+
+# Documents to append as an appendix to all manuals.
+# texinfo_appendices = []
+
+# If false, no module index is generated.
+# texinfo_domain_indices = True
+
+# How to display URL addresses: 'footnote', 'no', or 'inline'.
+# texinfo_show_urls = 'footnote'
+
+# If true, do not generate a @detailmenu in the "Top" node's menu.
+# texinfo_no_detailmenu = False
+
+
+# Example configuration for intersphinx: refer to the Python standard library.
+intersphinx_mapping = {
+ "python": ("https://python.readthedocs.org/en/latest/", None),
+ "google-auth": ("https://googleapis.dev/python/google-auth/latest/", None),
+ "google.api_core": (
+ "https://googleapis.dev/python/google-api-core/latest/",
+ None,
+ ),
+ "grpc": ("https://grpc.github.io/grpc/python/", None),
+ "proto-plus": ("https://proto-plus-python.readthedocs.io/en/latest/", None),
+ "protobuf": ("https://googleapis.dev/python/protobuf/latest/", None),
+}
+
+
+# Napoleon settings
+napoleon_google_docstring = True
+napoleon_numpy_docstring = True
+napoleon_include_private_with_doc = False
+napoleon_include_special_with_doc = True
+napoleon_use_admonition_for_examples = False
+napoleon_use_admonition_for_notes = False
+napoleon_use_admonition_for_references = False
+napoleon_use_ivar = False
+napoleon_use_param = True
+napoleon_use_rtype = True
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager/__init__.py
@@ -0,0 +1,239 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from google.cloud.resourcemanager import gapic_version as package_version
+
+__version__ = package_version.__version__
+
+
+from google.cloud.resourcemanager_v3.services.folders.async_client import (
+ FoldersAsyncClient,
+)
+from google.cloud.resourcemanager_v3.services.folders.client import FoldersClient
+from google.cloud.resourcemanager_v3.services.organizations.async_client import (
+ OrganizationsAsyncClient,
+)
+from google.cloud.resourcemanager_v3.services.organizations.client import (
+ OrganizationsClient,
+)
+from google.cloud.resourcemanager_v3.services.projects.async_client import (
+ ProjectsAsyncClient,
+)
+from google.cloud.resourcemanager_v3.services.projects.client import ProjectsClient
+from google.cloud.resourcemanager_v3.services.tag_bindings.async_client import (
+ TagBindingsAsyncClient,
+)
+from google.cloud.resourcemanager_v3.services.tag_bindings.client import (
+ TagBindingsClient,
+)
+from google.cloud.resourcemanager_v3.services.tag_holds.async_client import (
+ TagHoldsAsyncClient,
+)
+from google.cloud.resourcemanager_v3.services.tag_holds.client import TagHoldsClient
+from google.cloud.resourcemanager_v3.services.tag_keys.async_client import (
+ TagKeysAsyncClient,
+)
+from google.cloud.resourcemanager_v3.services.tag_keys.client import TagKeysClient
+from google.cloud.resourcemanager_v3.services.tag_values.async_client import (
+ TagValuesAsyncClient,
+)
+from google.cloud.resourcemanager_v3.services.tag_values.client import TagValuesClient
+from google.cloud.resourcemanager_v3.types.folders import (
+ CreateFolderMetadata,
+ CreateFolderRequest,
+ DeleteFolderMetadata,
+ DeleteFolderRequest,
+ Folder,
+ GetFolderRequest,
+ ListFoldersRequest,
+ ListFoldersResponse,
+ MoveFolderMetadata,
+ MoveFolderRequest,
+ SearchFoldersRequest,
+ SearchFoldersResponse,
+ UndeleteFolderMetadata,
+ UndeleteFolderRequest,
+ UpdateFolderMetadata,
+ UpdateFolderRequest,
+)
+from google.cloud.resourcemanager_v3.types.organizations import (
+ DeleteOrganizationMetadata,
+ GetOrganizationRequest,
+ Organization,
+ SearchOrganizationsRequest,
+ SearchOrganizationsResponse,
+ UndeleteOrganizationMetadata,
+)
+from google.cloud.resourcemanager_v3.types.projects import (
+ CreateProjectMetadata,
+ CreateProjectRequest,
+ DeleteProjectMetadata,
+ DeleteProjectRequest,
+ GetProjectRequest,
+ ListProjectsRequest,
+ ListProjectsResponse,
+ MoveProjectMetadata,
+ MoveProjectRequest,
+ Project,
+ SearchProjectsRequest,
+ SearchProjectsResponse,
+ UndeleteProjectMetadata,
+ UndeleteProjectRequest,
+ UpdateProjectMetadata,
+ UpdateProjectRequest,
+)
+from google.cloud.resourcemanager_v3.types.tag_bindings import (
+ CreateTagBindingMetadata,
+ CreateTagBindingRequest,
+ DeleteTagBindingMetadata,
+ DeleteTagBindingRequest,
+ EffectiveTag,
+ ListEffectiveTagsRequest,
+ ListEffectiveTagsResponse,
+ ListTagBindingsRequest,
+ ListTagBindingsResponse,
+ TagBinding,
+)
+from google.cloud.resourcemanager_v3.types.tag_holds import (
+ CreateTagHoldMetadata,
+ CreateTagHoldRequest,
+ DeleteTagHoldMetadata,
+ DeleteTagHoldRequest,
+ ListTagHoldsRequest,
+ ListTagHoldsResponse,
+ TagHold,
+)
+from google.cloud.resourcemanager_v3.types.tag_keys import (
+ CreateTagKeyMetadata,
+ CreateTagKeyRequest,
+ DeleteTagKeyMetadata,
+ DeleteTagKeyRequest,
+ GetNamespacedTagKeyRequest,
+ GetTagKeyRequest,
+ ListTagKeysRequest,
+ ListTagKeysResponse,
+ Purpose,
+ TagKey,
+ UpdateTagKeyMetadata,
+ UpdateTagKeyRequest,
+)
+from google.cloud.resourcemanager_v3.types.tag_values import (
+ CreateTagValueMetadata,
+ CreateTagValueRequest,
+ DeleteTagValueMetadata,
+ DeleteTagValueRequest,
+ GetNamespacedTagValueRequest,
+ GetTagValueRequest,
+ ListTagValuesRequest,
+ ListTagValuesResponse,
+ TagValue,
+ UpdateTagValueMetadata,
+ UpdateTagValueRequest,
+)
+
+__all__ = (
+ "FoldersClient",
+ "FoldersAsyncClient",
+ "OrganizationsClient",
+ "OrganizationsAsyncClient",
+ "ProjectsClient",
+ "ProjectsAsyncClient",
+ "TagBindingsClient",
+ "TagBindingsAsyncClient",
+ "TagHoldsClient",
+ "TagHoldsAsyncClient",
+ "TagKeysClient",
+ "TagKeysAsyncClient",
+ "TagValuesClient",
+ "TagValuesAsyncClient",
+ "CreateFolderMetadata",
+ "CreateFolderRequest",
+ "DeleteFolderMetadata",
+ "DeleteFolderRequest",
+ "Folder",
+ "GetFolderRequest",
+ "ListFoldersRequest",
+ "ListFoldersResponse",
+ "MoveFolderMetadata",
+ "MoveFolderRequest",
+ "SearchFoldersRequest",
+ "SearchFoldersResponse",
+ "UndeleteFolderMetadata",
+ "UndeleteFolderRequest",
+ "UpdateFolderMetadata",
+ "UpdateFolderRequest",
+ "DeleteOrganizationMetadata",
+ "GetOrganizationRequest",
+ "Organization",
+ "SearchOrganizationsRequest",
+ "SearchOrganizationsResponse",
+ "UndeleteOrganizationMetadata",
+ "CreateProjectMetadata",
+ "CreateProjectRequest",
+ "DeleteProjectMetadata",
+ "DeleteProjectRequest",
+ "GetProjectRequest",
+ "ListProjectsRequest",
+ "ListProjectsResponse",
+ "MoveProjectMetadata",
+ "MoveProjectRequest",
+ "Project",
+ "SearchProjectsRequest",
+ "SearchProjectsResponse",
+ "UndeleteProjectMetadata",
+ "UndeleteProjectRequest",
+ "UpdateProjectMetadata",
+ "UpdateProjectRequest",
+ "CreateTagBindingMetadata",
+ "CreateTagBindingRequest",
+ "DeleteTagBindingMetadata",
+ "DeleteTagBindingRequest",
+ "EffectiveTag",
+ "ListEffectiveTagsRequest",
+ "ListEffectiveTagsResponse",
+ "ListTagBindingsRequest",
+ "ListTagBindingsResponse",
+ "TagBinding",
+ "CreateTagHoldMetadata",
+ "CreateTagHoldRequest",
+ "DeleteTagHoldMetadata",
+ "DeleteTagHoldRequest",
+ "ListTagHoldsRequest",
+ "ListTagHoldsResponse",
+ "TagHold",
+ "CreateTagKeyMetadata",
+ "CreateTagKeyRequest",
+ "DeleteTagKeyMetadata",
+ "DeleteTagKeyRequest",
+ "GetNamespacedTagKeyRequest",
+ "GetTagKeyRequest",
+ "ListTagKeysRequest",
+ "ListTagKeysResponse",
+ "TagKey",
+ "UpdateTagKeyMetadata",
+ "UpdateTagKeyRequest",
+ "Purpose",
+ "CreateTagValueMetadata",
+ "CreateTagValueRequest",
+ "DeleteTagValueMetadata",
+ "DeleteTagValueRequest",
+ "GetNamespacedTagValueRequest",
+ "GetTagValueRequest",
+ "ListTagValuesRequest",
+ "ListTagValuesResponse",
+ "TagValue",
+ "UpdateTagValueMetadata",
+ "UpdateTagValueRequest",
+)
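+The module above follows the common re-export pattern: names are imported from the versioned `resourcemanager_v3` submodules and listed in `__all__`, which controls what `from ... import *` exposes. A minimal, self-contained sketch of that mechanism (the module and names below are invented for illustration):
+
+```python
+import sys
+import types
+
+# Build a throwaway module that re-exports one public name.
+demo = types.ModuleType("demo_pkg")
+demo.FoldersClient = type("FoldersClient", (), {})  # stand-in for the real client
+demo._internal_helper = object()                    # deliberately not re-exported
+demo.__all__ = ("FoldersClient",)
+sys.modules["demo_pkg"] = demo
+
+ns = {}
+exec("from demo_pkg import *", ns)
+print("FoldersClient" in ns, "_internal_helper" in ns)  # True False
+```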
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager/gapic_version.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager/gapic_version.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager/gapic_version.py
@@ -0,0 +1,16 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+__version__ = "1.10.1" # {x-release-please-version}
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/__init__.py
@@ -0,0 +1,214 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+
+__version__ = package_version.__version__
+
+
+from .services.folders import FoldersAsyncClient, FoldersClient
+from .services.organizations import OrganizationsAsyncClient, OrganizationsClient
+from .services.projects import ProjectsAsyncClient, ProjectsClient
+from .services.tag_bindings import TagBindingsAsyncClient, TagBindingsClient
+from .services.tag_holds import TagHoldsAsyncClient, TagHoldsClient
+from .services.tag_keys import TagKeysAsyncClient, TagKeysClient
+from .services.tag_values import TagValuesAsyncClient, TagValuesClient
+from .types.folders import (
+ CreateFolderMetadata,
+ CreateFolderRequest,
+ DeleteFolderMetadata,
+ DeleteFolderRequest,
+ Folder,
+ GetFolderRequest,
+ ListFoldersRequest,
+ ListFoldersResponse,
+ MoveFolderMetadata,
+ MoveFolderRequest,
+ SearchFoldersRequest,
+ SearchFoldersResponse,
+ UndeleteFolderMetadata,
+ UndeleteFolderRequest,
+ UpdateFolderMetadata,
+ UpdateFolderRequest,
+)
+from .types.organizations import (
+ DeleteOrganizationMetadata,
+ GetOrganizationRequest,
+ Organization,
+ SearchOrganizationsRequest,
+ SearchOrganizationsResponse,
+ UndeleteOrganizationMetadata,
+)
+from .types.projects import (
+ CreateProjectMetadata,
+ CreateProjectRequest,
+ DeleteProjectMetadata,
+ DeleteProjectRequest,
+ GetProjectRequest,
+ ListProjectsRequest,
+ ListProjectsResponse,
+ MoveProjectMetadata,
+ MoveProjectRequest,
+ Project,
+ SearchProjectsRequest,
+ SearchProjectsResponse,
+ UndeleteProjectMetadata,
+ UndeleteProjectRequest,
+ UpdateProjectMetadata,
+ UpdateProjectRequest,
+)
+from .types.tag_bindings import (
+ CreateTagBindingMetadata,
+ CreateTagBindingRequest,
+ DeleteTagBindingMetadata,
+ DeleteTagBindingRequest,
+ EffectiveTag,
+ ListEffectiveTagsRequest,
+ ListEffectiveTagsResponse,
+ ListTagBindingsRequest,
+ ListTagBindingsResponse,
+ TagBinding,
+)
+from .types.tag_holds import (
+ CreateTagHoldMetadata,
+ CreateTagHoldRequest,
+ DeleteTagHoldMetadata,
+ DeleteTagHoldRequest,
+ ListTagHoldsRequest,
+ ListTagHoldsResponse,
+ TagHold,
+)
+from .types.tag_keys import (
+ CreateTagKeyMetadata,
+ CreateTagKeyRequest,
+ DeleteTagKeyMetadata,
+ DeleteTagKeyRequest,
+ GetNamespacedTagKeyRequest,
+ GetTagKeyRequest,
+ ListTagKeysRequest,
+ ListTagKeysResponse,
+ Purpose,
+ TagKey,
+ UpdateTagKeyMetadata,
+ UpdateTagKeyRequest,
+)
+from .types.tag_values import (
+ CreateTagValueMetadata,
+ CreateTagValueRequest,
+ DeleteTagValueMetadata,
+ DeleteTagValueRequest,
+ GetNamespacedTagValueRequest,
+ GetTagValueRequest,
+ ListTagValuesRequest,
+ ListTagValuesResponse,
+ TagValue,
+ UpdateTagValueMetadata,
+ UpdateTagValueRequest,
+)
+
+__all__ = (
+ "FoldersAsyncClient",
+ "OrganizationsAsyncClient",
+ "ProjectsAsyncClient",
+ "TagBindingsAsyncClient",
+ "TagHoldsAsyncClient",
+ "TagKeysAsyncClient",
+ "TagValuesAsyncClient",
+ "CreateFolderMetadata",
+ "CreateFolderRequest",
+ "CreateProjectMetadata",
+ "CreateProjectRequest",
+ "CreateTagBindingMetadata",
+ "CreateTagBindingRequest",
+ "CreateTagHoldMetadata",
+ "CreateTagHoldRequest",
+ "CreateTagKeyMetadata",
+ "CreateTagKeyRequest",
+ "CreateTagValueMetadata",
+ "CreateTagValueRequest",
+ "DeleteFolderMetadata",
+ "DeleteFolderRequest",
+ "DeleteOrganizationMetadata",
+ "DeleteProjectMetadata",
+ "DeleteProjectRequest",
+ "DeleteTagBindingMetadata",
+ "DeleteTagBindingRequest",
+ "DeleteTagHoldMetadata",
+ "DeleteTagHoldRequest",
+ "DeleteTagKeyMetadata",
+ "DeleteTagKeyRequest",
+ "DeleteTagValueMetadata",
+ "DeleteTagValueRequest",
+ "EffectiveTag",
+ "Folder",
+ "FoldersClient",
+ "GetFolderRequest",
+ "GetNamespacedTagKeyRequest",
+ "GetNamespacedTagValueRequest",
+ "GetOrganizationRequest",
+ "GetProjectRequest",
+ "GetTagKeyRequest",
+ "GetTagValueRequest",
+ "ListEffectiveTagsRequest",
+ "ListEffectiveTagsResponse",
+ "ListFoldersRequest",
+ "ListFoldersResponse",
+ "ListProjectsRequest",
+ "ListProjectsResponse",
+ "ListTagBindingsRequest",
+ "ListTagBindingsResponse",
+ "ListTagHoldsRequest",
+ "ListTagHoldsResponse",
+ "ListTagKeysRequest",
+ "ListTagKeysResponse",
+ "ListTagValuesRequest",
+ "ListTagValuesResponse",
+ "MoveFolderMetadata",
+ "MoveFolderRequest",
+ "MoveProjectMetadata",
+ "MoveProjectRequest",
+ "Organization",
+ "OrganizationsClient",
+ "Project",
+ "ProjectsClient",
+ "Purpose",
+ "SearchFoldersRequest",
+ "SearchFoldersResponse",
+ "SearchOrganizationsRequest",
+ "SearchOrganizationsResponse",
+ "SearchProjectsRequest",
+ "SearchProjectsResponse",
+ "TagBinding",
+ "TagBindingsClient",
+ "TagHold",
+ "TagHoldsClient",
+ "TagKey",
+ "TagKeysClient",
+ "TagValue",
+ "TagValuesClient",
+ "UndeleteFolderMetadata",
+ "UndeleteFolderRequest",
+ "UndeleteOrganizationMetadata",
+ "UndeleteProjectMetadata",
+ "UndeleteProjectRequest",
+ "UpdateFolderMetadata",
+ "UpdateFolderRequest",
+ "UpdateProjectMetadata",
+ "UpdateProjectRequest",
+ "UpdateTagKeyMetadata",
+ "UpdateTagKeyRequest",
+ "UpdateTagValueMetadata",
+ "UpdateTagValueRequest",
+)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/gapic_version.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/gapic_version.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/gapic_version.py
@@ -0,0 +1,16 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+__version__ = "1.10.1" # {x-release-please-version}
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/__init__.py
@@ -0,0 +1,15 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/__init__.py
@@ -0,0 +1,22 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .async_client import FoldersAsyncClient
+from .client import FoldersClient
+
+__all__ = (
+ "FoldersClient",
+ "FoldersAsyncClient",
+)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/async_client.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/async_client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/async_client.py
@@ -0,0 +1,1842 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import functools
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+)
+
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.api_core.client_options import ClientOptions
+from google.auth import credentials as ga_credentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
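+The `try`/`except AttributeError` above is a version-compatibility guard: on `google-api-core` releases that lack `gapic_v1.method._MethodDefault`, the alias degrades to `object`. The same feature-detection idiom can be sketched without the library (the module here is simulated):
+
+```python
+import types
+
+# Simulate an older module that is missing the attribute.
+old_method_mod = types.ModuleType("method")
+
+try:
+    sentinel_type = old_method_mod._MethodDefault  # present on newer versions
+except AttributeError:  # mirrors the fallback branch above
+    sentinel_type = object
+
+print(sentinel_type is object)  # True
+```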
+from google.api_core import operation # type: ignore
+from google.api_core import operation_async # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.services.folders import pagers
+from google.cloud.resourcemanager_v3.types import folders
+
+from .client import FoldersClient
+from .transports.base import DEFAULT_CLIENT_INFO, FoldersTransport
+from .transports.grpc_asyncio import FoldersGrpcAsyncIOTransport
+
+
+class FoldersAsyncClient:
+ """Manages Cloud Platform folder resources.
+ Folders can be used to organize the resources under an
+ organization and to control the policies applied to groups of
+ resources.
+ """
+
+ _client: FoldersClient
+
+ DEFAULT_ENDPOINT = FoldersClient.DEFAULT_ENDPOINT
+ DEFAULT_MTLS_ENDPOINT = FoldersClient.DEFAULT_MTLS_ENDPOINT
+
+ folder_path = staticmethod(FoldersClient.folder_path)
+ parse_folder_path = staticmethod(FoldersClient.parse_folder_path)
+ common_billing_account_path = staticmethod(
+ FoldersClient.common_billing_account_path
+ )
+ parse_common_billing_account_path = staticmethod(
+ FoldersClient.parse_common_billing_account_path
+ )
+ common_folder_path = staticmethod(FoldersClient.common_folder_path)
+ parse_common_folder_path = staticmethod(FoldersClient.parse_common_folder_path)
+ common_organization_path = staticmethod(FoldersClient.common_organization_path)
+ parse_common_organization_path = staticmethod(
+ FoldersClient.parse_common_organization_path
+ )
+ common_project_path = staticmethod(FoldersClient.common_project_path)
+ parse_common_project_path = staticmethod(FoldersClient.parse_common_project_path)
+ common_location_path = staticmethod(FoldersClient.common_location_path)
+ parse_common_location_path = staticmethod(FoldersClient.parse_common_location_path)
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ FoldersAsyncClient: The constructed client.
+ """
+ return FoldersClient.from_service_account_info.__func__(FoldersAsyncClient, info, *args, **kwargs) # type: ignore
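+`FoldersClient.from_service_account_info.__func__(FoldersAsyncClient, ...)` invokes the classmethod's underlying function with the async class substituted for `cls`, so the sync implementation ends up constructing an async client. A minimal sketch of that rebinding trick (the class names here are invented):
+
+```python
+class SyncClient:
+    @classmethod
+    def from_info(cls, info):
+        # The body only refers to `cls`, so it can construct any class.
+        obj = cls()
+        obj.info = info
+        return obj
+
+class AsyncClient:
+    pass
+
+# Re-invoke the sync classmethod's raw function with AsyncClient as `cls`.
+obj = SyncClient.from_info.__func__(AsyncClient, {"key": "value"})
+print(type(obj).__name__)  # AsyncClient
+```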
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ FoldersAsyncClient: The constructed client.
+ """
+ return FoldersClient.from_service_account_file.__func__(FoldersAsyncClient, filename, *args, **kwargs) # type: ignore
+
+ from_service_account_json = from_service_account_file
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ return FoldersClient.get_mtls_endpoint_and_cert_source(client_options) # type: ignore
+
+ @property
+ def transport(self) -> FoldersTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ FoldersTransport: The transport used by the client instance.
+ """
+ return self._client.transport
+
+ get_transport_class = functools.partial(
+ type(FoldersClient).get_transport_class, type(FoldersClient)
+ )
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Union[str, FoldersTransport] = "grpc_asyncio",
+ client_options: Optional[ClientOptions] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the folders client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, ~.FoldersTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (ClientOptions): Custom options for the client. It
+ won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+
+ Raises:
+            google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ self._client = FoldersClient(
+ credentials=credentials,
+ transport=transport,
+ client_options=client_options,
+ client_info=client_info,
+ )
+
+ async def get_folder(
+ self,
+ request: Optional[Union[folders.GetFolderRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> folders.Folder:
+ r"""Retrieves a folder identified by the supplied resource name.
+ Valid folder resource names have the format
+ ``folders/{folder_id}`` (for example, ``folders/1234``). The
+ caller must have ``resourcemanager.folders.get`` permission on
+ the identified folder.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_get_folder():
+ # Create a client
+ client = resourcemanager_v3.FoldersAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.GetFolderRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_folder(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.GetFolderRequest, dict]]):
+ The request object. The GetFolder request message.
+ name (:class:`str`):
+ Required. The resource name of the folder to retrieve.
+ Must be of the form ``folders/{folder_id}``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.types.Folder:
+ A folder in an organization's
+ resource hierarchy, used to organize
+ that organization's resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = folders.GetFolderRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_folder,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_folders(
+ self,
+ request: Optional[Union[folders.ListFoldersRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListFoldersAsyncPager:
+ r"""Lists the folders that are direct descendants of supplied parent
+ resource. ``list()`` provides a strongly consistent view of the
+ folders underneath the specified parent resource. ``list()``
+ returns folders sorted based upon the (ascending) lexical
+ ordering of their display_name. The caller must have
+ ``resourcemanager.folders.list`` permission on the identified
+ parent.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_list_folders():
+ # Create a client
+ client = resourcemanager_v3.FoldersAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.ListFoldersRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_folders(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.ListFoldersRequest, dict]]):
+ The request object. The ListFolders request message.
+ parent (:class:`str`):
+ Required. The name of the parent resource whose folders
+ are being listed. Only children of this parent resource
+ are listed; descendants are not listed.
+
+ If the parent is a folder, use the value
+ ``folders/{folder_id}``. If the parent is an
+ organization, use the value ``organizations/{org_id}``.
+
+ Access to this method is controlled by checking the
+ ``resourcemanager.folders.list`` permission on the
+ ``parent``.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.folders.pagers.ListFoldersAsyncPager:
+ The ListFolders response message.
+ Iterating over this object will yield
+ results and resolve additional pages
+ automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = folders.ListFoldersRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_folders,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListFoldersAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def search_folders(
+ self,
+ request: Optional[Union[folders.SearchFoldersRequest, dict]] = None,
+ *,
+ query: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.SearchFoldersAsyncPager:
+ r"""Search for folders that match specific filter criteria.
+ ``search()`` provides an eventually consistent view of the
+ folders a user has access to which meet the specified filter
+ criteria.
+
+ This will only return folders on which the caller has the
+ permission ``resourcemanager.folders.get``.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_search_folders():
+ # Create a client
+ client = resourcemanager_v3.FoldersAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.SearchFoldersRequest(
+ )
+
+ # Make the request
+ page_result = client.search_folders(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.SearchFoldersRequest, dict]]):
+ The request object. The request message for searching
+ folders.
+ query (:class:`str`):
+ Optional. Search criteria used to select the folders to
+ return. If no search criteria is specified then all
+ accessible folders will be returned.
+
+ Query expressions can be used to restrict results based
+ upon displayName, state and parent, where the operators
+                    ``=`` (``:``), ``NOT``, ``AND`` and ``OR`` can be used
+ along with the suffix wildcard symbol ``*``.
+
+ The ``displayName`` field in a query expression should
+ use escaped quotes for values that include whitespace to
+ prevent unexpected behavior.
+
+ ::
+
+ | Field | Description |
+ |-------------------------|----------------------------------------|
+ | displayName | Filters by displayName. |
+ | parent | Filters by parent (for example: folders/123). |
+ | state, lifecycleState | Filters by state. |
+
+ Some example queries are:
+
+ - Query ``displayName=Test*`` returns Folder resources
+ whose display name starts with "Test".
+ - Query ``state=ACTIVE`` returns Folder resources with
+ ``state`` set to ``ACTIVE``.
+ - Query ``parent=folders/123`` returns Folder resources
+ that have ``folders/123`` as a parent resource.
+ - Query ``parent=folders/123 AND state=ACTIVE`` returns
+ active Folder resources that have ``folders/123`` as
+ a parent resource.
+ - Query ``displayName=\\"Test String\\"`` returns
+ Folder resources with display names that include both
+ "Test" and "String".
+
+ This corresponds to the ``query`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.folders.pagers.SearchFoldersAsyncPager:
+ The response message for searching
+ folders.
+ Iterating over this object will yield
+ results and resolve additional pages
+ automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([query])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = folders.SearchFoldersRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if query is not None:
+ request.query = query
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.search_folders,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.SearchFoldersAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def create_folder(
+ self,
+ request: Optional[Union[folders.CreateFolderRequest, dict]] = None,
+ *,
+ folder: Optional[folders.Folder] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Creates a folder in the resource hierarchy. Returns an
+ ``Operation`` which can be used to track the progress of the
+ folder creation workflow. Upon success, the
+ ``Operation.response`` field will be populated with the created
+ Folder.
+
+ In order to succeed, the addition of this new folder must not
+ violate the folder naming, height, or fanout constraints.
+
+ - The folder's ``display_name`` must be distinct from all other
+ folders that share its parent.
+ - The addition of the folder must not cause the active folder
+ hierarchy to exceed a height of 10. Note, the full active +
+ deleted folder hierarchy is allowed to reach a height of 20;
+ this provides additional headroom when moving folders that
+ contain deleted folders.
+ - The addition of the folder must not cause the total number of
+ folders under its parent to exceed 300.
+
+ If the operation fails due to a folder constraint violation,
+ some errors may be returned by the ``CreateFolder`` request,
+ with status code ``FAILED_PRECONDITION`` and an error
+ description. Other folder constraint violations will be
+ communicated in the ``Operation``, with the specific
+ ``PreconditionFailure`` returned in the details list in the
+ ``Operation.error`` field.
+
+ The caller must have ``resourcemanager.folders.create``
+ permission on the identified parent.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_create_folder():
+ # Create a client
+ client = resourcemanager_v3.FoldersAsyncClient()
+
+ # Initialize request argument(s)
+ folder = resourcemanager_v3.Folder()
+ folder.parent = "parent_value"
+
+ request = resourcemanager_v3.CreateFolderRequest(
+ folder=folder,
+ )
+
+ # Make the request
+ operation = client.create_folder(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.CreateFolderRequest, dict]]):
+ The request object. The CreateFolder request message.
+ folder (:class:`google.cloud.resourcemanager_v3.types.Folder`):
+ Required. The folder being created,
+ only the display name and parent will be
+ consulted. All other fields will be
+ ignored.
+
+ This corresponds to the ``folder`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Folder` A folder in an organization's resource hierarchy, used to
+ organize that organization's resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([folder])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = folders.CreateFolderRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if folder is not None:
+ request.folder = folder
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.create_folder,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ folders.Folder,
+ metadata_type=folders.CreateFolderMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def update_folder(
+ self,
+ request: Optional[Union[folders.UpdateFolderRequest, dict]] = None,
+ *,
+ folder: Optional[folders.Folder] = None,
+ update_mask: Optional[field_mask_pb2.FieldMask] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Updates a folder, changing its ``display_name``. Changes to the
+ folder ``display_name`` will be rejected if they violate either
+ the ``display_name`` formatting rules or the naming constraints
+ described in the
+ [CreateFolder][google.cloud.resourcemanager.v3.Folders.CreateFolder]
+ documentation.
+
+ The folder's ``display_name`` must start and end with a letter
+ or digit, may contain letters, digits, spaces, hyphens and
+ underscores and can be between 3 and 30 characters. This is
+ captured by the regular expression:
+ ``[\p{L}\p{N}][\p{L}\p{N}_- ]{1,28}[\p{L}\p{N}]``. The caller
+ must have ``resourcemanager.folders.update`` permission on the
+ identified folder.
+
+ If the update fails due to the unique name constraint then a
+ ``PreconditionFailure`` explaining this violation will be
+ returned in the Status.details field.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_update_folder():
+ # Create a client
+ client = resourcemanager_v3.FoldersAsyncClient()
+
+ # Initialize request argument(s)
+ folder = resourcemanager_v3.Folder()
+ folder.parent = "parent_value"
+
+ request = resourcemanager_v3.UpdateFolderRequest(
+ folder=folder,
+ )
+
+ # Make the request
+ operation = client.update_folder(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.UpdateFolderRequest, dict]]):
+ The request object. The request sent to the
+                    [UpdateFolder][google.cloud.resourcemanager.v3.Folders.UpdateFolder]
+ method.
+
+ Only the ``display_name`` field can be changed. All
+ other fields will be ignored. Use the
+ [MoveFolder][google.cloud.resourcemanager.v3.Folders.MoveFolder]
+ method to change the ``parent`` field.
+ folder (:class:`google.cloud.resourcemanager_v3.types.Folder`):
+ Required. The new definition of the Folder. It must
+ include the ``name`` field, which cannot be changed.
+
+ This corresponds to the ``folder`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`):
+ Required. Fields to be updated. Only the
+ ``display_name`` can be updated.
+
+ This corresponds to the ``update_mask`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Folder` A folder in an organization's resource hierarchy, used to
+ organize that organization's resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([folder, update_mask])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = folders.UpdateFolderRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if folder is not None:
+ request.folder = folder
+ if update_mask is not None:
+ request.update_mask = update_mask
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.update_folder,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("folder.name", request.folder.name),)
+ ),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ folders.Folder,
+ metadata_type=folders.UpdateFolderMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def move_folder(
+ self,
+ request: Optional[Union[folders.MoveFolderRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ destination_parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Moves a folder under a new resource parent. Returns an
+ ``Operation`` which can be used to track the progress of the
+ folder move workflow. Upon success, the ``Operation.response``
+ field will be populated with the moved folder. Upon failure, a
+ ``FolderOperationError`` categorizing the failure cause will be
+ returned - if the failure occurs synchronously then the
+ ``FolderOperationError`` will be returned in the
+ ``Status.details`` field. If it occurs asynchronously, then the
+ FolderOperation will be returned in the ``Operation.error``
+ field. In addition, the ``Operation.metadata`` field will be
+ populated with a ``FolderOperation`` message as an aid to
+ stateless clients. Folder moves will be rejected if they violate
+ either the naming, height, or fanout constraints described in
+ the
+ [CreateFolder][google.cloud.resourcemanager.v3.Folders.CreateFolder]
+ documentation. The caller must have
+ ``resourcemanager.folders.move`` permission on the folder's
+ current and proposed new parent.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_move_folder():
+ # Create a client
+ client = resourcemanager_v3.FoldersAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.MoveFolderRequest(
+ name="name_value",
+ destination_parent="destination_parent_value",
+ )
+
+ # Make the request
+ operation = client.move_folder(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.MoveFolderRequest, dict]]):
+ The request object. The MoveFolder request message.
+ name (:class:`str`):
+ Required. The resource name of the Folder to move. Must
+                    be of the form ``folders/{folder_id}``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ destination_parent (:class:`str`):
+ Required. The resource name of the folder or
+ organization which should be the folder's new parent.
+ Must be of the form ``folders/{folder_id}`` or
+ ``organizations/{org_id}``.
+
+ This corresponds to the ``destination_parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Folder` A folder in an organization's resource hierarchy, used to
+ organize that organization's resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, destination_parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = folders.MoveFolderRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if destination_parent is not None:
+ request.destination_parent = destination_parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.move_folder,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ folders.Folder,
+ metadata_type=folders.MoveFolderMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def delete_folder(
+ self,
+ request: Optional[Union[folders.DeleteFolderRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Requests deletion of a folder. The folder is moved into the
+ [DELETE_REQUESTED][google.cloud.resourcemanager.v3.Folder.State.DELETE_REQUESTED]
+ state immediately, and is deleted approximately 30 days later.
+ This method may only be called on an empty folder, where a
+ folder is empty if it doesn't contain any folders or projects in
+ the
+ [ACTIVE][google.cloud.resourcemanager.v3.Folder.State.ACTIVE]
+ state. If called on a folder in
+ [DELETE_REQUESTED][google.cloud.resourcemanager.v3.Folder.State.DELETE_REQUESTED]
+ state the operation will result in a no-op success. The caller
+ must have ``resourcemanager.folders.delete`` permission on the
+ identified folder.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_delete_folder():
+ # Create a client
+ client = resourcemanager_v3.FoldersAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.DeleteFolderRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.delete_folder(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.DeleteFolderRequest, dict]]):
+ The request object. The DeleteFolder request message.
+ name (:class:`str`):
+ Required. The resource name of the folder to be deleted.
+ Must be of the form ``folders/{folder_id}``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+                The result type for the operation will be
+                :class:`google.cloud.resourcemanager_v3.types.Folder`, a folder in an
+                organization's resource hierarchy, used to organize that
+                organization's resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = folders.DeleteFolderRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.delete_folder,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ folders.Folder,
+ metadata_type=folders.DeleteFolderMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def undelete_folder(
+ self,
+ request: Optional[Union[folders.UndeleteFolderRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Cancels the deletion request for a folder. This method may be
+ called on a folder in any state. If the folder is in the
+ [ACTIVE][google.cloud.resourcemanager.v3.Folder.State.ACTIVE]
+ state the result will be a no-op success. In order to succeed,
+ the folder's parent must be in the
+ [ACTIVE][google.cloud.resourcemanager.v3.Folder.State.ACTIVE]
+ state. In addition, reintroducing the folder into the tree must
+ not violate folder naming, height, and fanout constraints
+ described in the
+ [CreateFolder][google.cloud.resourcemanager.v3.Folders.CreateFolder]
+ documentation. The caller must have
+ ``resourcemanager.folders.undelete`` permission on the
+ identified folder.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_undelete_folder():
+ # Create a client
+ client = resourcemanager_v3.FoldersAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.UndeleteFolderRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.undelete_folder(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.UndeleteFolderRequest, dict]]):
+ The request object. The UndeleteFolder request message.
+ name (:class:`str`):
+ Required. The resource name of the folder to undelete.
+ Must be of the form ``folders/{folder_id}``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+                The result type for the operation will be
+                :class:`google.cloud.resourcemanager_v3.types.Folder`, a folder in an
+                organization's resource hierarchy, used to organize that
+                organization's resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = folders.UndeleteFolderRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.undelete_folder,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ folders.Folder,
+ metadata_type=folders.UndeleteFolderMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.GetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Gets the access control policy for a folder. The returned policy
+ may be empty if no such policy or resource exists. The
+ ``resource`` field should be the folder's resource name, for
+ example: "folders/1234". The caller must have
+ ``resourcemanager.folders.getIamPolicy`` permission on the
+ identified folder.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ async def sample_get_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.FoldersAsyncClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.GetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = await client.get_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.iam.v1.iam_policy_pb2.GetIamPolicyRequest, dict]]):
+ The request object. Request message for ``GetIamPolicy`` method.
+ resource (:class:`str`):
+ REQUIRED: The resource for which the
+ policy is being requested. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+                **JSON example:**
+
+                ::
+
+                    {
+                      "bindings": [
+                        {
+                          "role": "roles/resourcemanager.organizationAdmin",
+                          "members": [
+                            "user:mike@example.com",
+                            "group:admins@example.com",
+                            "domain:google.com",
+                            "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+                          ]
+                        },
+                        {
+                          "role": "roles/resourcemanager.organizationViewer",
+                          "members": ["user:eve@example.com"],
+                          "condition": {
+                            "title": "expirable access",
+                            "description": "Does not grant access after Sep 2020",
+                            "expression": "request.time < timestamp('2020-10-01T00:00:00.000Z')"
+                          }
+                        }
+                      ],
+                      "etag": "BwWWja0YfJA=",
+                      "version": 3
+                    }
+
+                **YAML example:**
+
+                ::
+
+                    bindings:
+                    - members:
+                      - user:mike@example.com
+                      - group:admins@example.com
+                      - domain:google.com
+                      - serviceAccount:my-project-id@appspot.gserviceaccount.com
+                      role: roles/resourcemanager.organizationAdmin
+                    - members:
+                      - user:eve@example.com
+                      role: roles/resourcemanager.organizationViewer
+                      condition:
+                        title: expirable access
+                        description: Does not grant access after Sep 2020
+                        expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+                    etag: BwWWja0YfJA=
+                    version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = iam_policy_pb2.GetIamPolicyRequest(**request)
+ elif not request:
+ request = iam_policy_pb2.GetIamPolicyRequest(
+ resource=resource,
+ )
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_iam_policy,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def set_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.SetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Sets the access control policy on a folder, replacing any
+ existing policy. The ``resource`` field should be the folder's
+ resource name, for example: "folders/1234". The caller must have
+ ``resourcemanager.folders.setIamPolicy`` permission on the
+ identified folder.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ async def sample_set_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.FoldersAsyncClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.SetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = await client.set_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.iam.v1.iam_policy_pb2.SetIamPolicyRequest, dict]]):
+ The request object. Request message for ``SetIamPolicy`` method.
+ resource (:class:`str`):
+ REQUIRED: The resource for which the
+ policy is being specified. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+                **JSON example:**
+
+                ::
+
+                    {
+                      "bindings": [
+                        {
+                          "role": "roles/resourcemanager.organizationAdmin",
+                          "members": [
+                            "user:mike@example.com",
+                            "group:admins@example.com",
+                            "domain:google.com",
+                            "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+                          ]
+                        },
+                        {
+                          "role": "roles/resourcemanager.organizationViewer",
+                          "members": ["user:eve@example.com"],
+                          "condition": {
+                            "title": "expirable access",
+                            "description": "Does not grant access after Sep 2020",
+                            "expression": "request.time < timestamp('2020-10-01T00:00:00.000Z')"
+                          }
+                        }
+                      ],
+                      "etag": "BwWWja0YfJA=",
+                      "version": 3
+                    }
+
+                **YAML example:**
+
+                ::
+
+                    bindings:
+                    - members:
+                      - user:mike@example.com
+                      - group:admins@example.com
+                      - domain:google.com
+                      - serviceAccount:my-project-id@appspot.gserviceaccount.com
+                      role: roles/resourcemanager.organizationAdmin
+                    - members:
+                      - user:eve@example.com
+                      role: roles/resourcemanager.organizationViewer
+                      condition:
+                        title: expirable access
+                        description: Does not grant access after Sep 2020
+                        expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+                    etag: BwWWja0YfJA=
+                    version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = iam_policy_pb2.SetIamPolicyRequest(**request)
+ elif not request:
+ request = iam_policy_pb2.SetIamPolicyRequest(
+ resource=resource,
+ )
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.set_iam_policy,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def test_iam_permissions(
+ self,
+ request: Optional[Union[iam_policy_pb2.TestIamPermissionsRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ permissions: Optional[MutableSequence[str]] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ r"""Returns permissions that a caller has on the specified folder.
+ The ``resource`` field should be the folder's resource name, for
+ example: "folders/1234".
+
+ There are no permissions required for making this API call.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ async def sample_test_iam_permissions():
+ # Create a client
+ client = resourcemanager_v3.FoldersAsyncClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.TestIamPermissionsRequest(
+ resource="resource_value",
+ permissions=['permissions_value1', 'permissions_value2'],
+ )
+
+ # Make the request
+ response = await client.test_iam_permissions(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.iam.v1.iam_policy_pb2.TestIamPermissionsRequest, dict]]):
+ The request object. Request message for ``TestIamPermissions`` method.
+ resource (:class:`str`):
+ REQUIRED: The resource for which the
+ policy detail is being requested. See
+ the operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ permissions (:class:`MutableSequence[str]`):
+ The set of permissions to check for the ``resource``.
+ Permissions with wildcards (such as '*' or 'storage.*')
+ are not allowed. For more information see `IAM
+ Overview <https://cloud.google.com/iam/docs/overview#permissions>`__.
+
+ This corresponds to the ``permissions`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.iam_policy_pb2.TestIamPermissionsResponse:
+ Response message for TestIamPermissions method.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource, permissions])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = iam_policy_pb2.TestIamPermissionsRequest(**request)
+ elif not request:
+ request = iam_policy_pb2.TestIamPermissionsRequest(
+ resource=resource,
+ permissions=permissions,
+ )
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.test_iam_permissions,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_operation(
+ self,
+ request: Optional[operations_pb2.GetOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Gets the latest state of a long-running operation.
+
+ Args:
+ request (:class:`~.operations_pb2.GetOperationRequest`):
+ The request object. Request message for
+ `GetOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.Operation:
+ An ``Operation`` object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.GetOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._client._transport.get_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def __aenter__(self):
+ return self
+
+ async def __aexit__(self, exc_type, exc, tb):
+ await self.transport.close()
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("FoldersAsyncClient",)
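The synchronous `FoldersClient` added in the next file exposes resource-path helpers (`folder_path` / `parse_folder_path`) whose round-trip is plain string templating plus a lazy regex. A minimal standalone sketch of that logic (reimplemented outside the client for illustration, not importing the generated package):

```python
import re


def folder_path(folder: str) -> str:
    # Mirrors FoldersClient.folder_path: formats a fully-qualified resource name.
    return "folders/{folder}".format(folder=folder)


def parse_folder_path(path: str) -> dict:
    # Mirrors FoldersClient.parse_folder_path: extracts the segment,
    # returning {} when the path does not match the template.
    m = re.match(r"^folders/(?P<folder>.+?)$", path)
    return m.groupdict() if m else {}


# Round-trip: format then parse recovers the original segment.
assert parse_folder_path(folder_path("1234")) == {"folder": "1234"}
```

The same format/parse pair is generated for every resource template (billing accounts, organizations, projects), so the round-trip property shown here holds for each helper pair below.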
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/client.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/client.py
@@ -0,0 +1,2042 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import os
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+ cast,
+)
+
+from google.api_core import client_options as client_options_lib
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.exceptions import MutualTLSChannelError # type: ignore
+from google.auth.transport import mtls # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.api_core import operation # type: ignore
+from google.api_core import operation_async # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.services.folders import pagers
+from google.cloud.resourcemanager_v3.types import folders
+
+from .transports.base import DEFAULT_CLIENT_INFO, FoldersTransport
+from .transports.grpc import FoldersGrpcTransport
+from .transports.grpc_asyncio import FoldersGrpcAsyncIOTransport
+from .transports.rest import FoldersRestTransport
+
+
+class FoldersClientMeta(type):
+ """Metaclass for the Folders client.
+
+ This provides class-level methods for building and retrieving
+ support objects (e.g. transport) without polluting the client instance
+ objects.
+ """
+
+ _transport_registry = OrderedDict() # type: Dict[str, Type[FoldersTransport]]
+ _transport_registry["grpc"] = FoldersGrpcTransport
+ _transport_registry["grpc_asyncio"] = FoldersGrpcAsyncIOTransport
+ _transport_registry["rest"] = FoldersRestTransport
+
+ def get_transport_class(
+ cls,
+ label: Optional[str] = None,
+ ) -> Type[FoldersTransport]:
+ """Returns an appropriate transport class.
+
+ Args:
+ label: The name of the desired transport. If none is
+ provided, then the first transport in the registry is used.
+
+ Returns:
+ The transport class to use.
+ """
+ # If a specific transport is requested, return that one.
+ if label:
+ return cls._transport_registry[label]
+
+ # No transport is requested; return the default (that is, the first one
+ # in the dictionary).
+ return next(iter(cls._transport_registry.values()))
+
+
+class FoldersClient(metaclass=FoldersClientMeta):
+ """Manages Cloud Platform folder resources.
+ Folders can be used to organize the resources under an
+ organization and to control the policies applied to groups of
+ resources.
+ """
+
+ @staticmethod
+ def _get_default_mtls_endpoint(api_endpoint):
+ """Converts api endpoint to mTLS endpoint.
+
+ Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to
+ "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively.
+ Args:
+ api_endpoint (Optional[str]): the api endpoint to convert.
+ Returns:
+ str: converted mTLS api endpoint.
+ """
+ if not api_endpoint:
+ return api_endpoint
+
+ mtls_endpoint_re = re.compile(
+ r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?"
+ )
+
+ m = mtls_endpoint_re.match(api_endpoint)
+ name, mtls, sandbox, googledomain = m.groups()
+ if mtls or not googledomain:
+ return api_endpoint
+
+ if sandbox:
+ return api_endpoint.replace(
+ "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
+ )
+
+ return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")
+
+ DEFAULT_ENDPOINT = "cloudresourcemanager.googleapis.com"
+ DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__( # type: ignore
+ DEFAULT_ENDPOINT
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ FoldersClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_info(info)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ FoldersClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_file(filename)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ from_service_account_json = from_service_account_file
+
+ @property
+ def transport(self) -> FoldersTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ FoldersTransport: The transport used by the client
+ instance.
+ """
+ return self._transport
+
+ @staticmethod
+ def folder_path(
+ folder: str,
+ ) -> str:
+ """Returns a fully-qualified folder string."""
+ return "folders/{folder}".format(
+ folder=folder,
+ )
+
+ @staticmethod
+ def parse_folder_path(path: str) -> Dict[str, str]:
+ """Parses a folder path into its component segments."""
+ m = re.match(r"^folders/(?P<folder>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_billing_account_path(
+ billing_account: str,
+ ) -> str:
+ """Returns a fully-qualified billing_account string."""
+ return "billingAccounts/{billing_account}".format(
+ billing_account=billing_account,
+ )
+
+ @staticmethod
+ def parse_common_billing_account_path(path: str) -> Dict[str, str]:
+ """Parse a billing_account path into its component segments."""
+ m = re.match(r"^billingAccounts/(?P<billing_account>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_folder_path(
+ folder: str,
+ ) -> str:
+ """Returns a fully-qualified folder string."""
+ return "folders/{folder}".format(
+ folder=folder,
+ )
+
+ @staticmethod
+ def parse_common_folder_path(path: str) -> Dict[str, str]:
+ """Parse a folder path into its component segments."""
+ m = re.match(r"^folders/(?P<folder>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_organization_path(
+ organization: str,
+ ) -> str:
+ """Returns a fully-qualified organization string."""
+ return "organizations/{organization}".format(
+ organization=organization,
+ )
+
+ @staticmethod
+ def parse_common_organization_path(path: str) -> Dict[str, str]:
+ """Parse a organization path into its component segments."""
+ m = re.match(r"^organizations/(?P<organization>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_project_path(
+ project: str,
+ ) -> str:
+ """Returns a fully-qualified project string."""
+ return "projects/{project}".format(
+ project=project,
+ )
+
+ @staticmethod
+ def parse_common_project_path(path: str) -> Dict[str, str]:
+ """Parse a project path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_location_path(
+ project: str,
+ location: str,
+ ) -> str:
+ """Returns a fully-qualified location string."""
+ return "projects/{project}/locations/{location}".format(
+ project=project,
+ location=location,
+ )
+
+ @staticmethod
+ def parse_common_location_path(path: str) -> Dict[str, str]:
+ """Parse a location path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)$", path)
+ return m.groupdict() if m else {}
+
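The path helpers above all follow one build/parse round-trip pattern: `format()` assembles the resource name, and a non-greedy named-group regex recovers the segments (returning `{}` on mismatch). A minimal standalone sketch, using the `common_location_path` pair as the example:

```python
import re

def common_location_path(project, location):
    # Build a "projects/{project}/locations/{location}" resource name.
    return "projects/{project}/locations/{location}".format(
        project=project, location=location
    )

def parse_common_location_path(path):
    # Recover the components; an unrelated path yields an empty dict.
    m = re.match(r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)$", path)
    return m.groupdict() if m else {}

path = common_location_path("my-proj", "us-central1")
print(path)                                # projects/my-proj/locations/us-central1
print(parse_common_location_path(path))    # {'project': 'my-proj', 'location': 'us-central1'}
```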
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[client_options_lib.ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ use_client_cert = os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false")
+ use_mtls_endpoint = os.getenv("GOOGLE_API_USE_MTLS_ENDPOINT", "auto")
+ if use_client_cert not in ("true", "false"):
+ raise ValueError(
+ "Environment variable `GOOGLE_API_USE_CLIENT_CERTIFICATE` must be either `true` or `false`"
+ )
+ if use_mtls_endpoint not in ("auto", "never", "always"):
+ raise MutualTLSChannelError(
+ "Environment variable `GOOGLE_API_USE_MTLS_ENDPOINT` must be `never`, `auto` or `always`"
+ )
+
+ # Figure out the client cert source to use.
+ client_cert_source = None
+ if use_client_cert == "true":
+ if client_options.client_cert_source:
+ client_cert_source = client_options.client_cert_source
+ elif mtls.has_default_client_cert_source():
+ client_cert_source = mtls.default_client_cert_source()
+
+ # Figure out which api endpoint to use.
+ if client_options.api_endpoint is not None:
+ api_endpoint = client_options.api_endpoint
+ elif use_mtls_endpoint == "always" or (
+ use_mtls_endpoint == "auto" and client_cert_source
+ ):
+ api_endpoint = cls.DEFAULT_MTLS_ENDPOINT
+ else:
+ api_endpoint = cls.DEFAULT_ENDPOINT
+
+ return api_endpoint, client_cert_source
+
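The endpoint precedence documented above (explicit `api_endpoint` first, then the `GOOGLE_API_USE_MTLS_ENDPOINT` setting, then client-cert availability) can be distilled into a standalone sketch. `choose_endpoint` is a hypothetical name, and the cert-source lookup is reduced to a boolean for illustration:

```python
import os

DEFAULT_ENDPOINT = "cloudresourcemanager.googleapis.com"
DEFAULT_MTLS_ENDPOINT = "cloudresourcemanager.mtls.googleapis.com"

def choose_endpoint(api_endpoint=None, have_client_cert=False):
    # Mirrors the decision order in get_mtls_endpoint_and_cert_source:
    # an explicit endpoint always wins; otherwise "always" forces mTLS,
    # "never" forces the regular endpoint, and "auto" (the default)
    # switches to mTLS only when a client certificate is available.
    use_mtls = os.getenv("GOOGLE_API_USE_MTLS_ENDPOINT", "auto")
    if use_mtls not in ("auto", "never", "always"):
        raise ValueError(
            "GOOGLE_API_USE_MTLS_ENDPOINT must be `never`, `auto` or `always`"
        )
    if api_endpoint is not None:
        return api_endpoint
    if use_mtls == "always" or (use_mtls == "auto" and have_client_cert):
        return DEFAULT_MTLS_ENDPOINT
    return DEFAULT_ENDPOINT
```

A usage example: with the environment variable unset and no certificate, the plain endpoint is returned; passing `api_endpoint="my-proxy:443"` overrides everything else.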
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Optional[Union[str, FoldersTransport]] = None,
+ client_options: Optional[Union[client_options_lib.ClientOptions, dict]] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the folders client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, FoldersTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (Optional[Union[google.api_core.client_options.ClientOptions, dict]]): Custom options for the
+ client. It won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ if isinstance(client_options, dict):
+ client_options = client_options_lib.from_dict(client_options)
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ client_options = cast(client_options_lib.ClientOptions, client_options)
+
+ api_endpoint, client_cert_source_func = self.get_mtls_endpoint_and_cert_source(
+ client_options
+ )
+
+ api_key_value = getattr(client_options, "api_key", None)
+ if api_key_value and credentials:
+ raise ValueError(
+ "client_options.api_key and credentials are mutually exclusive"
+ )
+
+ # Save or instantiate the transport.
+ # Ordinarily, we provide the transport, but allowing a custom transport
+ # instance provides an extensibility point for unusual situations.
+ if isinstance(transport, FoldersTransport):
+ # transport is a FoldersTransport instance.
+ if credentials or client_options.credentials_file or api_key_value:
+ raise ValueError(
+ "When providing a transport instance, "
+ "provide its credentials directly."
+ )
+ if client_options.scopes:
+ raise ValueError(
+ "When providing a transport instance, provide its scopes "
+ "directly."
+ )
+ self._transport = transport
+ else:
+ import google.auth._default # type: ignore
+
+ if api_key_value and hasattr(
+ google.auth._default, "get_api_key_credentials"
+ ):
+ credentials = google.auth._default.get_api_key_credentials(
+ api_key_value
+ )
+
+ Transport = type(self).get_transport_class(transport)
+ self._transport = Transport(
+ credentials=credentials,
+ credentials_file=client_options.credentials_file,
+ host=api_endpoint,
+ scopes=client_options.scopes,
+ client_cert_source_for_mtls=client_cert_source_func,
+ quota_project_id=client_options.quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=True,
+ api_audience=client_options.api_audience,
+ )
+
+ def get_folder(
+ self,
+ request: Optional[Union[folders.GetFolderRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> folders.Folder:
+ r"""Retrieves a folder identified by the supplied resource name.
+ Valid folder resource names have the format
+ ``folders/{folder_id}`` (for example, ``folders/1234``). The
+ caller must have ``resourcemanager.folders.get`` permission on
+ the identified folder.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_get_folder():
+ # Create a client
+ client = resourcemanager_v3.FoldersClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.GetFolderRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_folder(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.GetFolderRequest, dict]):
+ The request object. The GetFolder request message.
+ name (str):
+ Required. The resource name of the folder to retrieve.
+ Must be of the form ``folders/{folder_id}``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.types.Folder:
+ A folder in an organization's
+ resource hierarchy, used to organize
+ that organization's resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a folders.GetFolderRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, folders.GetFolderRequest):
+ request = folders.GetFolderRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_folder]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
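Every RPC method in this client repeats the same guard seen in `get_folder` above: a full `request` object and individual flattened field arguments are mutually exclusive. A standalone sketch of that pattern (the helper name `check_flattened` is hypothetical; the error message matches the generated code):

```python
def check_flattened(request, **field_args):
    # Mirrors the generated guard: truthy flattened fields may not be
    # combined with an explicit request object.
    has_flattened = any(field_args.values())
    if request is not None and has_flattened:
        raise ValueError(
            "If the `request` argument is set, then none of "
            "the individual field arguments should be set."
        )

check_flattened(None, name="folders/1234")   # ok: flattened field only
check_flattened({"name": "folders/1234"})    # ok: request object only
```

Combining both styles, e.g. `client.get_folder(request=req, name="folders/1234")`, raises `ValueError` rather than silently preferring one source.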
+ def list_folders(
+ self,
+ request: Optional[Union[folders.ListFoldersRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListFoldersPager:
+ r"""Lists the folders that are direct descendants of supplied parent
+ resource. ``list()`` provides a strongly consistent view of the
+ folders underneath the specified parent resource. ``list()``
+ returns folders sorted based upon the (ascending) lexical
+ ordering of their display_name. The caller must have
+ ``resourcemanager.folders.list`` permission on the identified
+ parent.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_list_folders():
+ # Create a client
+ client = resourcemanager_v3.FoldersClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.ListFoldersRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_folders(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.ListFoldersRequest, dict]):
+ The request object. The ListFolders request message.
+ parent (str):
+ Required. The name of the parent resource whose folders
+ are being listed. Only children of this parent resource
+ are listed; descendants are not listed.
+
+ If the parent is a folder, use the value
+ ``folders/{folder_id}``. If the parent is an
+ organization, use the value ``organizations/{org_id}``.
+
+ Access to this method is controlled by checking the
+ ``resourcemanager.folders.list`` permission on the
+ ``parent``.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.folders.pagers.ListFoldersPager:
+ The ListFolders response message.
+ Iterating over this object will yield
+ results and resolve additional pages
+ automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a folders.ListFoldersRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, folders.ListFoldersRequest):
+ request = folders.ListFoldersRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_folders]
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListFoldersPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def search_folders(
+ self,
+ request: Optional[Union[folders.SearchFoldersRequest, dict]] = None,
+ *,
+ query: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.SearchFoldersPager:
+ r"""Search for folders that match specific filter criteria.
+ ``search()`` provides an eventually consistent view of the
+ folders a user has access to which meet the specified filter
+ criteria.
+
+ This will only return folders on which the caller has the
+ permission ``resourcemanager.folders.get``.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_search_folders():
+ # Create a client
+ client = resourcemanager_v3.FoldersClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.SearchFoldersRequest(
+ )
+
+ # Make the request
+ page_result = client.search_folders(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.SearchFoldersRequest, dict]):
+ The request object. The request message for searching
+ folders.
+ query (str):
+ Optional. Search criteria used to select the folders to
+                return. If no search criteria are specified, then all
+                accessible folders will be returned.
+
+ Query expressions can be used to restrict results based
+ upon displayName, state and parent, where the operators
+                ``=`` (``:``), ``NOT``, ``AND`` and ``OR`` can be used
+ along with the suffix wildcard symbol ``*``.
+
+ The ``displayName`` field in a query expression should
+ use escaped quotes for values that include whitespace to
+ prevent unexpected behavior.
+
+ ::
+
+ | Field | Description |
+ |-------------------------|----------------------------------------|
+ | displayName | Filters by displayName. |
+ | parent | Filters by parent (for example: folders/123). |
+ | state, lifecycleState | Filters by state. |
+
+ Some example queries are:
+
+ - Query ``displayName=Test*`` returns Folder resources
+ whose display name starts with "Test".
+ - Query ``state=ACTIVE`` returns Folder resources with
+ ``state`` set to ``ACTIVE``.
+ - Query ``parent=folders/123`` returns Folder resources
+ that have ``folders/123`` as a parent resource.
+ - Query ``parent=folders/123 AND state=ACTIVE`` returns
+ active Folder resources that have ``folders/123`` as
+ a parent resource.
+ - Query ``displayName=\\"Test String\\"`` returns
+ Folder resources with display names that include both
+ "Test" and "String".
+
+ This corresponds to the ``query`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.folders.pagers.SearchFoldersPager:
+ The response message for searching
+ folders.
+ Iterating over this object will yield
+ results and resolve additional pages
+ automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([query])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a folders.SearchFoldersRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, folders.SearchFoldersRequest):
+ request = folders.SearchFoldersRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if query is not None:
+ request.query = query
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.search_folders]
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.SearchFoldersPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def create_folder(
+ self,
+ request: Optional[Union[folders.CreateFolderRequest, dict]] = None,
+ *,
+ folder: Optional[folders.Folder] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Creates a folder in the resource hierarchy. Returns an
+ ``Operation`` which can be used to track the progress of the
+ folder creation workflow. Upon success, the
+ ``Operation.response`` field will be populated with the created
+ Folder.
+
+ In order to succeed, the addition of this new folder must not
+ violate the folder naming, height, or fanout constraints.
+
+ - The folder's ``display_name`` must be distinct from all other
+ folders that share its parent.
+ - The addition of the folder must not cause the active folder
+ hierarchy to exceed a height of 10. Note, the full active +
+ deleted folder hierarchy is allowed to reach a height of 20;
+ this provides additional headroom when moving folders that
+ contain deleted folders.
+ - The addition of the folder must not cause the total number of
+ folders under its parent to exceed 300.
+
+ If the operation fails due to a folder constraint violation,
+ some errors may be returned by the ``CreateFolder`` request,
+ with status code ``FAILED_PRECONDITION`` and an error
+ description. Other folder constraint violations will be
+ communicated in the ``Operation``, with the specific
+ ``PreconditionFailure`` returned in the details list in the
+ ``Operation.error`` field.
+
+ The caller must have ``resourcemanager.folders.create``
+ permission on the identified parent.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_create_folder():
+ # Create a client
+ client = resourcemanager_v3.FoldersClient()
+
+ # Initialize request argument(s)
+ folder = resourcemanager_v3.Folder()
+ folder.parent = "parent_value"
+
+ request = resourcemanager_v3.CreateFolderRequest(
+ folder=folder,
+ )
+
+ # Make the request
+ operation = client.create_folder(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.CreateFolderRequest, dict]):
+ The request object. The CreateFolder request message.
+ folder (google.cloud.resourcemanager_v3.types.Folder):
+                Required. The folder being created;
+                only the display name and parent will be
+ consulted. All other fields will be
+ ignored.
+
+ This corresponds to the ``folder`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Folder` A folder in an organization's resource hierarchy, used to
+ organize that organization's resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([folder])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a folders.CreateFolderRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, folders.CreateFolderRequest):
+ request = folders.CreateFolderRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if folder is not None:
+ request.folder = folder
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.create_folder]
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ folders.Folder,
+ metadata_type=folders.CreateFolderMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def update_folder(
+ self,
+ request: Optional[Union[folders.UpdateFolderRequest, dict]] = None,
+ *,
+ folder: Optional[folders.Folder] = None,
+ update_mask: Optional[field_mask_pb2.FieldMask] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Updates a folder, changing its ``display_name``. Changes to the
+ folder ``display_name`` will be rejected if they violate either
+ the ``display_name`` formatting rules or the naming constraints
+ described in the
+ [CreateFolder][google.cloud.resourcemanager.v3.Folders.CreateFolder]
+ documentation.
+
+ The folder's ``display_name`` must start and end with a letter
+ or digit, may contain letters, digits, spaces, hyphens and
+ underscores and can be between 3 and 30 characters. This is
+ captured by the regular expression:
+ ``[\p{L}\p{N}][\p{L}\p{N}_- ]{1,28}[\p{L}\p{N}]``. The caller
+ must have ``resourcemanager.folders.update`` permission on the
+ identified folder.
+
+ If the update fails due to the unique name constraint then a
+ ``PreconditionFailure`` explaining this violation will be
+ returned in the Status.details field.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_update_folder():
+ # Create a client
+ client = resourcemanager_v3.FoldersClient()
+
+ # Initialize request argument(s)
+ folder = resourcemanager_v3.Folder()
+ folder.parent = "parent_value"
+
+ request = resourcemanager_v3.UpdateFolderRequest(
+ folder=folder,
+ )
+
+ # Make the request
+ operation = client.update_folder(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.UpdateFolderRequest, dict]):
+ The request object. The request sent to the
+ [UpdateFolder][google.cloud.resourcemanager.v3.Folder.UpdateFolder]
+ method.
+
+ Only the ``display_name`` field can be changed. All
+ other fields will be ignored. Use the
+ [MoveFolder][google.cloud.resourcemanager.v3.Folders.MoveFolder]
+ method to change the ``parent`` field.
+ folder (google.cloud.resourcemanager_v3.types.Folder):
+ Required. The new definition of the Folder. It must
+ include the ``name`` field, which cannot be changed.
+
+ This corresponds to the ``folder`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Required. Fields to be updated. Only the
+ ``display_name`` can be updated.
+
+ This corresponds to the ``update_mask`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Folder` A folder in an organization's resource hierarchy, used to
+ organize that organization's resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([folder, update_mask])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a folders.UpdateFolderRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, folders.UpdateFolderRequest):
+ request = folders.UpdateFolderRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if folder is not None:
+ request.folder = folder
+ if update_mask is not None:
+ request.update_mask = update_mask
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.update_folder]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("folder.name", request.folder.name),)
+ ),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ folders.Folder,
+ metadata_type=folders.UpdateFolderMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def move_folder(
+ self,
+ request: Optional[Union[folders.MoveFolderRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ destination_parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Moves a folder under a new resource parent. Returns an
+ ``Operation`` which can be used to track the progress of the
+ folder move workflow. Upon success, the ``Operation.response``
+ field will be populated with the moved folder. Upon failure, a
+ ``FolderOperationError`` categorizing the failure cause will be
+ returned - if the failure occurs synchronously then the
+ ``FolderOperationError`` will be returned in the
+ ``Status.details`` field. If it occurs asynchronously, then the
+ FolderOperation will be returned in the ``Operation.error``
+ field. In addition, the ``Operation.metadata`` field will be
+ populated with a ``FolderOperation`` message as an aid to
+ stateless clients. Folder moves will be rejected if they violate
+ either the naming, height, or fanout constraints described in
+ the
+ [CreateFolder][google.cloud.resourcemanager.v3.Folders.CreateFolder]
+ documentation. The caller must have
+ ``resourcemanager.folders.move`` permission on the folder's
+ current and proposed new parent.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_move_folder():
+ # Create a client
+ client = resourcemanager_v3.FoldersClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.MoveFolderRequest(
+ name="name_value",
+ destination_parent="destination_parent_value",
+ )
+
+ # Make the request
+ operation = client.move_folder(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.MoveFolderRequest, dict]):
+ The request object. The MoveFolder request message.
+ name (str):
+ Required. The resource name of the Folder to move. Must
+ be of the form folders/{folder_id}
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ destination_parent (str):
+ Required. The resource name of the folder or
+ organization which should be the folder's new parent.
+ Must be of the form ``folders/{folder_id}`` or
+ ``organizations/{org_id}``.
+
+ This corresponds to the ``destination_parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Folder` A folder in an organization's resource hierarchy, used to
+ organize that organization's resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, destination_parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a folders.MoveFolderRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, folders.MoveFolderRequest):
+ request = folders.MoveFolderRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if destination_parent is not None:
+ request.destination_parent = destination_parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.move_folder]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ folders.Folder,
+ metadata_type=folders.MoveFolderMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def delete_folder(
+ self,
+ request: Optional[Union[folders.DeleteFolderRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Requests deletion of a folder. The folder is moved into the
+ [DELETE_REQUESTED][google.cloud.resourcemanager.v3.Folder.State.DELETE_REQUESTED]
+ state immediately, and is deleted approximately 30 days later.
+ This method may only be called on an empty folder, where a
+ folder is empty if it doesn't contain any folders or projects in
+ the
+ [ACTIVE][google.cloud.resourcemanager.v3.Folder.State.ACTIVE]
+ state. If called on a folder in
+ [DELETE_REQUESTED][google.cloud.resourcemanager.v3.Folder.State.DELETE_REQUESTED]
+ state the operation will result in a no-op success. The caller
+ must have ``resourcemanager.folders.delete`` permission on the
+ identified folder.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_delete_folder():
+ # Create a client
+ client = resourcemanager_v3.FoldersClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.DeleteFolderRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.delete_folder(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.DeleteFolderRequest, dict]):
+ The request object. The DeleteFolder request message.
+ name (str):
+ Required. The resource name of the folder to be deleted.
+ Must be of the form ``folders/{folder_id}``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Folder` A folder in an organization's resource hierarchy, used to
+ organize that organization's resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a folders.DeleteFolderRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, folders.DeleteFolderRequest):
+ request = folders.DeleteFolderRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.delete_folder]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ folders.Folder,
+ metadata_type=folders.DeleteFolderMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def undelete_folder(
+ self,
+ request: Optional[Union[folders.UndeleteFolderRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Cancels the deletion request for a folder. This method may be
+ called on a folder in any state. If the folder is in the
+ [ACTIVE][google.cloud.resourcemanager.v3.Folder.State.ACTIVE]
+ state the result will be a no-op success. In order to succeed,
+ the folder's parent must be in the
+ [ACTIVE][google.cloud.resourcemanager.v3.Folder.State.ACTIVE]
+ state. In addition, reintroducing the folder into the tree must
+ not violate folder naming, height, and fanout constraints
+ described in the
+ [CreateFolder][google.cloud.resourcemanager.v3.Folders.CreateFolder]
+ documentation. The caller must have
+ ``resourcemanager.folders.undelete`` permission on the
+ identified folder.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_undelete_folder():
+ # Create a client
+ client = resourcemanager_v3.FoldersClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.UndeleteFolderRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.undelete_folder(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.UndeleteFolderRequest, dict]):
+ The request object. The UndeleteFolder request message.
+ name (str):
+ Required. The resource name of the folder to undelete.
+ Must be of the form ``folders/{folder_id}``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Folder` A folder in an organization's resource hierarchy, used to
+ organize that organization's resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a folders.UndeleteFolderRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, folders.UndeleteFolderRequest):
+ request = folders.UndeleteFolderRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.undelete_folder]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ folders.Folder,
+ metadata_type=folders.UndeleteFolderMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.GetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Gets the access control policy for a folder. The returned policy
+ may be empty if no such policy or resource exists. The
+ ``resource`` field should be the folder's resource name, for
+ example: "folders/1234". The caller must have
+ ``resourcemanager.folders.getIamPolicy`` permission on the
+ identified folder.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ def sample_get_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.FoldersClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.GetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = client.get_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.iam.v1.iam_policy_pb2.GetIamPolicyRequest, dict]):
+ The request object. Request message for ``GetIamPolicy`` method.
+ resource (str):
+ REQUIRED: The resource for which the
+ policy is being requested. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+ **JSON example:**
+
+                    {
+                      "bindings": [
+                        {
+                          "role": "roles/resourcemanager.organizationAdmin",
+                          "members": [
+                            "user:mike@example.com",
+                            "group:admins@example.com",
+                            "domain:google.com",
+                            "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+                          ]
+                        },
+                        {
+                          "role": "roles/resourcemanager.organizationViewer",
+                          "members": ["user:eve@example.com"],
+                          "condition": {
+                            "title": "expirable access",
+                            "description": "Does not grant access after Sep 2020",
+                            "expression": "request.time < timestamp('2020-10-01T00:00:00.000Z')"
+                          }
+                        }
+                      ],
+                      "etag": "BwWWja0YfJA=",
+                      "version": 3
+                    }
+
+                **YAML example:**
+
+                    bindings:
+                    - members:
+                      - user:mike@example.com
+                      - group:admins@example.com
+                      - domain:google.com
+                      - serviceAccount:my-project-id@appspot.gserviceaccount.com
+                      role: roles/resourcemanager.organizationAdmin
+                    - members:
+                      - user:eve@example.com
+                      role: roles/resourcemanager.organizationViewer
+                      condition:
+                        title: expirable access
+                        description: Does not grant access after Sep 2020
+                        expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+                    etag: BwWWja0YfJA=
+                    version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ if isinstance(request, dict):
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ request = iam_policy_pb2.GetIamPolicyRequest(**request)
+ elif not request:
+ # Null request, just make one.
+ request = iam_policy_pb2.GetIamPolicyRequest()
+ if resource is not None:
+ request.resource = resource
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_iam_policy]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def set_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.SetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Sets the access control policy on a folder, replacing any
+ existing policy. The ``resource`` field should be the folder's
+ resource name, for example: "folders/1234". The caller must have
+ ``resourcemanager.folders.setIamPolicy`` permission on the
+ identified folder.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ def sample_set_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.FoldersClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.SetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = client.set_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.iam.v1.iam_policy_pb2.SetIamPolicyRequest, dict]):
+ The request object. Request message for ``SetIamPolicy`` method.
+ resource (str):
+ REQUIRED: The resource for which the
+ policy is being specified. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+ **JSON example:**
+
+                    {
+                      "bindings": [
+                        {
+                          "role": "roles/resourcemanager.organizationAdmin",
+                          "members": [
+                            "user:mike@example.com",
+                            "group:admins@example.com",
+                            "domain:google.com",
+                            "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+                          ]
+                        },
+                        {
+                          "role": "roles/resourcemanager.organizationViewer",
+                          "members": ["user:eve@example.com"],
+                          "condition": {
+                            "title": "expirable access",
+                            "description": "Does not grant access after Sep 2020",
+                            "expression": "request.time < timestamp('2020-10-01T00:00:00.000Z')"
+                          }
+                        }
+                      ],
+                      "etag": "BwWWja0YfJA=",
+                      "version": 3
+                    }
+
+                **YAML example:**
+
+                    bindings:
+                    - members:
+                      - user:mike@example.com
+                      - group:admins@example.com
+                      - domain:google.com
+                      - serviceAccount:my-project-id@appspot.gserviceaccount.com
+                      role: roles/resourcemanager.organizationAdmin
+                    - members:
+                      - user:eve@example.com
+                      role: roles/resourcemanager.organizationViewer
+                      condition:
+                        title: expirable access
+                        description: Does not grant access after Sep 2020
+                        expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+                    etag: BwWWja0YfJA=
+                    version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ if isinstance(request, dict):
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ request = iam_policy_pb2.SetIamPolicyRequest(**request)
+ elif not request:
+ # Null request, just make one.
+ request = iam_policy_pb2.SetIamPolicyRequest()
+ if resource is not None:
+ request.resource = resource
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.set_iam_policy]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def test_iam_permissions(
+ self,
+ request: Optional[Union[iam_policy_pb2.TestIamPermissionsRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ permissions: Optional[MutableSequence[str]] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ r"""Returns permissions that a caller has on the specified folder.
+ The ``resource`` field should be the folder's resource name, for
+ example: "folders/1234".
+
+ There are no permissions required for making this API call.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ def sample_test_iam_permissions():
+ # Create a client
+ client = resourcemanager_v3.FoldersClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.TestIamPermissionsRequest(
+ resource="resource_value",
+ permissions=['permissions_value1', 'permissions_value2'],
+ )
+
+ # Make the request
+ response = client.test_iam_permissions(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.iam.v1.iam_policy_pb2.TestIamPermissionsRequest, dict]):
+ The request object. Request message for ``TestIamPermissions`` method.
+ resource (str):
+ REQUIRED: The resource for which the
+ policy detail is being requested. See
+ the operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ permissions (MutableSequence[str]):
+ The set of permissions to check for the ``resource``.
+ Permissions with wildcards (such as '*' or 'storage.*')
+ are not allowed. For more information see `IAM
+ Overview <https://cloud.google.com/iam/docs/overview#permissions>`__.
+
+ This corresponds to the ``permissions`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.iam_policy_pb2.TestIamPermissionsResponse:
+ Response message for TestIamPermissions method.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource, permissions])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ if isinstance(request, dict):
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ request = iam_policy_pb2.TestIamPermissionsRequest(**request)
+ elif not request:
+ # Null request, just make one.
+ request = iam_policy_pb2.TestIamPermissionsRequest()
+ if resource is not None:
+ request.resource = resource
+ if permissions:
+ request.permissions.extend(permissions)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.test_iam_permissions]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def __enter__(self) -> "FoldersClient":
+ return self
+
+ def __exit__(self, type, value, traceback):
+ """Releases underlying transport's resources.
+
+ .. warning::
+ ONLY use as a context manager if the transport is NOT shared
+ with other clients! Exiting the with block will CLOSE the transport
+ and may cause errors in other clients!
+ """
+ self.transport.close()
+
+ def get_operation(
+ self,
+ request: Optional[operations_pb2.GetOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Gets the latest state of a long-running operation.
+
+ Args:
+ request (:class:`~.operations_pb2.GetOperationRequest`):
+ The request object. Request message for
+ `GetOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.Operation:
+ An ``Operation`` object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.GetOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._transport.get_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("FoldersClient",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/pagers.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/pagers.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/pagers.py
@@ -0,0 +1,283 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import (
+ Any,
+ AsyncIterator,
+ Awaitable,
+ Callable,
+ Iterator,
+ Optional,
+ Sequence,
+ Tuple,
+)
+
+from google.cloud.resourcemanager_v3.types import folders
+
+
+class ListFoldersPager:
+ """A pager for iterating through ``list_folders`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.ListFoldersResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``folders`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListFolders`` requests and continue to iterate
+ through the ``folders`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.ListFoldersResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., folders.ListFoldersResponse],
+ request: folders.ListFoldersRequest,
+ response: folders.ListFoldersResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.ListFoldersRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.ListFoldersResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = folders.ListFoldersRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[folders.ListFoldersResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[folders.Folder]:
+ for page in self.pages:
+ yield from page.folders
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListFoldersAsyncPager:
+ """A pager for iterating through ``list_folders`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.ListFoldersResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``folders`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListFolders`` requests and continue to iterate
+ through the ``folders`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.ListFoldersResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[folders.ListFoldersResponse]],
+ request: folders.ListFoldersRequest,
+ response: folders.ListFoldersResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.ListFoldersRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.ListFoldersResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = folders.ListFoldersRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[folders.ListFoldersResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[folders.Folder]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.folders:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class SearchFoldersPager:
+ """A pager for iterating through ``search_folders`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.SearchFoldersResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``folders`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``SearchFolders`` requests and continue to iterate
+ through the ``folders`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.SearchFoldersResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., folders.SearchFoldersResponse],
+ request: folders.SearchFoldersRequest,
+ response: folders.SearchFoldersResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.SearchFoldersRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.SearchFoldersResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = folders.SearchFoldersRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[folders.SearchFoldersResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[folders.Folder]:
+ for page in self.pages:
+ yield from page.folders
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class SearchFoldersAsyncPager:
+ """A pager for iterating through ``search_folders`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.SearchFoldersResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``folders`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``SearchFolders`` requests and continue to iterate
+ through the ``folders`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.SearchFoldersResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[folders.SearchFoldersResponse]],
+ request: folders.SearchFoldersRequest,
+ response: folders.SearchFoldersResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.SearchFoldersRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.SearchFoldersResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = folders.SearchFoldersRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[folders.SearchFoldersResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[folders.Folder]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.folders:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/transports/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/transports/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/transports/__init__.py
@@ -0,0 +1,36 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+from typing import Dict, Type
+
+from .base import FoldersTransport
+from .grpc import FoldersGrpcTransport
+from .grpc_asyncio import FoldersGrpcAsyncIOTransport
+from .rest import FoldersRestInterceptor, FoldersRestTransport
+
+# Compile a registry of transports.
+_transport_registry = OrderedDict() # type: Dict[str, Type[FoldersTransport]]
+_transport_registry["grpc"] = FoldersGrpcTransport
+_transport_registry["grpc_asyncio"] = FoldersGrpcAsyncIOTransport
+_transport_registry["rest"] = FoldersRestTransport
+
+__all__ = (
+ "FoldersTransport",
+ "FoldersGrpcTransport",
+ "FoldersGrpcAsyncIOTransport",
+ "FoldersRestTransport",
+ "FoldersRestInterceptor",
+)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/transports/base.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/transports/base.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/transports/base.py
@@ -0,0 +1,344 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import abc
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Union
+
+import google.api_core
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1, operations_v1
+from google.api_core import retry as retries
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+from google.cloud.resourcemanager_v3.types import folders
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+class FoldersTransport(abc.ABC):
+ """Abstract transport class for Folders."""
+
+ AUTH_SCOPES = (
+ "https://www.googleapis.com/auth/cloud-platform",
+ "https://www.googleapis.com/auth/cloud-platform.read-only",
+ )
+
+ DEFAULT_HOST: str = "cloudresourcemanager.googleapis.com"
+
+ def __init__(
+ self,
+ *,
+ host: str = DEFAULT_HOST,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ **kwargs,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+ scopes (Optional[Sequence[str]]): A list of scopes.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ """
+
+ scopes_kwargs = {"scopes": scopes, "default_scopes": self.AUTH_SCOPES}
+
+ # Save the scopes.
+ self._scopes = scopes
+
+ # If no credentials are provided, then determine the appropriate
+ # defaults.
+ if credentials and credentials_file:
+ raise core_exceptions.DuplicateCredentialArgs(
+ "'credentials_file' and 'credentials' are mutually exclusive"
+ )
+
+ if credentials_file is not None:
+ credentials, _ = google.auth.load_credentials_from_file(
+ credentials_file, **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ elif credentials is None:
+ credentials, _ = google.auth.default(
+ **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ # Don't apply audience if the credentials file passed from user.
+ if hasattr(credentials, "with_gdch_audience"):
+ credentials = credentials.with_gdch_audience(
+ api_audience if api_audience else host
+ )
+
+ # If the credentials are service account credentials, then always try to use self signed JWT.
+ if (
+ always_use_jwt_access
+ and isinstance(credentials, service_account.Credentials)
+ and hasattr(service_account.Credentials, "with_always_use_jwt_access")
+ ):
+ credentials = credentials.with_always_use_jwt_access(True)
+
+ # Save the credentials.
+ self._credentials = credentials
+
+ # Save the hostname. Default to port 443 (HTTPS) if none is specified.
+ if ":" not in host:
+ host += ":443"
+ self._host = host
+
+ def _prep_wrapped_messages(self, client_info):
+ # Precompute the wrapped methods.
+ self._wrapped_methods = {
+ self.get_folder: gapic_v1.method.wrap_method(
+ self.get_folder,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.list_folders: gapic_v1.method.wrap_method(
+ self.list_folders,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.search_folders: gapic_v1.method.wrap_method(
+ self.search_folders,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.create_folder: gapic_v1.method.wrap_method(
+ self.create_folder,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.update_folder: gapic_v1.method.wrap_method(
+ self.update_folder,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.move_folder: gapic_v1.method.wrap_method(
+ self.move_folder,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.delete_folder: gapic_v1.method.wrap_method(
+ self.delete_folder,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.undelete_folder: gapic_v1.method.wrap_method(
+ self.undelete_folder,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.get_iam_policy: gapic_v1.method.wrap_method(
+ self.get_iam_policy,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.set_iam_policy: gapic_v1.method.wrap_method(
+ self.set_iam_policy,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.test_iam_permissions: gapic_v1.method.wrap_method(
+ self.test_iam_permissions,
+ default_timeout=None,
+ client_info=client_info,
+ ),
+ }
+
+ def close(self):
+ """Closes resources associated with the transport.
+
+ .. warning::
+ Only call this method if the transport is NOT shared
+ with other clients - this may cause errors in other clients!
+ """
+ raise NotImplementedError()
+
+ @property
+ def operations_client(self):
+ """Return the client designed to process long-running operations."""
+ raise NotImplementedError()
+
+ @property
+ def get_folder(
+ self,
+ ) -> Callable[
+ [folders.GetFolderRequest], Union[folders.Folder, Awaitable[folders.Folder]]
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_folders(
+ self,
+ ) -> Callable[
+ [folders.ListFoldersRequest],
+ Union[folders.ListFoldersResponse, Awaitable[folders.ListFoldersResponse]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def search_folders(
+ self,
+ ) -> Callable[
+ [folders.SearchFoldersRequest],
+ Union[folders.SearchFoldersResponse, Awaitable[folders.SearchFoldersResponse]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def create_folder(
+ self,
+ ) -> Callable[
+ [folders.CreateFolderRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def update_folder(
+ self,
+ ) -> Callable[
+ [folders.UpdateFolderRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def move_folder(
+ self,
+ ) -> Callable[
+ [folders.MoveFolderRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def delete_folder(
+ self,
+ ) -> Callable[
+ [folders.DeleteFolderRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def undelete_folder(
+ self,
+ ) -> Callable[
+ [folders.UndeleteFolderRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.GetIamPolicyRequest],
+ Union[policy_pb2.Policy, Awaitable[policy_pb2.Policy]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.SetIamPolicyRequest],
+ Union[policy_pb2.Policy, Awaitable[policy_pb2.Policy]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ Union[
+ iam_policy_pb2.TestIamPermissionsResponse,
+ Awaitable[iam_policy_pb2.TestIamPermissionsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[
+ [operations_pb2.GetOperationRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def kind(self) -> str:
+ raise NotImplementedError()
+
+
+__all__ = ("FoldersTransport",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/transports/grpc.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/transports/grpc.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/transports/grpc.py
@@ -0,0 +1,678 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers, operations_v1
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+import grpc # type: ignore
+
+from google.cloud.resourcemanager_v3.types import folders
+
+from .base import DEFAULT_CLIENT_INFO, FoldersTransport
+
+
+class FoldersGrpcTransport(FoldersTransport):
+ """gRPC backend transport for Folders.
+
+ Manages Cloud Platform folder resources.
+ Folders can be used to organize the resources under an
+ organization and to control the policies applied to groups of
+ resources.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _stubs: Dict[str, Callable]
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[grpc.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+ scopes (Optional(Sequence[str])): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ channel (Optional[grpc.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+ self._operations_client: Optional[operations_v1.OperationsClient] = None
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> grpc.Channel:
+ """Create and return a gRPC channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ grpc.Channel: A gRPC channel object.
+
+ Raises:
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+
+ return grpc_helpers.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ @property
+ def grpc_channel(self) -> grpc.Channel:
+ """Return the channel designed to connect to this service."""
+ return self._grpc_channel
+
+ @property
+ def operations_client(self) -> operations_v1.OperationsClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Quick check: Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ self._operations_client = operations_v1.OperationsClient(self.grpc_channel)
+
+ # Return the client from cache.
+ return self._operations_client
+
+ @property
+ def get_folder(self) -> Callable[[folders.GetFolderRequest], folders.Folder]:
+ r"""Return a callable for the get folder method over gRPC.
+
+ Retrieves a folder identified by the supplied resource name.
+ Valid folder resource names have the format
+ ``folders/{folder_id}`` (for example, ``folders/1234``). The
+ caller must have ``resourcemanager.folders.get`` permission on
+ the identified folder.
+
+ Returns:
+ Callable[[~.GetFolderRequest],
+ ~.Folder]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_folder" not in self._stubs:
+ self._stubs["get_folder"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/GetFolder",
+ request_serializer=folders.GetFolderRequest.serialize,
+ response_deserializer=folders.Folder.deserialize,
+ )
+ return self._stubs["get_folder"]
+
+ @property
+ def list_folders(
+ self,
+ ) -> Callable[[folders.ListFoldersRequest], folders.ListFoldersResponse]:
+ r"""Return a callable for the list folders method over gRPC.
+
+ Lists the folders that are direct descendants of supplied parent
+ resource. ``list()`` provides a strongly consistent view of the
+ folders underneath the specified parent resource. ``list()``
+ returns folders sorted based upon the (ascending) lexical
+ ordering of their display_name. The caller must have
+ ``resourcemanager.folders.list`` permission on the identified
+ parent.
+
+ Returns:
+ Callable[[~.ListFoldersRequest],
+ ~.ListFoldersResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_folders" not in self._stubs:
+ self._stubs["list_folders"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/ListFolders",
+ request_serializer=folders.ListFoldersRequest.serialize,
+ response_deserializer=folders.ListFoldersResponse.deserialize,
+ )
+ return self._stubs["list_folders"]
+
+ @property
+ def search_folders(
+ self,
+ ) -> Callable[[folders.SearchFoldersRequest], folders.SearchFoldersResponse]:
+ r"""Return a callable for the search folders method over gRPC.
+
+ Search for folders that match specific filter criteria.
+ ``search()`` provides an eventually consistent view of the
+ folders a user has access to which meet the specified filter
+ criteria.
+
+ This will only return folders on which the caller has the
+ permission ``resourcemanager.folders.get``.
+
+ Returns:
+ Callable[[~.SearchFoldersRequest],
+ ~.SearchFoldersResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "search_folders" not in self._stubs:
+ self._stubs["search_folders"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/SearchFolders",
+ request_serializer=folders.SearchFoldersRequest.serialize,
+ response_deserializer=folders.SearchFoldersResponse.deserialize,
+ )
+ return self._stubs["search_folders"]
+
+ @property
+ def create_folder(
+ self,
+ ) -> Callable[[folders.CreateFolderRequest], operations_pb2.Operation]:
+ r"""Return a callable for the create folder method over gRPC.
+
+ Creates a folder in the resource hierarchy. Returns an
+ ``Operation`` which can be used to track the progress of the
+ folder creation workflow. Upon success, the
+ ``Operation.response`` field will be populated with the created
+ Folder.
+
+ In order to succeed, the addition of this new folder must not
+ violate the folder naming, height, or fanout constraints.
+
+ - The folder's ``display_name`` must be distinct from all other
+ folders that share its parent.
+ - The addition of the folder must not cause the active folder
+ hierarchy to exceed a height of 10. Note, the full active +
+ deleted folder hierarchy is allowed to reach a height of 20;
+ this provides additional headroom when moving folders that
+ contain deleted folders.
+ - The addition of the folder must not cause the total number of
+ folders under its parent to exceed 300.
+
+ If the operation fails due to a folder constraint violation,
+ some errors may be returned by the ``CreateFolder`` request,
+ with status code ``FAILED_PRECONDITION`` and an error
+ description. Other folder constraint violations will be
+ communicated in the ``Operation``, with the specific
+ ``PreconditionFailure`` returned in the details list in the
+ ``Operation.error`` field.
+
+ The caller must have ``resourcemanager.folders.create``
+ permission on the identified parent.
+
+ Returns:
+ Callable[[~.CreateFolderRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_folder" not in self._stubs:
+ self._stubs["create_folder"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/CreateFolder",
+ request_serializer=folders.CreateFolderRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["create_folder"]
+
+ @property
+ def update_folder(
+ self,
+ ) -> Callable[[folders.UpdateFolderRequest], operations_pb2.Operation]:
+ r"""Return a callable for the update folder method over gRPC.
+
+ Updates a folder, changing its ``display_name``. Changes to the
+ folder ``display_name`` will be rejected if they violate either
+ the ``display_name`` formatting rules or the naming constraints
+ described in the
+ [CreateFolder][google.cloud.resourcemanager.v3.Folders.CreateFolder]
+ documentation.
+
+ The folder's ``display_name`` must start and end with a letter
+ or digit, may contain letters, digits, spaces, hyphens and
+ underscores and can be between 3 and 30 characters. This is
+ captured by the regular expression:
+ ``[\p{L}\p{N}][\p{L}\p{N}_- ]{1,28}[\p{L}\p{N}]``. The caller
+ must have ``resourcemanager.folders.update`` permission on the
+ identified folder.
+
+ If the update fails due to the unique name constraint then a
+ ``PreconditionFailure`` explaining this violation will be
+ returned in the Status.details field.
+
+ Returns:
+ Callable[[~.UpdateFolderRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_folder" not in self._stubs:
+ self._stubs["update_folder"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/UpdateFolder",
+ request_serializer=folders.UpdateFolderRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["update_folder"]
+
+ @property
+ def move_folder(
+ self,
+ ) -> Callable[[folders.MoveFolderRequest], operations_pb2.Operation]:
+ r"""Return a callable for the move folder method over gRPC.
+
+ Moves a folder under a new resource parent. Returns an
+ ``Operation`` which can be used to track the progress of the
+ folder move workflow. Upon success, the ``Operation.response``
+ field will be populated with the moved folder. Upon failure, a
+ ``FolderOperationError`` categorizing the failure cause will be
+ returned - if the failure occurs synchronously then the
+ ``FolderOperationError`` will be returned in the
+ ``Status.details`` field. If it occurs asynchronously, then the
+ FolderOperation will be returned in the ``Operation.error``
+ field. In addition, the ``Operation.metadata`` field will be
+ populated with a ``FolderOperation`` message as an aid to
+ stateless clients. Folder moves will be rejected if they violate
+ either the naming, height, or fanout constraints described in
+ the
+ [CreateFolder][google.cloud.resourcemanager.v3.Folders.CreateFolder]
+ documentation. The caller must have
+ ``resourcemanager.folders.move`` permission on the folder's
+ current and proposed new parent.
+
+ Returns:
+ Callable[[~.MoveFolderRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "move_folder" not in self._stubs:
+ self._stubs["move_folder"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/MoveFolder",
+ request_serializer=folders.MoveFolderRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["move_folder"]
+
+ @property
+ def delete_folder(
+ self,
+ ) -> Callable[[folders.DeleteFolderRequest], operations_pb2.Operation]:
+ r"""Return a callable for the delete folder method over gRPC.
+
+ Requests deletion of a folder. The folder is moved into the
+ [DELETE_REQUESTED][google.cloud.resourcemanager.v3.Folder.State.DELETE_REQUESTED]
+ state immediately, and is deleted approximately 30 days later.
+ This method may only be called on an empty folder, where a
+ folder is empty if it doesn't contain any folders or projects in
+ the
+ [ACTIVE][google.cloud.resourcemanager.v3.Folder.State.ACTIVE]
+ state. If called on a folder in
+ [DELETE_REQUESTED][google.cloud.resourcemanager.v3.Folder.State.DELETE_REQUESTED]
+ state the operation will result in a no-op success. The caller
+ must have ``resourcemanager.folders.delete`` permission on the
+ identified folder.
+
+ Returns:
+ Callable[[~.DeleteFolderRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_folder" not in self._stubs:
+ self._stubs["delete_folder"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/DeleteFolder",
+ request_serializer=folders.DeleteFolderRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["delete_folder"]
+
+ @property
+ def undelete_folder(
+ self,
+ ) -> Callable[[folders.UndeleteFolderRequest], operations_pb2.Operation]:
+ r"""Return a callable for the undelete folder method over gRPC.
+
+ Cancels the deletion request for a folder. This method may be
+ called on a folder in any state. If the folder is in the
+ [ACTIVE][google.cloud.resourcemanager.v3.Folder.State.ACTIVE]
+ state the result will be a no-op success. In order to succeed,
+ the folder's parent must be in the
+ [ACTIVE][google.cloud.resourcemanager.v3.Folder.State.ACTIVE]
+ state. In addition, reintroducing the folder into the tree must
+ not violate folder naming, height, and fanout constraints
+ described in the
+ [CreateFolder][google.cloud.resourcemanager.v3.Folders.CreateFolder]
+ documentation. The caller must have
+ ``resourcemanager.folders.undelete`` permission on the
+ identified folder.
+
+ Returns:
+ Callable[[~.UndeleteFolderRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "undelete_folder" not in self._stubs:
+ self._stubs["undelete_folder"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/UndeleteFolder",
+ request_serializer=folders.UndeleteFolderRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["undelete_folder"]
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.GetIamPolicyRequest], policy_pb2.Policy]:
+ r"""Return a callable for the get iam policy method over gRPC.
+
+ Gets the access control policy for a folder. The returned policy
+ may be empty if no such policy or resource exists. The
+ ``resource`` field should be the folder's resource name, for
+ example: "folders/1234". The caller must have
+ ``resourcemanager.folders.getIamPolicy`` permission on the
+ identified folder.
+
+ Returns:
+ Callable[[~.GetIamPolicyRequest],
+ ~.Policy]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_iam_policy" not in self._stubs:
+ self._stubs["get_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/GetIamPolicy",
+ request_serializer=iam_policy_pb2.GetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["get_iam_policy"]
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.SetIamPolicyRequest], policy_pb2.Policy]:
+ r"""Return a callable for the set iam policy method over gRPC.
+
+ Sets the access control policy on a folder, replacing any
+ existing policy. The ``resource`` field should be the folder's
+ resource name, for example: "folders/1234". The caller must have
+ ``resourcemanager.folders.setIamPolicy`` permission on the
+ identified folder.
+
+ Returns:
+ Callable[[~.SetIamPolicyRequest],
+ ~.Policy]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "set_iam_policy" not in self._stubs:
+ self._stubs["set_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/SetIamPolicy",
+ request_serializer=iam_policy_pb2.SetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["set_iam_policy"]
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ iam_policy_pb2.TestIamPermissionsResponse,
+ ]:
+ r"""Return a callable for the test iam permissions method over gRPC.
+
+ Returns permissions that a caller has on the specified folder.
+ The ``resource`` field should be the folder's resource name, for
+ example: "folders/1234".
+
+ There are no permissions required for making this API call.
+
+ Returns:
+ Callable[[~.TestIamPermissionsRequest],
+ ~.TestIamPermissionsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "test_iam_permissions" not in self._stubs:
+ self._stubs["test_iam_permissions"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/TestIamPermissions",
+ request_serializer=iam_policy_pb2.TestIamPermissionsRequest.SerializeToString,
+ response_deserializer=iam_policy_pb2.TestIamPermissionsResponse.FromString,
+ )
+ return self._stubs["test_iam_permissions"]
+
+ def close(self):
+ self.grpc_channel.close()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]:
+ r"""Return a callable for the get_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_operation" not in self._stubs:
+ self._stubs["get_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/GetOperation",
+ request_serializer=operations_pb2.GetOperationRequest.SerializeToString,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["get_operation"]
+
+ @property
+ def kind(self) -> str:
+ return "grpc"
+
+
+__all__ = ("FoldersGrpcTransport",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/transports/grpc_asyncio.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/transports/grpc_asyncio.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/transports/grpc_asyncio.py
@@ -0,0 +1,683 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers_async, operations_v1
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+import grpc # type: ignore
+from grpc.experimental import aio # type: ignore
+
+from google.cloud.resourcemanager_v3.types import folders
+
+from .base import DEFAULT_CLIENT_INFO, FoldersTransport
+from .grpc import FoldersGrpcTransport
+
+
+class FoldersGrpcAsyncIOTransport(FoldersTransport):
+ """gRPC AsyncIO backend transport for Folders.
+
+ Manages Cloud Platform folder resources.
+ Folders can be used to organize the resources under an
+ organization and to control the policies applied to groups of
+ resources.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _grpc_channel: aio.Channel
+ _stubs: Dict[str, Callable] = {}
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> aio.Channel:
+ """Create and return a gRPC AsyncIO channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ aio.Channel: A gRPC AsyncIO channel object.
+ """
+
+ return grpc_helpers_async.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[aio.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ channel (Optional[aio.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+ self._operations_client: Optional[operations_v1.OperationsAsyncClient] = None
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def grpc_channel(self) -> aio.Channel:
+ """Create the channel designed to connect to this service.
+
+ This property caches on the instance; repeated calls return
+ the same channel.
+ """
+ # Return the channel from cache.
+ return self._grpc_channel
+
+ @property
+ def operations_client(self) -> operations_v1.OperationsAsyncClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Quick check: Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ self._operations_client = operations_v1.OperationsAsyncClient(
+ self.grpc_channel
+ )
+
+ # Return the client from cache.
+ return self._operations_client
+
+ @property
+ def get_folder(
+ self,
+ ) -> Callable[[folders.GetFolderRequest], Awaitable[folders.Folder]]:
+ r"""Return a callable for the get folder method over gRPC.
+
+ Retrieves a folder identified by the supplied resource name.
+ Valid folder resource names have the format
+ ``folders/{folder_id}`` (for example, ``folders/1234``). The
+ caller must have ``resourcemanager.folders.get`` permission on
+ the identified folder.
+
+ Returns:
+ Callable[[~.GetFolderRequest],
+ Awaitable[~.Folder]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_folder" not in self._stubs:
+ self._stubs["get_folder"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/GetFolder",
+ request_serializer=folders.GetFolderRequest.serialize,
+ response_deserializer=folders.Folder.deserialize,
+ )
+ return self._stubs["get_folder"]
+
+ @property
+ def list_folders(
+ self,
+ ) -> Callable[[folders.ListFoldersRequest], Awaitable[folders.ListFoldersResponse]]:
+ r"""Return a callable for the list folders method over gRPC.
+
+ Lists the folders that are direct descendants of supplied parent
+ resource. ``list()`` provides a strongly consistent view of the
+ folders underneath the specified parent resource. ``list()``
+ returns folders sorted based upon the (ascending) lexical
+ ordering of their display_name. The caller must have
+ ``resourcemanager.folders.list`` permission on the identified
+ parent.
+
+ Returns:
+ Callable[[~.ListFoldersRequest],
+ Awaitable[~.ListFoldersResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_folders" not in self._stubs:
+ self._stubs["list_folders"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/ListFolders",
+ request_serializer=folders.ListFoldersRequest.serialize,
+ response_deserializer=folders.ListFoldersResponse.deserialize,
+ )
+ return self._stubs["list_folders"]
+
+ @property
+ def search_folders(
+ self,
+ ) -> Callable[
+ [folders.SearchFoldersRequest], Awaitable[folders.SearchFoldersResponse]
+ ]:
+ r"""Return a callable for the search folders method over gRPC.
+
+ Search for folders that match specific filter criteria.
+ ``search()`` provides an eventually consistent view of the
+ folders a user has access to which meet the specified filter
+ criteria.
+
+ This will only return folders on which the caller has the
+ permission ``resourcemanager.folders.get``.
+
+ Returns:
+ Callable[[~.SearchFoldersRequest],
+ Awaitable[~.SearchFoldersResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "search_folders" not in self._stubs:
+ self._stubs["search_folders"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/SearchFolders",
+ request_serializer=folders.SearchFoldersRequest.serialize,
+ response_deserializer=folders.SearchFoldersResponse.deserialize,
+ )
+ return self._stubs["search_folders"]
+
+ @property
+ def create_folder(
+ self,
+ ) -> Callable[[folders.CreateFolderRequest], Awaitable[operations_pb2.Operation]]:
+ r"""Return a callable for the create folder method over gRPC.
+
+ Creates a folder in the resource hierarchy. Returns an
+ ``Operation`` which can be used to track the progress of the
+ folder creation workflow. Upon success, the
+ ``Operation.response`` field will be populated with the created
+ Folder.
+
+ In order to succeed, the addition of this new folder must not
+ violate the folder naming, height, or fanout constraints.
+
+ - The folder's ``display_name`` must be distinct from all other
+ folders that share its parent.
+ - The addition of the folder must not cause the active folder
+ hierarchy to exceed a height of 10. Note, the full active +
+ deleted folder hierarchy is allowed to reach a height of 20;
+ this provides additional headroom when moving folders that
+ contain deleted folders.
+ - The addition of the folder must not cause the total number of
+ folders under its parent to exceed 300.
+
+ If the operation fails due to a folder constraint violation,
+ some errors may be returned by the ``CreateFolder`` request,
+ with status code ``FAILED_PRECONDITION`` and an error
+ description. Other folder constraint violations will be
+ communicated in the ``Operation``, with the specific
+ ``PreconditionFailure`` returned in the details list in the
+ ``Operation.error`` field.
+
+ The caller must have ``resourcemanager.folders.create``
+ permission on the identified parent.
+
+ Returns:
+ Callable[[~.CreateFolderRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_folder" not in self._stubs:
+ self._stubs["create_folder"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/CreateFolder",
+ request_serializer=folders.CreateFolderRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["create_folder"]
+
+ @property
+ def update_folder(
+ self,
+ ) -> Callable[[folders.UpdateFolderRequest], Awaitable[operations_pb2.Operation]]:
+ r"""Return a callable for the update folder method over gRPC.
+
+ Updates a folder, changing its ``display_name``. Changes to the
+ folder ``display_name`` will be rejected if they violate either
+ the ``display_name`` formatting rules or the naming constraints
+ described in the
+ [CreateFolder][google.cloud.resourcemanager.v3.Folders.CreateFolder]
+ documentation.
+
+ The folder's ``display_name`` must start and end with a letter
+ or digit, may contain letters, digits, spaces, hyphens and
+ underscores and can be between 3 and 30 characters. This is
+ captured by the regular expression:
+ ``[\p{L}\p{N}][\p{L}\p{N}_- ]{1,28}[\p{L}\p{N}]``. The caller
+ must have ``resourcemanager.folders.update`` permission on the
+ identified folder.
+
+ If the update fails due to the unique name constraint then a
+ ``PreconditionFailure`` explaining this violation will be
+ returned in the Status.details field.
+
+ Returns:
+ Callable[[~.UpdateFolderRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_folder" not in self._stubs:
+ self._stubs["update_folder"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/UpdateFolder",
+ request_serializer=folders.UpdateFolderRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["update_folder"]
+
+ @property
+ def move_folder(
+ self,
+ ) -> Callable[[folders.MoveFolderRequest], Awaitable[operations_pb2.Operation]]:
+ r"""Return a callable for the move folder method over gRPC.
+
+ Moves a folder under a new resource parent. Returns an
+ ``Operation`` which can be used to track the progress of the
+ folder move workflow. Upon success, the ``Operation.response``
+ field will be populated with the moved folder. Upon failure, a
+ ``FolderOperationError`` categorizing the failure cause will be
+ returned - if the failure occurs synchronously then the
+ ``FolderOperationError`` will be returned in the
+ ``Status.details`` field. If it occurs asynchronously, then the
+ FolderOperation will be returned in the ``Operation.error``
+ field. In addition, the ``Operation.metadata`` field will be
+ populated with a ``FolderOperation`` message as an aid to
+ stateless clients. Folder moves will be rejected if they violate
+ either the naming, height, or fanout constraints described in
+ the
+ [CreateFolder][google.cloud.resourcemanager.v3.Folders.CreateFolder]
+ documentation. The caller must have
+ ``resourcemanager.folders.move`` permission on the folder's
+ current and proposed new parent.
+
+ Returns:
+ Callable[[~.MoveFolderRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "move_folder" not in self._stubs:
+ self._stubs["move_folder"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/MoveFolder",
+ request_serializer=folders.MoveFolderRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["move_folder"]
+
+ @property
+ def delete_folder(
+ self,
+ ) -> Callable[[folders.DeleteFolderRequest], Awaitable[operations_pb2.Operation]]:
+ r"""Return a callable for the delete folder method over gRPC.
+
+ Requests deletion of a folder. The folder is moved into the
+ [DELETE_REQUESTED][google.cloud.resourcemanager.v3.Folder.State.DELETE_REQUESTED]
+ state immediately, and is deleted approximately 30 days later.
+ This method may only be called on an empty folder, where a
+ folder is empty if it doesn't contain any folders or projects in
+ the
+ [ACTIVE][google.cloud.resourcemanager.v3.Folder.State.ACTIVE]
+ state. If called on a folder in
+ [DELETE_REQUESTED][google.cloud.resourcemanager.v3.Folder.State.DELETE_REQUESTED]
+ state the operation will result in a no-op success. The caller
+ must have ``resourcemanager.folders.delete`` permission on the
+ identified folder.
+
+ Returns:
+ Callable[[~.DeleteFolderRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_folder" not in self._stubs:
+ self._stubs["delete_folder"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/DeleteFolder",
+ request_serializer=folders.DeleteFolderRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["delete_folder"]
+
+ @property
+ def undelete_folder(
+ self,
+ ) -> Callable[[folders.UndeleteFolderRequest], Awaitable[operations_pb2.Operation]]:
+ r"""Return a callable for the undelete folder method over gRPC.
+
+ Cancels the deletion request for a folder. This method may be
+ called on a folder in any state. If the folder is in the
+ [ACTIVE][google.cloud.resourcemanager.v3.Folder.State.ACTIVE]
+ state the result will be a no-op success. In order to succeed,
+ the folder's parent must be in the
+ [ACTIVE][google.cloud.resourcemanager.v3.Folder.State.ACTIVE]
+ state. In addition, reintroducing the folder into the tree must
+ not violate folder naming, height, and fanout constraints
+ described in the
+ [CreateFolder][google.cloud.resourcemanager.v3.Folders.CreateFolder]
+ documentation. The caller must have
+ ``resourcemanager.folders.undelete`` permission on the
+ identified folder.
+
+ Returns:
+ Callable[[~.UndeleteFolderRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "undelete_folder" not in self._stubs:
+ self._stubs["undelete_folder"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/UndeleteFolder",
+ request_serializer=folders.UndeleteFolderRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["undelete_folder"]
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.GetIamPolicyRequest], Awaitable[policy_pb2.Policy]]:
+ r"""Return a callable for the get iam policy method over gRPC.
+
+ Gets the access control policy for a folder. The returned policy
+ may be empty if no such policy or resource exists. The
+ ``resource`` field should be the folder's resource name, for
+ example: "folders/1234". The caller must have
+ ``resourcemanager.folders.getIamPolicy`` permission on the
+ identified folder.
+
+ Returns:
+ Callable[[~.GetIamPolicyRequest],
+ Awaitable[~.Policy]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_iam_policy" not in self._stubs:
+ self._stubs["get_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/GetIamPolicy",
+ request_serializer=iam_policy_pb2.GetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["get_iam_policy"]
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.SetIamPolicyRequest], Awaitable[policy_pb2.Policy]]:
+ r"""Return a callable for the set iam policy method over gRPC.
+
+ Sets the access control policy on a folder, replacing any
+ existing policy. The ``resource`` field should be the folder's
+ resource name, for example: "folders/1234". The caller must have
+ ``resourcemanager.folders.setIamPolicy`` permission on the
+ identified folder.
+
+ Returns:
+ Callable[[~.SetIamPolicyRequest],
+ Awaitable[~.Policy]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "set_iam_policy" not in self._stubs:
+ self._stubs["set_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/SetIamPolicy",
+ request_serializer=iam_policy_pb2.SetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["set_iam_policy"]
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ Awaitable[iam_policy_pb2.TestIamPermissionsResponse],
+ ]:
+ r"""Return a callable for the test iam permissions method over gRPC.
+
+ Returns permissions that a caller has on the specified folder.
+ The ``resource`` field should be the folder's resource name, for
+ example: "folders/1234".
+
+ There are no permissions required for making this API call.
+
+ Returns:
+ Callable[[~.TestIamPermissionsRequest],
+ Awaitable[~.TestIamPermissionsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "test_iam_permissions" not in self._stubs:
+ self._stubs["test_iam_permissions"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Folders/TestIamPermissions",
+ request_serializer=iam_policy_pb2.TestIamPermissionsRequest.SerializeToString,
+ response_deserializer=iam_policy_pb2.TestIamPermissionsResponse.FromString,
+ )
+ return self._stubs["test_iam_permissions"]
+
+ def close(self):
+ return self.grpc_channel.close()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]:
+ r"""Return a callable for the get_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_operation" not in self._stubs:
+ self._stubs["get_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/GetOperation",
+ request_serializer=operations_pb2.GetOperationRequest.SerializeToString,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["get_operation"]
+
+
+__all__ = ("FoldersGrpcAsyncIOTransport",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/transports/rest.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/transports/rest.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/folders/transports/rest.py
@@ -0,0 +1,1898 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import dataclasses
+import json # type: ignore
+import re
+from typing import Any, Callable, Dict, List, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import (
+ gapic_v1,
+ operations_v1,
+ path_template,
+ rest_helpers,
+ rest_streaming,
+)
+from google.api_core import exceptions as core_exceptions
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.auth.transport.requests import AuthorizedSession # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import json_format
+import grpc # type: ignore
+from requests import __version__ as requests_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.types import folders
+
+from .base import DEFAULT_CLIENT_INFO as BASE_DEFAULT_CLIENT_INFO
+from .base import FoldersTransport
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=BASE_DEFAULT_CLIENT_INFO.gapic_version,
+ grpc_version=None,
+ rest_version=requests_version,
+)
+
+
+class FoldersRestInterceptor:
+ """Interceptor for Folders.
+
+ Interceptors are used to manipulate requests, request metadata, and responses
+ in arbitrary ways.
+ Example use cases include:
+ * Logging
+ * Verifying requests according to service or custom semantics
+ * Stripping extraneous information from responses
+
+ These use cases and more can be enabled by injecting an
+ instance of a custom subclass when constructing the FoldersRestTransport.
+
+ .. code-block:: python
+ class MyCustomFoldersInterceptor(FoldersRestInterceptor):
+ def pre_create_folder(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_create_folder(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_delete_folder(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_delete_folder(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_folder(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_folder(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_iam_policy(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_iam_policy(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_folders(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_folders(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_move_folder(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_move_folder(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_search_folders(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_search_folders(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_set_iam_policy(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_set_iam_policy(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_test_iam_permissions(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_test_iam_permissions(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_undelete_folder(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_undelete_folder(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_update_folder(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_update_folder(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ transport = FoldersRestTransport(interceptor=MyCustomFoldersInterceptor())
+ client = FoldersClient(transport=transport)
+
+
+ """
+
+ def pre_create_folder(
+ self, request: folders.CreateFolderRequest, metadata: Sequence[Tuple[str, str]]
+ ) -> Tuple[folders.CreateFolderRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for create_folder
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Folders server.
+ """
+ return request, metadata
+
+ def post_create_folder(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for create_folder
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Folders server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_delete_folder(
+ self, request: folders.DeleteFolderRequest, metadata: Sequence[Tuple[str, str]]
+ ) -> Tuple[folders.DeleteFolderRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for delete_folder
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Folders server.
+ """
+ return request, metadata
+
+ def post_delete_folder(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for delete_folder
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Folders server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_folder(
+ self, request: folders.GetFolderRequest, metadata: Sequence[Tuple[str, str]]
+ ) -> Tuple[folders.GetFolderRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_folder
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Folders server.
+ """
+ return request, metadata
+
+ def post_get_folder(self, response: folders.Folder) -> folders.Folder:
+ """Post-rpc interceptor for get_folder
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Folders server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_iam_policy(
+ self,
+ request: iam_policy_pb2.GetIamPolicyRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[iam_policy_pb2.GetIamPolicyRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_iam_policy
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Folders server.
+ """
+ return request, metadata
+
+ def post_get_iam_policy(self, response: policy_pb2.Policy) -> policy_pb2.Policy:
+ """Post-rpc interceptor for get_iam_policy
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Folders server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_folders(
+ self, request: folders.ListFoldersRequest, metadata: Sequence[Tuple[str, str]]
+ ) -> Tuple[folders.ListFoldersRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_folders
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Folders server.
+ """
+ return request, metadata
+
+ def post_list_folders(
+ self, response: folders.ListFoldersResponse
+ ) -> folders.ListFoldersResponse:
+ """Post-rpc interceptor for list_folders
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Folders server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_move_folder(
+ self, request: folders.MoveFolderRequest, metadata: Sequence[Tuple[str, str]]
+ ) -> Tuple[folders.MoveFolderRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for move_folder
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Folders server.
+ """
+ return request, metadata
+
+ def post_move_folder(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for move_folder
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Folders server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_search_folders(
+ self, request: folders.SearchFoldersRequest, metadata: Sequence[Tuple[str, str]]
+ ) -> Tuple[folders.SearchFoldersRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for search_folders
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Folders server.
+ """
+ return request, metadata
+
+ def post_search_folders(
+ self, response: folders.SearchFoldersResponse
+ ) -> folders.SearchFoldersResponse:
+ """Post-rpc interceptor for search_folders
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Folders server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_set_iam_policy(
+ self,
+ request: iam_policy_pb2.SetIamPolicyRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[iam_policy_pb2.SetIamPolicyRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for set_iam_policy
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Folders server.
+ """
+ return request, metadata
+
+ def post_set_iam_policy(self, response: policy_pb2.Policy) -> policy_pb2.Policy:
+ """Post-rpc interceptor for set_iam_policy
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Folders server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_test_iam_permissions(
+ self,
+ request: iam_policy_pb2.TestIamPermissionsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[iam_policy_pb2.TestIamPermissionsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for test_iam_permissions
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Folders server.
+ """
+ return request, metadata
+
+ def post_test_iam_permissions(
+ self, response: iam_policy_pb2.TestIamPermissionsResponse
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ """Post-rpc interceptor for test_iam_permissions
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Folders server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_undelete_folder(
+ self,
+ request: folders.UndeleteFolderRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[folders.UndeleteFolderRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for undelete_folder
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Folders server.
+ """
+ return request, metadata
+
+ def post_undelete_folder(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for undelete_folder
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Folders server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_update_folder(
+ self, request: folders.UpdateFolderRequest, metadata: Sequence[Tuple[str, str]]
+ ) -> Tuple[folders.UpdateFolderRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for update_folder
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Folders server.
+ """
+ return request, metadata
+
+ def post_update_folder(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for update_folder
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Folders server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_operation(
+ self,
+ request: operations_pb2.GetOperationRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[operations_pb2.GetOperationRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_operation
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Folders server.
+ """
+ return request, metadata
+
+ def post_get_operation(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for get_operation
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Folders server but before
+ it is returned to user code.
+ """
+ return response
+
+
+@dataclasses.dataclass
+class FoldersRestStub:
+ _session: AuthorizedSession
+ _host: str
+ _interceptor: FoldersRestInterceptor
+
+
+class FoldersRestTransport(FoldersTransport):
+ """REST backend transport for Folders.
+
+ Manages Cloud Platform folder resources.
+ Folders can be used to organize the resources under an
+ organization and to control the policies applied to groups of
+ resources.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends JSON representations of protocol buffers over HTTP/1.1
+
+ """
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ url_scheme: str = "https",
+ interceptor: Optional[FoldersRestInterceptor] = None,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+ scopes (Optional(Sequence[str])): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Callable[[], Tuple[bytes, bytes]]): Client
+ certificate to configure mutual TLS HTTP channel. It is ignored
+ if ``channel`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you are developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ url_scheme: the protocol scheme for the API endpoint. Normally
+ "https", but for testing or local servers,
+ "http" can be specified.
+ """
+ # Run the base constructor
+ # TODO(yon-mg): resolve other ctor params i.e. scopes, quota, etc.
+ # TODO: When custom host (api_endpoint) is set, `scopes` must *also* be set on the
+ # credentials object
+ maybe_url_match = re.match("^(?P<scheme>http(?:s)?://)?(?P<host>.*)$", host)
+ if maybe_url_match is None:
+ raise ValueError(
+ f"Unexpected hostname structure: {host}"
+ ) # pragma: NO COVER
+
+ url_match_items = maybe_url_match.groupdict()
+
+ host = f"{url_scheme}://{host}" if not url_match_items["scheme"] else host
+
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+ self._session = AuthorizedSession(
+ self._credentials, default_host=self.DEFAULT_HOST
+ )
+ self._operations_client: Optional[operations_v1.AbstractOperationsClient] = None
+ if client_cert_source_for_mtls:
+ self._session.configure_mtls_channel(client_cert_source_for_mtls)
+ self._interceptor = interceptor or FoldersRestInterceptor()
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def operations_client(self) -> operations_v1.AbstractOperationsClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ http_options: Dict[str, List[Dict[str, str]]] = {
+ "google.longrunning.Operations.GetOperation": [
+ {
+ "method": "get",
+ "uri": "/v3/{name=operations/**}",
+ },
+ ],
+ }
+
+ rest_transport = operations_v1.OperationsRestTransport(
+ host=self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ scopes=self._scopes,
+ http_options=http_options,
+ path_prefix="v3",
+ )
+
+ self._operations_client = operations_v1.AbstractOperationsClient(
+ transport=rest_transport
+ )
+
+ # Return the client from cache.
+ return self._operations_client
+
+ class _CreateFolder(FoldersRestStub):
+ def __hash__(self):
+ return hash("CreateFolder")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: folders.CreateFolderRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the create folder method over HTTP.
+
+ Args:
+ request (~.folders.CreateFolderRequest):
+ The request object. The CreateFolder request message.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/folders",
+ "body": "folder",
+ },
+ ]
+ request, metadata = self._interceptor.pre_create_folder(request, metadata)
+ pb_request = folders.CreateFolderRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_create_folder(resp)
+ return resp
+
+ class _DeleteFolder(FoldersRestStub):
+ def __hash__(self):
+ return hash("DeleteFolder")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: folders.DeleteFolderRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the delete folder method over HTTP.
+
+ Args:
+ request (~.folders.DeleteFolderRequest):
+ The request object. The DeleteFolder request message.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "delete",
+ "uri": "/v3/{name=folders/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_delete_folder(request, metadata)
+ pb_request = folders.DeleteFolderRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_delete_folder(resp)
+ return resp
+
+ class _GetFolder(FoldersRestStub):
+ def __hash__(self):
+ return hash("GetFolder")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: folders.GetFolderRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> folders.Folder:
+ r"""Call the get folder method over HTTP.
+
+ Args:
+ request (~.folders.GetFolderRequest):
+ The request object. The GetFolder request message.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.folders.Folder:
+ A folder in an organization's
+ resource hierarchy, used to organize
+ that organization's resources.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/{name=folders/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_folder(request, metadata)
+ pb_request = folders.GetFolderRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = folders.Folder()
+ pb_resp = folders.Folder.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_folder(resp)
+ return resp
+
+ class _GetIamPolicy(FoldersRestStub):
+ def __hash__(self):
+ return hash("GetIamPolicy")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: iam_policy_pb2.GetIamPolicyRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Call the get iam policy method over HTTP.
+
+ Args:
+ request (~.iam_policy_pb2.GetIamPolicyRequest):
+ The request object. Request message for ``GetIamPolicy`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which
+ specifies access controls for Google Cloud resources.
+
+ A ``Policy`` is a collection of ``bindings``. A
+ ``binding`` binds one or more ``members``, or
+ principals, to a single ``role``. Principals can be user
+ accounts, service accounts, Google groups, and domains
+ (such as G Suite). A ``role`` is a named list of
+ permissions; each ``role`` can be an IAM predefined role
+ or a user-created custom role.
+
+ For some types of Google Cloud resources, a ``binding``
+ can also specify a ``condition``, which is a logical
+ expression that allows access to a resource only if the
+ expression evaluates to ``true``. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the `IAM
+ documentation <https://cloud.google.com/iam/help/conditions/resource-policies>`__.
+
+ **JSON example:**
+
+ ::
+
+ {
+ "bindings": [
+ {
+ "role": "roles/resourcemanager.organizationAdmin",
+ "members": [
+ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+ },
+ {
+ "role": "roles/resourcemanager.organizationViewer",
+ "members": [
+ "user:eve@example.com"
+ ],
+ "condition": {
+ "title": "expirable access",
+ "description": "Does not grant access after Sep 2020",
+ "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')",
+ }
+ }
+ ],
+ "etag": "BwWWja0YfJA=",
+ "version": 3
+ }
+
+ **YAML example:**
+
+ ::
+
+ bindings:
+ - members:
+ - user:mike@example.com
+ - group:admins@example.com
+ - domain:google.com
+ - serviceAccount:my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin
+ - members:
+ - user:eve@example.com
+ role: roles/resourcemanager.organizationViewer
+ condition:
+ title: expirable access
+ description: Does not grant access after Sep 2020
+ expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+ etag: BwWWja0YfJA=
+ version: 3
+
+ For a description of IAM and its features, see the `IAM
+ documentation <https://cloud.google.com/iam/docs/>`__.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{resource=folders/*}:getIamPolicy",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_iam_policy(request, metadata)
+ pb_request = request
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = policy_pb2.Policy()
+ pb_resp = resp
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_iam_policy(resp)
+ return resp
+
+ class _ListFolders(FoldersRestStub):
+ def __hash__(self):
+ return hash("ListFolders")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "parent": "",
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: folders.ListFoldersRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> folders.ListFoldersResponse:
+ r"""Call the list folders method over HTTP.
+
+ Args:
+ request (~.folders.ListFoldersRequest):
+ The request object. The ListFolders request message.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.folders.ListFoldersResponse:
+ The ListFolders response message.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/folders",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_folders(request, metadata)
+ pb_request = folders.ListFoldersRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = folders.ListFoldersResponse()
+ pb_resp = folders.ListFoldersResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_folders(resp)
+ return resp
+
+ class _MoveFolder(FoldersRestStub):
+ def __hash__(self):
+ return hash("MoveFolder")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: folders.MoveFolderRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the move folder method over HTTP.
+
+ Args:
+ request (~.folders.MoveFolderRequest):
+ The request object. The MoveFolder request message.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{name=folders/*}:move",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_move_folder(request, metadata)
+ pb_request = folders.MoveFolderRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_move_folder(resp)
+ return resp
+
+ class _SearchFolders(FoldersRestStub):
+ def __hash__(self):
+ return hash("SearchFolders")
+
+ def __call__(
+ self,
+ request: folders.SearchFoldersRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> folders.SearchFoldersResponse:
+ r"""Call the search folders method over HTTP.
+
+ Args:
+ request (~.folders.SearchFoldersRequest):
+ The request object. The request message for searching
+ folders.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.folders.SearchFoldersResponse:
+ The response message for searching
+ folders.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/folders:search",
+ },
+ ]
+ request, metadata = self._interceptor.pre_search_folders(request, metadata)
+ pb_request = folders.SearchFoldersRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = folders.SearchFoldersResponse()
+ pb_resp = folders.SearchFoldersResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_search_folders(resp)
+ return resp
+
+ class _SetIamPolicy(FoldersRestStub):
+ def __hash__(self):
+ return hash("SetIamPolicy")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: iam_policy_pb2.SetIamPolicyRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Call the set iam policy method over HTTP.
+
+ Args:
+ request (~.iam_policy_pb2.SetIamPolicyRequest):
+ The request object. Request message for ``SetIamPolicy`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which
+ specifies access controls for Google Cloud resources.
+
+ A ``Policy`` is a collection of ``bindings``. A
+ ``binding`` binds one or more ``members``, or
+ principals, to a single ``role``. Principals can be user
+ accounts, service accounts, Google groups, and domains
+ (such as G Suite). A ``role`` is a named list of
+ permissions; each ``role`` can be an IAM predefined role
+ or a user-created custom role.
+
+ For some types of Google Cloud resources, a ``binding``
+ can also specify a ``condition``, which is a logical
+ expression that allows access to a resource only if the
+ expression evaluates to ``true``. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the `IAM
+ documentation <https://cloud.google.com/iam/help/conditions/resource-policies>`__.
+
+ **JSON example:**
+
+ ::
+
+ {
+ "bindings": [
+ {
+ "role": "roles/resourcemanager.organizationAdmin",
+ "members": [
+ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+ },
+ {
+ "role": "roles/resourcemanager.organizationViewer",
+ "members": [
+ "user:eve@example.com"
+ ],
+ "condition": {
+ "title": "expirable access",
+ "description": "Does not grant access after Sep 2020",
+ "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')",
+ }
+ }
+ ],
+ "etag": "BwWWja0YfJA=",
+ "version": 3
+ }
+
+ **YAML example:**
+
+ ::
+
+ bindings:
+ - members:
+ - user:mike@example.com
+ - group:admins@example.com
+ - domain:google.com
+ - serviceAccount:my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin
+ - members:
+ - user:eve@example.com
+ role: roles/resourcemanager.organizationViewer
+ condition:
+ title: expirable access
+ description: Does not grant access after Sep 2020
+ expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+ etag: BwWWja0YfJA=
+ version: 3
+
+ For a description of IAM and its features, see the `IAM
+ documentation <https://cloud.google.com/iam/docs/>`__.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{resource=folders/*}:setIamPolicy",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_set_iam_policy(request, metadata)
+ pb_request = request
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = policy_pb2.Policy()
+ pb_resp = resp
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_set_iam_policy(resp)
+ return resp
+
+ class _TestIamPermissions(FoldersRestStub):
+ def __hash__(self):
+ return hash("TestIamPermissions")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: iam_policy_pb2.TestIamPermissionsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ r"""Call the test iam permissions method over HTTP.
+
+ Args:
+ request (~.iam_policy_pb2.TestIamPermissionsRequest):
+ The request object. Request message for ``TestIamPermissions`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.iam_policy_pb2.TestIamPermissionsResponse:
+ Response message for ``TestIamPermissions`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{resource=folders/*}:testIamPermissions",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_test_iam_permissions(
+ request, metadata
+ )
+ pb_request = request
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = iam_policy_pb2.TestIamPermissionsResponse()
+ pb_resp = resp
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_test_iam_permissions(resp)
+ return resp
+
+ class _UndeleteFolder(FoldersRestStub):
+ def __hash__(self):
+ return hash("UndeleteFolder")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: folders.UndeleteFolderRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the undelete folder method over HTTP.
+
+ Args:
+ request (~.folders.UndeleteFolderRequest):
+ The request object. The UndeleteFolder request message.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{name=folders/*}:undelete",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_undelete_folder(request, metadata)
+ pb_request = folders.UndeleteFolderRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_undelete_folder(resp)
+ return resp
+
+ class _UpdateFolder(FoldersRestStub):
+ def __hash__(self):
+ return hash("UpdateFolder")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "updateMask": {},
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: folders.UpdateFolderRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the update folder method over HTTP.
+
+ Args:
+ request (~.folders.UpdateFolderRequest):
+ The request object. The request sent to the
+ [UpdateFolder][google.cloud.resourcemanager.v3.Folder.UpdateFolder]
+ method.
+
+ Only the ``display_name`` field can be changed. All
+ other fields will be ignored. Use the
+ [MoveFolder][google.cloud.resourcemanager.v3.Folders.MoveFolder]
+ method to change the ``parent`` field.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "patch",
+ "uri": "/v3/{folder.name=folders/*}",
+ "body": "folder",
+ },
+ ]
+ request, metadata = self._interceptor.pre_update_folder(request, metadata)
+ pb_request = folders.UpdateFolderRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_update_folder(resp)
+ return resp
+
+ @property
+ def create_folder(
+ self,
+ ) -> Callable[[folders.CreateFolderRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._CreateFolder(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def delete_folder(
+ self,
+ ) -> Callable[[folders.DeleteFolderRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._DeleteFolder(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_folder(self) -> Callable[[folders.GetFolderRequest], folders.Folder]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetFolder(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.GetIamPolicyRequest], policy_pb2.Policy]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetIamPolicy(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_folders(
+ self,
+ ) -> Callable[[folders.ListFoldersRequest], folders.ListFoldersResponse]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListFolders(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def move_folder(
+ self,
+ ) -> Callable[[folders.MoveFolderRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._MoveFolder(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def search_folders(
+ self,
+ ) -> Callable[[folders.SearchFoldersRequest], folders.SearchFoldersResponse]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._SearchFolders(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.SetIamPolicyRequest], policy_pb2.Policy]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._SetIamPolicy(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ iam_policy_pb2.TestIamPermissionsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._TestIamPermissions(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def undelete_folder(
+ self,
+ ) -> Callable[[folders.UndeleteFolderRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._UndeleteFolder(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def update_folder(
+ self,
+ ) -> Callable[[folders.UpdateFolderRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._UpdateFolder(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_operation(self):
+ return self._GetOperation(self._session, self._host, self._interceptor) # type: ignore
+
+ class _GetOperation(FoldersRestStub):
+ def __call__(
+ self,
+ request: operations_pb2.GetOperationRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+
+ r"""Call the get operation method over HTTP.
+
+ Args:
+ request (operations_pb2.GetOperationRequest):
+ The request object for GetOperation method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ operations_pb2.Operation: Response from GetOperation method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/{name=operations/**}",
+ },
+ ]
+
+ request, metadata = self._interceptor.pre_get_operation(request, metadata)
+ request_kwargs = json_format.MessageToDict(request)
+ transcoded_request = path_template.transcode(http_options, **request_kwargs)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(json.dumps(transcoded_request["query_params"]))
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ resp = operations_pb2.Operation()
+ resp = json_format.Parse(response.content.decode("utf-8"), resp)
+ resp = self._interceptor.post_get_operation(resp)
+ return resp
+
+ @property
+ def kind(self) -> str:
+ return "rest"
+
+ def close(self):
+ self._session.close()
+
+
+__all__ = ("FoldersRestTransport",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/__init__.py
@@ -0,0 +1,22 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .async_client import OrganizationsAsyncClient
+from .client import OrganizationsClient
+
+__all__ = (
+ "OrganizationsClient",
+ "OrganizationsAsyncClient",
+)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/async_client.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/async_client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/async_client.py
@@ -0,0 +1,1012 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import functools
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+)
+
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.api_core.client_options import ClientOptions
+from google.auth import credentials as ga_credentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.services.organizations import pagers
+from google.cloud.resourcemanager_v3.types import organizations
+
+from .client import OrganizationsClient
+from .transports.base import DEFAULT_CLIENT_INFO, OrganizationsTransport
+from .transports.grpc_asyncio import OrganizationsGrpcAsyncIOTransport
+
+
+class OrganizationsAsyncClient:
+ """Allows users to manage their organization resources."""
+
+ _client: OrganizationsClient
+
+ DEFAULT_ENDPOINT = OrganizationsClient.DEFAULT_ENDPOINT
+ DEFAULT_MTLS_ENDPOINT = OrganizationsClient.DEFAULT_MTLS_ENDPOINT
+
+ organization_path = staticmethod(OrganizationsClient.organization_path)
+ parse_organization_path = staticmethod(OrganizationsClient.parse_organization_path)
+ common_billing_account_path = staticmethod(
+ OrganizationsClient.common_billing_account_path
+ )
+ parse_common_billing_account_path = staticmethod(
+ OrganizationsClient.parse_common_billing_account_path
+ )
+ common_folder_path = staticmethod(OrganizationsClient.common_folder_path)
+ parse_common_folder_path = staticmethod(
+ OrganizationsClient.parse_common_folder_path
+ )
+ common_organization_path = staticmethod(
+ OrganizationsClient.common_organization_path
+ )
+ parse_common_organization_path = staticmethod(
+ OrganizationsClient.parse_common_organization_path
+ )
+ common_project_path = staticmethod(OrganizationsClient.common_project_path)
+ parse_common_project_path = staticmethod(
+ OrganizationsClient.parse_common_project_path
+ )
+ common_location_path = staticmethod(OrganizationsClient.common_location_path)
+ parse_common_location_path = staticmethod(
+ OrganizationsClient.parse_common_location_path
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ OrganizationsAsyncClient: The constructed client.
+ """
+ return OrganizationsClient.from_service_account_info.__func__(OrganizationsAsyncClient, info, *args, **kwargs) # type: ignore
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ OrganizationsAsyncClient: The constructed client.
+ """
+ return OrganizationsClient.from_service_account_file.__func__(OrganizationsAsyncClient, filename, *args, **kwargs) # type: ignore
+
+ from_service_account_json = from_service_account_file
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ return OrganizationsClient.get_mtls_endpoint_and_cert_source(client_options) # type: ignore
+
+ @property
+ def transport(self) -> OrganizationsTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ OrganizationsTransport: The transport used by the client instance.
+ """
+ return self._client.transport
+
+ get_transport_class = functools.partial(
+ type(OrganizationsClient).get_transport_class, type(OrganizationsClient)
+ )
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Union[str, OrganizationsTransport] = "grpc_asyncio",
+ client_options: Optional[ClientOptions] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the organizations client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, ~.OrganizationsTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (ClientOptions): Custom options for the client. It
+ won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+
+ Raises:
+                google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ self._client = OrganizationsClient(
+ credentials=credentials,
+ transport=transport,
+ client_options=client_options,
+ client_info=client_info,
+ )
+
+ async def get_organization(
+ self,
+ request: Optional[Union[organizations.GetOrganizationRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> organizations.Organization:
+ r"""Fetches an organization resource identified by the
+ specified resource name.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_get_organization():
+ # Create a client
+ client = resourcemanager_v3.OrganizationsAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.GetOrganizationRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_organization(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.GetOrganizationRequest, dict]]):
+ The request object. The request sent to the ``GetOrganization`` method. The
+ ``name`` field is required. ``organization_id`` is no
+ longer accepted.
+ name (:class:`str`):
+ Required. The resource name of the Organization to
+ fetch. This is the organization's relative path in the
+ API, formatted as "organizations/[organizationId]". For
+ example, "organizations/1234".
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.types.Organization:
+ The root node in the resource
+ hierarchy to which a particular entity's
+ (a company, for example) resources
+ belong.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = organizations.GetOrganizationRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_organization,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def search_organizations(
+ self,
+ request: Optional[Union[organizations.SearchOrganizationsRequest, dict]] = None,
+ *,
+ query: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.SearchOrganizationsAsyncPager:
+ r"""Searches organization resources that are visible to the user and
+ satisfy the specified filter. This method returns organizations
+ in an unspecified order. New organizations do not necessarily
+ appear at the end of the results, and may take a small amount of
+ time to appear.
+
+ Search will only return organizations on which the user has the
+ permission ``resourcemanager.organizations.get``
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_search_organizations():
+ # Create a client
+ client = resourcemanager_v3.OrganizationsAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.SearchOrganizationsRequest(
+ )
+
+ # Make the request
+ page_result = client.search_organizations(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.SearchOrganizationsRequest, dict]]):
+ The request object. The request sent to the ``SearchOrganizations`` method.
+ query (:class:`str`):
+ Optional. An optional query string used to filter the
+ Organizations to return in the response. Query rules are
+ case-insensitive.
+
+ ::
+
+ | Field | Description |
+ |------------------|--------------------------------------------|
+ | directoryCustomerId, owner.directoryCustomerId | Filters by directory
+ customer id. |
+ | domain | Filters by domain. |
+
+ Organizations may be queried by ``directoryCustomerId``
+ or by ``domain``, where the domain is a G Suite domain,
+ for example:
+
+ - Query ``directorycustomerid:123456789`` returns
+ Organization resources with
+ ``owner.directory_customer_id`` equal to
+ ``123456789``.
+ - Query ``domain:google.com`` returns Organization
+ resources corresponding to the domain ``google.com``.
+
+ This corresponds to the ``query`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.organizations.pagers.SearchOrganizationsAsyncPager:
+ The response returned from the SearchOrganizations
+ method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([query])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = organizations.SearchOrganizationsRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if query is not None:
+ request.query = query
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.search_organizations,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.SearchOrganizationsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.GetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Gets the access control policy for an organization resource. The
+ policy may be empty if no such policy or resource exists. The
+ ``resource`` field should be the organization's resource name,
+ for example: "organizations/123".
+
+ Authorization requires the IAM permission
+ ``resourcemanager.organizations.getIamPolicy`` on the specified
+ organization.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ async def sample_get_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.OrganizationsAsyncClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.GetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = await client.get_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.iam.v1.iam_policy_pb2.GetIamPolicyRequest, dict]]):
+ The request object. Request message for ``GetIamPolicy`` method.
+ resource (:class:`str`):
+ REQUIRED: The resource for which the
+ policy is being requested. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+ **JSON example:**
+
+ {
+ "bindings": [
+ {
+ "role":
+ "roles/resourcemanager.organizationAdmin",
+ "members": [ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+
+ }, { "role":
+ "roles/resourcemanager.organizationViewer",
+ "members": [ "user:eve@example.com" ],
+ "condition": { "title": "expirable access",
+ "description": "Does not grant access after
+ Sep 2020", "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')", } }
+
+ ], "etag": "BwWWja0YfJA=", "version": 3
+
+ }
+
+ **YAML example:**
+
+ bindings: - members: - user:\ mike@example.com -
+ group:\ admins@example.com - domain:google.com -
+ serviceAccount:\ my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin -
+ members: - user:\ eve@example.com role:
+ roles/resourcemanager.organizationViewer
+ condition: title: expirable access description:
+ Does not grant access after Sep 2020 expression:
+ request.time <
+ timestamp('2020-10-01T00:00:00.000Z') etag:
+ BwWWja0YfJA= version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = iam_policy_pb2.GetIamPolicyRequest(**request)
+ elif not request:
+ request = iam_policy_pb2.GetIamPolicyRequest(
+ resource=resource,
+ )
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_iam_policy,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def set_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.SetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Sets the access control policy on an organization resource.
+ Replaces any existing policy. The ``resource`` field should be
+ the organization's resource name, for example:
+ "organizations/123".
+
+ Authorization requires the IAM permission
+ ``resourcemanager.organizations.setIamPolicy`` on the specified
+ organization.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ async def sample_set_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.OrganizationsAsyncClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.SetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = await client.set_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.iam.v1.iam_policy_pb2.SetIamPolicyRequest, dict]]):
+ The request object. Request message for ``SetIamPolicy`` method.
+ resource (:class:`str`):
+ REQUIRED: The resource for which the
+ policy is being specified. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+ **JSON example:**
+
+ {
+ "bindings": [
+ {
+ "role":
+ "roles/resourcemanager.organizationAdmin",
+ "members": [ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+
+ }, { "role":
+ "roles/resourcemanager.organizationViewer",
+ "members": [ "user:eve@example.com" ],
+ "condition": { "title": "expirable access",
+ "description": "Does not grant access after
+ Sep 2020", "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')", } }
+
+ ], "etag": "BwWWja0YfJA=", "version": 3
+
+ }
+
+ **YAML example:**
+
+ bindings: - members: - user:\ mike@example.com -
+ group:\ admins@example.com - domain:google.com -
+ serviceAccount:\ my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin -
+ members: - user:\ eve@example.com role:
+ roles/resourcemanager.organizationViewer
+ condition: title: expirable access description:
+ Does not grant access after Sep 2020 expression:
+ request.time <
+ timestamp('2020-10-01T00:00:00.000Z') etag:
+ BwWWja0YfJA= version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = iam_policy_pb2.SetIamPolicyRequest(**request)
+ elif not request:
+ request = iam_policy_pb2.SetIamPolicyRequest(
+ resource=resource,
+ )
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.set_iam_policy,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def test_iam_permissions(
+ self,
+ request: Optional[Union[iam_policy_pb2.TestIamPermissionsRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ permissions: Optional[MutableSequence[str]] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ r"""Returns the permissions that a caller has on the specified
+ organization. The ``resource`` field should be the
+ organization's resource name, for example: "organizations/123".
+
+ There are no permissions required for making this API call.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ async def sample_test_iam_permissions():
+ # Create a client
+ client = resourcemanager_v3.OrganizationsAsyncClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.TestIamPermissionsRequest(
+ resource="resource_value",
+ permissions=['permissions_value1', 'permissions_value2'],
+ )
+
+ # Make the request
+ response = await client.test_iam_permissions(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.iam.v1.iam_policy_pb2.TestIamPermissionsRequest, dict]]):
+ The request object. Request message for ``TestIamPermissions`` method.
+ resource (:class:`str`):
+ REQUIRED: The resource for which the
+ policy detail is being requested. See
+ the operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ permissions (:class:`MutableSequence[str]`):
+ The set of permissions to check for the ``resource``.
+ Permissions with wildcards (such as '*' or 'storage.*')
+ are not allowed. For more information see `IAM
+ Overview <https://cloud.google.com/iam/docs/overview#permissions>`__.
+
+ This corresponds to the ``permissions`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.iam_policy_pb2.TestIamPermissionsResponse:
+ Response message for TestIamPermissions method.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource, permissions])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = iam_policy_pb2.TestIamPermissionsRequest(**request)
+ elif not request:
+ request = iam_policy_pb2.TestIamPermissionsRequest(
+ resource=resource,
+ permissions=permissions,
+ )
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.test_iam_permissions,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_operation(
+ self,
+ request: Optional[operations_pb2.GetOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Gets the latest state of a long-running operation.
+
+ Args:
+ request (:class:`~.operations_pb2.GetOperationRequest`):
+ The request object. Request message for
+ `GetOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.Operation:
+ An ``Operation`` object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.GetOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._client._transport.get_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def __aenter__(self):
+ return self
+
+ async def __aexit__(self, exc_type, exc, tb):
+ await self.transport.close()
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("OrganizationsAsyncClient",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/client.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/client.py
@@ -0,0 +1,1213 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import os
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+ cast,
+)
+
+from google.api_core import client_options as client_options_lib
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.exceptions import MutualTLSChannelError # type: ignore
+from google.auth.transport import mtls # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.services.organizations import pagers
+from google.cloud.resourcemanager_v3.types import organizations
+
+from .transports.base import DEFAULT_CLIENT_INFO, OrganizationsTransport
+from .transports.grpc import OrganizationsGrpcTransport
+from .transports.grpc_asyncio import OrganizationsGrpcAsyncIOTransport
+from .transports.rest import OrganizationsRestTransport
+
+
+class OrganizationsClientMeta(type):
+ """Metaclass for the Organizations client.
+
+ This provides class-level methods for building and retrieving
+ support objects (e.g. transport) without polluting the client instance
+ objects.
+ """
+
+ _transport_registry = OrderedDict() # type: Dict[str, Type[OrganizationsTransport]]
+ _transport_registry["grpc"] = OrganizationsGrpcTransport
+ _transport_registry["grpc_asyncio"] = OrganizationsGrpcAsyncIOTransport
+ _transport_registry["rest"] = OrganizationsRestTransport
+
+ def get_transport_class(
+ cls,
+ label: Optional[str] = None,
+ ) -> Type[OrganizationsTransport]:
+ """Returns an appropriate transport class.
+
+ Args:
+ label: The name of the desired transport. If none is
+ provided, then the first transport in the registry is used.
+
+ Returns:
+ The transport class to use.
+ """
+ # If a specific transport is requested, return that one.
+ if label:
+ return cls._transport_registry[label]
+
+ # No transport is requested; return the default (that is, the first one
+ # in the dictionary).
+ return next(iter(cls._transport_registry.values()))
+
+
+class OrganizationsClient(metaclass=OrganizationsClientMeta):
+ """Allows users to manage their organization resources."""
+
+ @staticmethod
+ def _get_default_mtls_endpoint(api_endpoint):
+ """Converts api endpoint to mTLS endpoint.
+
+ Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to
+ "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively.
+ Args:
+ api_endpoint (Optional[str]): the api endpoint to convert.
+ Returns:
+ str: converted mTLS api endpoint.
+ """
+ if not api_endpoint:
+ return api_endpoint
+
+ mtls_endpoint_re = re.compile(
+ r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?"
+ )
+
+ m = mtls_endpoint_re.match(api_endpoint)
+ name, mtls, sandbox, googledomain = m.groups()
+ if mtls or not googledomain:
+ return api_endpoint
+
+ if sandbox:
+ return api_endpoint.replace(
+ "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
+ )
+
+ return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")
+
+ DEFAULT_ENDPOINT = "cloudresourcemanager.googleapis.com"
+ DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__( # type: ignore
+ DEFAULT_ENDPOINT
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ OrganizationsClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_info(info)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ OrganizationsClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_file(filename)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ from_service_account_json = from_service_account_file
+
+ @property
+ def transport(self) -> OrganizationsTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ OrganizationsTransport: The transport used by the client
+ instance.
+ """
+ return self._transport
+
+ @staticmethod
+ def organization_path(
+ organization: str,
+ ) -> str:
+ """Returns a fully-qualified organization string."""
+ return "organizations/{organization}".format(
+ organization=organization,
+ )
+
+ @staticmethod
+ def parse_organization_path(path: str) -> Dict[str, str]:
+        """Parses an organization path into its component segments."""
+ m = re.match(r"^organizations/(?P<organization>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_billing_account_path(
+ billing_account: str,
+ ) -> str:
+ """Returns a fully-qualified billing_account string."""
+ return "billingAccounts/{billing_account}".format(
+ billing_account=billing_account,
+ )
+
+ @staticmethod
+ def parse_common_billing_account_path(path: str) -> Dict[str, str]:
+ """Parse a billing_account path into its component segments."""
+ m = re.match(r"^billingAccounts/(?P<billing_account>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_folder_path(
+ folder: str,
+ ) -> str:
+ """Returns a fully-qualified folder string."""
+ return "folders/{folder}".format(
+ folder=folder,
+ )
+
+ @staticmethod
+ def parse_common_folder_path(path: str) -> Dict[str, str]:
+ """Parse a folder path into its component segments."""
+ m = re.match(r"^folders/(?P<folder>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_organization_path(
+ organization: str,
+ ) -> str:
+ """Returns a fully-qualified organization string."""
+ return "organizations/{organization}".format(
+ organization=organization,
+ )
+
+ @staticmethod
+ def parse_common_organization_path(path: str) -> Dict[str, str]:
+        """Parse an organization path into its component segments."""
+ m = re.match(r"^organizations/(?P<organization>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_project_path(
+ project: str,
+ ) -> str:
+ """Returns a fully-qualified project string."""
+ return "projects/{project}".format(
+ project=project,
+ )
+
+ @staticmethod
+ def parse_common_project_path(path: str) -> Dict[str, str]:
+ """Parse a project path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_location_path(
+ project: str,
+ location: str,
+ ) -> str:
+ """Returns a fully-qualified location string."""
+ return "projects/{project}/locations/{location}".format(
+ project=project,
+ location=location,
+ )
+
+ @staticmethod
+ def parse_common_location_path(path: str) -> Dict[str, str]:
+ """Parse a location path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[client_options_lib.ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ use_client_cert = os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false")
+ use_mtls_endpoint = os.getenv("GOOGLE_API_USE_MTLS_ENDPOINT", "auto")
+ if use_client_cert not in ("true", "false"):
+ raise ValueError(
+ "Environment variable `GOOGLE_API_USE_CLIENT_CERTIFICATE` must be either `true` or `false`"
+ )
+ if use_mtls_endpoint not in ("auto", "never", "always"):
+ raise MutualTLSChannelError(
+ "Environment variable `GOOGLE_API_USE_MTLS_ENDPOINT` must be `never`, `auto` or `always`"
+ )
+
+ # Figure out the client cert source to use.
+ client_cert_source = None
+ if use_client_cert == "true":
+ if client_options.client_cert_source:
+ client_cert_source = client_options.client_cert_source
+ elif mtls.has_default_client_cert_source():
+ client_cert_source = mtls.default_client_cert_source()
+
+ # Figure out which api endpoint to use.
+ if client_options.api_endpoint is not None:
+ api_endpoint = client_options.api_endpoint
+ elif use_mtls_endpoint == "always" or (
+ use_mtls_endpoint == "auto" and client_cert_source
+ ):
+ api_endpoint = cls.DEFAULT_MTLS_ENDPOINT
+ else:
+ api_endpoint = cls.DEFAULT_ENDPOINT
+
+ return api_endpoint, client_cert_source
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Optional[Union[str, OrganizationsTransport]] = None,
+ client_options: Optional[Union[client_options_lib.ClientOptions, dict]] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the organizations client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, OrganizationsTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (Optional[Union[google.api_core.client_options.ClientOptions, dict]]): Custom options for the
+ client. It won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ if isinstance(client_options, dict):
+ client_options = client_options_lib.from_dict(client_options)
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ client_options = cast(client_options_lib.ClientOptions, client_options)
+
+ api_endpoint, client_cert_source_func = self.get_mtls_endpoint_and_cert_source(
+ client_options
+ )
+
+ api_key_value = getattr(client_options, "api_key", None)
+ if api_key_value and credentials:
+ raise ValueError(
+ "client_options.api_key and credentials are mutually exclusive"
+ )
+
+ # Save or instantiate the transport.
+ # Ordinarily, we provide the transport, but allowing a custom transport
+ # instance provides an extensibility point for unusual situations.
+ if isinstance(transport, OrganizationsTransport):
+ # transport is a OrganizationsTransport instance.
+ if credentials or client_options.credentials_file or api_key_value:
+ raise ValueError(
+ "When providing a transport instance, "
+ "provide its credentials directly."
+ )
+ if client_options.scopes:
+ raise ValueError(
+ "When providing a transport instance, provide its scopes "
+ "directly."
+ )
+ self._transport = transport
+ else:
+ import google.auth._default # type: ignore
+
+ if api_key_value and hasattr(
+ google.auth._default, "get_api_key_credentials"
+ ):
+ credentials = google.auth._default.get_api_key_credentials(
+ api_key_value
+ )
+
+ Transport = type(self).get_transport_class(transport)
+ self._transport = Transport(
+ credentials=credentials,
+ credentials_file=client_options.credentials_file,
+ host=api_endpoint,
+ scopes=client_options.scopes,
+ client_cert_source_for_mtls=client_cert_source_func,
+ quota_project_id=client_options.quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=True,
+ api_audience=client_options.api_audience,
+ )
+
+ def get_organization(
+ self,
+ request: Optional[Union[organizations.GetOrganizationRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> organizations.Organization:
+ r"""Fetches an organization resource identified by the
+ specified resource name.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_get_organization():
+ # Create a client
+ client = resourcemanager_v3.OrganizationsClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.GetOrganizationRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_organization(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.GetOrganizationRequest, dict]):
+ The request object. The request sent to the ``GetOrganization`` method. The
+ ``name`` field is required. ``organization_id`` is no
+ longer accepted.
+ name (str):
+ Required. The resource name of the Organization to
+ fetch. This is the organization's relative path in the
+ API, formatted as "organizations/[organizationId]". For
+ example, "organizations/1234".
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.types.Organization:
+ The root node in the resource
+ hierarchy to which a particular entity's
+ (a company, for example) resources
+ belong.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a organizations.GetOrganizationRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, organizations.GetOrganizationRequest):
+ request = organizations.GetOrganizationRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_organization]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def search_organizations(
+ self,
+ request: Optional[Union[organizations.SearchOrganizationsRequest, dict]] = None,
+ *,
+ query: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.SearchOrganizationsPager:
+ r"""Searches organization resources that are visible to the user and
+ satisfy the specified filter. This method returns organizations
+ in an unspecified order. New organizations do not necessarily
+ appear at the end of the results, and may take a small amount of
+ time to appear.
+
+ Search will only return organizations on which the user has the
+ permission ``resourcemanager.organizations.get``
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_search_organizations():
+ # Create a client
+ client = resourcemanager_v3.OrganizationsClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.SearchOrganizationsRequest(
+ )
+
+ # Make the request
+ page_result = client.search_organizations(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.SearchOrganizationsRequest, dict]):
+ The request object. The request sent to the ``SearchOrganizations`` method.
+ query (str):
+ Optional. An optional query string used to filter the
+ Organizations to return in the response. Query rules are
+ case-insensitive.
+
+ ::
+
+ | Field | Description |
+ |------------------|--------------------------------------------|
+                    | directoryCustomerId, owner.directoryCustomerId | Filters by directory customer id. |
+                    | domain           | Filters by domain. |
+
+ Organizations may be queried by ``directoryCustomerId``
+ or by ``domain``, where the domain is a G Suite domain,
+ for example:
+
+ - Query ``directorycustomerid:123456789`` returns
+ Organization resources with
+ ``owner.directory_customer_id`` equal to
+ ``123456789``.
+ - Query ``domain:google.com`` returns Organization
+ resources corresponding to the domain ``google.com``.
+
+ This corresponds to the ``query`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.organizations.pagers.SearchOrganizationsPager:
+ The response returned from the SearchOrganizations
+ method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([query])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a organizations.SearchOrganizationsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, organizations.SearchOrganizationsRequest):
+ request = organizations.SearchOrganizationsRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if query is not None:
+ request.query = query
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.search_organizations]
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.SearchOrganizationsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.GetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Gets the access control policy for an organization resource. The
+ policy may be empty if no such policy or resource exists. The
+ ``resource`` field should be the organization's resource name,
+ for example: "organizations/123".
+
+ Authorization requires the IAM permission
+ ``resourcemanager.organizations.getIamPolicy`` on the specified
+ organization.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ def sample_get_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.OrganizationsClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.GetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = client.get_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.iam.v1.iam_policy_pb2.GetIamPolicyRequest, dict]):
+ The request object. Request message for ``GetIamPolicy`` method.
+ resource (str):
+ REQUIRED: The resource for which the
+ policy is being requested. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+ **JSON example:**
+
+                    {
+                      "bindings": [
+                        {
+                          "role": "roles/resourcemanager.organizationAdmin",
+                          "members": [
+                            "user:mike@example.com",
+                            "group:admins@example.com",
+                            "domain:google.com",
+                            "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+                          ]
+                        },
+                        {
+                          "role": "roles/resourcemanager.organizationViewer",
+                          "members": ["user:eve@example.com"],
+                          "condition": {
+                            "title": "expirable access",
+                            "description": "Does not grant access after Sep 2020",
+                            "expression": "request.time < timestamp('2020-10-01T00:00:00.000Z')"
+                          }
+                        }
+                      ],
+                      "etag": "BwWWja0YfJA=",
+                      "version": 3
+                    }
+
+ **YAML example:**
+
+                    bindings:
+                    - members:
+                      - user:mike@example.com
+                      - group:admins@example.com
+                      - domain:google.com
+                      - serviceAccount:my-project-id@appspot.gserviceaccount.com
+                      role: roles/resourcemanager.organizationAdmin
+                    - members:
+                      - user:eve@example.com
+                      role: roles/resourcemanager.organizationViewer
+                      condition:
+                        title: expirable access
+                        description: Does not grant access after Sep 2020
+                        expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+                    etag: BwWWja0YfJA=
+                    version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ if isinstance(request, dict):
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ request = iam_policy_pb2.GetIamPolicyRequest(**request)
+ elif not request:
+ # Null request, just make one.
+ request = iam_policy_pb2.GetIamPolicyRequest()
+ if resource is not None:
+ request.resource = resource
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_iam_policy]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def set_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.SetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Sets the access control policy on an organization resource.
+ Replaces any existing policy. The ``resource`` field should be
+ the organization's resource name, for example:
+ "organizations/123".
+
+ Authorization requires the IAM permission
+ ``resourcemanager.organizations.setIamPolicy`` on the specified
+ organization.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ def sample_set_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.OrganizationsClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.SetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = client.set_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.iam.v1.iam_policy_pb2.SetIamPolicyRequest, dict]):
+ The request object. Request message for ``SetIamPolicy`` method.
+ resource (str):
+ REQUIRED: The resource for which the
+ policy is being specified. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+ **JSON example:**
+
+                    {
+                      "bindings": [
+                        {
+                          "role": "roles/resourcemanager.organizationAdmin",
+                          "members": [
+                            "user:mike@example.com",
+                            "group:admins@example.com",
+                            "domain:google.com",
+                            "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+                          ]
+                        },
+                        {
+                          "role": "roles/resourcemanager.organizationViewer",
+                          "members": ["user:eve@example.com"],
+                          "condition": {
+                            "title": "expirable access",
+                            "description": "Does not grant access after Sep 2020",
+                            "expression": "request.time < timestamp('2020-10-01T00:00:00.000Z')"
+                          }
+                        }
+                      ],
+                      "etag": "BwWWja0YfJA=",
+                      "version": 3
+                    }
+
+ **YAML example:**
+
+                    bindings:
+                    - members:
+                      - user:mike@example.com
+                      - group:admins@example.com
+                      - domain:google.com
+                      - serviceAccount:my-project-id@appspot.gserviceaccount.com
+                      role: roles/resourcemanager.organizationAdmin
+                    - members:
+                      - user:eve@example.com
+                      role: roles/resourcemanager.organizationViewer
+                      condition:
+                        title: expirable access
+                        description: Does not grant access after Sep 2020
+                        expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+                    etag: BwWWja0YfJA=
+                    version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ if isinstance(request, dict):
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ request = iam_policy_pb2.SetIamPolicyRequest(**request)
+ elif not request:
+ # Null request, just make one.
+ request = iam_policy_pb2.SetIamPolicyRequest()
+ if resource is not None:
+ request.resource = resource
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.set_iam_policy]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def test_iam_permissions(
+ self,
+ request: Optional[Union[iam_policy_pb2.TestIamPermissionsRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ permissions: Optional[MutableSequence[str]] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ r"""Returns the permissions that a caller has on the specified
+ organization. The ``resource`` field should be the
+ organization's resource name, for example: "organizations/123".
+
+ There are no permissions required for making this API call.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ def sample_test_iam_permissions():
+ # Create a client
+ client = resourcemanager_v3.OrganizationsClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.TestIamPermissionsRequest(
+ resource="resource_value",
+ permissions=['permissions_value1', 'permissions_value2'],
+ )
+
+ # Make the request
+ response = client.test_iam_permissions(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.iam.v1.iam_policy_pb2.TestIamPermissionsRequest, dict]):
+ The request object. Request message for ``TestIamPermissions`` method.
+ resource (str):
+ REQUIRED: The resource for which the
+ policy detail is being requested. See
+ the operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ permissions (MutableSequence[str]):
+ The set of permissions to check for the ``resource``.
+ Permissions with wildcards (such as '*' or 'storage.*')
+ are not allowed. For more information see `IAM
+ Overview <https://cloud.google.com/iam/docs/overview#permissions>`__.
+
+ This corresponds to the ``permissions`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.iam_policy_pb2.TestIamPermissionsResponse:
+ Response message for TestIamPermissions method.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource, permissions])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ if isinstance(request, dict):
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ request = iam_policy_pb2.TestIamPermissionsRequest(**request)
+ elif not request:
+ # Null request, just make one.
+ request = iam_policy_pb2.TestIamPermissionsRequest()
+ if resource is not None:
+ request.resource = resource
+ if permissions:
+ request.permissions.extend(permissions)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.test_iam_permissions]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def __enter__(self) -> "OrganizationsClient":
+ return self
+
+ def __exit__(self, type, value, traceback):
+ """Releases underlying transport's resources.
+
+ .. warning::
+ ONLY use as a context manager if the transport is NOT shared
+ with other clients! Exiting the with block will CLOSE the transport
+ and may cause errors in other clients!
+ """
+ self.transport.close()
+
+ def get_operation(
+ self,
+ request: Optional[operations_pb2.GetOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Gets the latest state of a long-running operation.
+
+ Args:
+ request (:class:`~.operations_pb2.GetOperationRequest`):
+ The request object. Request message for
+ `GetOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.Operation:
+ An ``Operation`` object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.GetOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._transport.get_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("OrganizationsClient",)
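Reviewer note: every flattened method in this client follows the same contract — callers pass either a full `request` object or individual keyword fields, never both. A minimal standalone sketch of that guard (plain Python; `GetOrganizationRequest` here is an illustrative stand-in, not the real proto type):

```python
from typing import Optional


class GetOrganizationRequest:
    """Stand-in for the proto request type (illustrative only)."""

    def __init__(self, name: str = ""):
        self.name = name


def get_organization(
    request: Optional[GetOrganizationRequest] = None,
    *,
    name: Optional[str] = None,
) -> GetOrganizationRequest:
    # Mirror the generated guard: `request` and flattened fields
    # are mutually exclusive.
    if request is not None and any([name]):
        raise ValueError(
            "If the `request` argument is set, then none of "
            "the individual field arguments should be set."
        )
    if request is None:
        request = GetOrganizationRequest()
    # Copy any flattened fields onto the request.
    if name is not None:
        request.name = name
    return request


print(get_organization(name="organizations/123").name)  # -> organizations/123
```

This is why the docstrings repeat "if ``request`` is provided, this should not be set": the guard rejects mixed calls before any RPC is made.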
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/pagers.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/pagers.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/pagers.py
@@ -0,0 +1,155 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import (
+ Any,
+ AsyncIterator,
+ Awaitable,
+ Callable,
+ Iterator,
+ Optional,
+ Sequence,
+ Tuple,
+)
+
+from google.cloud.resourcemanager_v3.types import organizations
+
+
+class SearchOrganizationsPager:
+ """A pager for iterating through ``search_organizations`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.SearchOrganizationsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``organizations`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``SearchOrganizations`` requests and continue to iterate
+ through the ``organizations`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.SearchOrganizationsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., organizations.SearchOrganizationsResponse],
+ request: organizations.SearchOrganizationsRequest,
+ response: organizations.SearchOrganizationsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.SearchOrganizationsRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.SearchOrganizationsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = organizations.SearchOrganizationsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[organizations.SearchOrganizationsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[organizations.Organization]:
+ for page in self.pages:
+ yield from page.organizations
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class SearchOrganizationsAsyncPager:
+ """A pager for iterating through ``search_organizations`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.SearchOrganizationsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``organizations`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``SearchOrganizations`` requests and continue to iterate
+ through the ``organizations`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.SearchOrganizationsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[organizations.SearchOrganizationsResponse]],
+ request: organizations.SearchOrganizationsRequest,
+ response: organizations.SearchOrganizationsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.SearchOrganizationsRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.SearchOrganizationsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = organizations.SearchOrganizationsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[organizations.SearchOrganizationsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[organizations.Organization]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.organizations:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
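Reviewer note: the pagers above resolve additional pages lazily by re-issuing the request with each `next_page_token`. A self-contained sketch of that loop, with a fake two-page backend standing in for the RPC (no Google Cloud dependency; all names here are illustrative):

```python
from typing import Any, Callable, Dict, Iterator, List


class Page:
    """Stand-in for a SearchOrganizationsResponse-style page."""

    def __init__(self, items: List[str], next_page_token: str = ""):
        self.items = items
        self.next_page_token = next_page_token


def fake_search(request: Dict[str, Any]) -> Page:
    # Hypothetical backend holding two pages keyed by page token.
    pages = {
        "": Page(["org-1", "org-2"], next_page_token="t1"),
        "t1": Page(["org-3"]),
    }
    return pages[request.get("page_token", "")]


class Pager:
    """Wraps the first response, then follows next_page_token until empty."""

    def __init__(
        self,
        method: Callable[[Dict[str, Any]], Page],
        request: Dict[str, Any],
        response: Page,
    ):
        self._method = method
        self._request = dict(request)
        self._response = response

    @property
    def pages(self) -> Iterator[Page]:
        yield self._response
        while self._response.next_page_token:
            self._request["page_token"] = self._response.next_page_token
            self._response = self._method(self._request)
            yield self._response

    def __iter__(self) -> Iterator[str]:
        for page in self.pages:
            yield from page.items


request: Dict[str, Any] = {}
pager = Pager(fake_search, request, fake_search(request))
print(list(pager))  # -> ['org-1', 'org-2', 'org-3']
```

The async variant is the same loop with `await`/`async for`; in both cases only the most recent response is retained, which is why attribute lookups on the pager reflect the last page fetched.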
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/transports/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/transports/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/transports/__init__.py
@@ -0,0 +1,36 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+from typing import Dict, Type
+
+from .base import OrganizationsTransport
+from .grpc import OrganizationsGrpcTransport
+from .grpc_asyncio import OrganizationsGrpcAsyncIOTransport
+from .rest import OrganizationsRestInterceptor, OrganizationsRestTransport
+
+# Compile a registry of transports.
+_transport_registry = OrderedDict() # type: Dict[str, Type[OrganizationsTransport]]
+_transport_registry["grpc"] = OrganizationsGrpcTransport
+_transport_registry["grpc_asyncio"] = OrganizationsGrpcAsyncIOTransport
+_transport_registry["rest"] = OrganizationsRestTransport
+
+__all__ = (
+ "OrganizationsTransport",
+ "OrganizationsGrpcTransport",
+ "OrganizationsGrpcAsyncIOTransport",
+ "OrganizationsRestTransport",
+ "OrganizationsRestInterceptor",
+)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/transports/base.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/transports/base.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/transports/base.py
@@ -0,0 +1,250 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import abc
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Union
+
+import google.api_core
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+from google.cloud.resourcemanager_v3.types import organizations
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+class OrganizationsTransport(abc.ABC):
+ """Abstract transport class for Organizations."""
+
+ AUTH_SCOPES = (
+ "https://www.googleapis.com/auth/cloud-platform",
+ "https://www.googleapis.com/auth/cloud-platform.read-only",
+ )
+
+ DEFAULT_HOST: str = "cloudresourcemanager.googleapis.com"
+
+ def __init__(
+ self,
+ *,
+ host: str = DEFAULT_HOST,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ **kwargs,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+ scopes (Optional[Sequence[str]]): A list of scopes.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+            always_use_jwt_access (Optional[bool]): Whether self-signed JWT should
+ be used for service account credentials.
+ """
+
+ scopes_kwargs = {"scopes": scopes, "default_scopes": self.AUTH_SCOPES}
+
+ # Save the scopes.
+ self._scopes = scopes
+
+ # If no credentials are provided, then determine the appropriate
+ # defaults.
+ if credentials and credentials_file:
+ raise core_exceptions.DuplicateCredentialArgs(
+ "'credentials_file' and 'credentials' are mutually exclusive"
+ )
+
+ if credentials_file is not None:
+ credentials, _ = google.auth.load_credentials_from_file(
+ credentials_file, **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ elif credentials is None:
+ credentials, _ = google.auth.default(
+ **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ # Don't apply audience if the credentials file passed from user.
+ if hasattr(credentials, "with_gdch_audience"):
+ credentials = credentials.with_gdch_audience(
+ api_audience if api_audience else host
+ )
+
+ # If the credentials are service account credentials, then always try to use self signed JWT.
+ if (
+ always_use_jwt_access
+ and isinstance(credentials, service_account.Credentials)
+ and hasattr(service_account.Credentials, "with_always_use_jwt_access")
+ ):
+ credentials = credentials.with_always_use_jwt_access(True)
+
+ # Save the credentials.
+ self._credentials = credentials
+
+ # Save the hostname. Default to port 443 (HTTPS) if none is specified.
+ if ":" not in host:
+ host += ":443"
+ self._host = host
+
+ def _prep_wrapped_messages(self, client_info):
+ # Precompute the wrapped methods.
+ self._wrapped_methods = {
+ self.get_organization: gapic_v1.method.wrap_method(
+ self.get_organization,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.search_organizations: gapic_v1.method.wrap_method(
+ self.search_organizations,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.get_iam_policy: gapic_v1.method.wrap_method(
+ self.get_iam_policy,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.set_iam_policy: gapic_v1.method.wrap_method(
+ self.set_iam_policy,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.test_iam_permissions: gapic_v1.method.wrap_method(
+ self.test_iam_permissions,
+ default_timeout=None,
+ client_info=client_info,
+ ),
+ }
+
+ def close(self):
+ """Closes resources associated with the transport.
+
+ .. warning::
+ Only call this method if the transport is NOT shared
+ with other clients - this may cause errors in other clients!
+ """
+ raise NotImplementedError()
+
+ @property
+ def get_organization(
+ self,
+ ) -> Callable[
+ [organizations.GetOrganizationRequest],
+ Union[organizations.Organization, Awaitable[organizations.Organization]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def search_organizations(
+ self,
+ ) -> Callable[
+ [organizations.SearchOrganizationsRequest],
+ Union[
+ organizations.SearchOrganizationsResponse,
+ Awaitable[organizations.SearchOrganizationsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.GetIamPolicyRequest],
+ Union[policy_pb2.Policy, Awaitable[policy_pb2.Policy]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.SetIamPolicyRequest],
+ Union[policy_pb2.Policy, Awaitable[policy_pb2.Policy]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ Union[
+ iam_policy_pb2.TestIamPermissionsResponse,
+ Awaitable[iam_policy_pb2.TestIamPermissionsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[
+ [operations_pb2.GetOperationRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def kind(self) -> str:
+ raise NotImplementedError()
+
+
+__all__ = ("OrganizationsTransport",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/transports/grpc.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/transports/grpc.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/transports/grpc.py
@@ -0,0 +1,421 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2
+import grpc # type: ignore
+
+from google.cloud.resourcemanager_v3.types import organizations
+
+from .base import DEFAULT_CLIENT_INFO, OrganizationsTransport
+
+
+class OrganizationsGrpcTransport(OrganizationsTransport):
+ """gRPC backend transport for Organizations.
+
+ Allows users to manage their organization resources.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _stubs: Dict[str, Callable]
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[grpc.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): A list of scopes. This argument is
+                ignored if ``channel`` is provided.
+ channel (Optional[grpc.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+            always_use_jwt_access (Optional[bool]): Whether self-signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> grpc.Channel:
+ """Create and return a gRPC channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ grpc.Channel: A gRPC channel object.
+
+ Raises:
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+
+ return grpc_helpers.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ @property
+ def grpc_channel(self) -> grpc.Channel:
+ """Return the channel designed to connect to this service."""
+ return self._grpc_channel
+
+ @property
+ def get_organization(
+ self,
+ ) -> Callable[[organizations.GetOrganizationRequest], organizations.Organization]:
+ r"""Return a callable for the get organization method over gRPC.
+
+ Fetches an organization resource identified by the
+ specified resource name.
+
+ Returns:
+ Callable[[~.GetOrganizationRequest],
+ ~.Organization]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_organization" not in self._stubs:
+ self._stubs["get_organization"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Organizations/GetOrganization",
+ request_serializer=organizations.GetOrganizationRequest.serialize,
+ response_deserializer=organizations.Organization.deserialize,
+ )
+ return self._stubs["get_organization"]
+
+ @property
+ def search_organizations(
+ self,
+ ) -> Callable[
+ [organizations.SearchOrganizationsRequest],
+ organizations.SearchOrganizationsResponse,
+ ]:
+ r"""Return a callable for the search organizations method over gRPC.
+
+ Searches organization resources that are visible to the user and
+ satisfy the specified filter. This method returns organizations
+ in an unspecified order. New organizations do not necessarily
+ appear at the end of the results, and may take a small amount of
+ time to appear.
+
+ Search will only return organizations on which the user has the
+ permission ``resourcemanager.organizations.get``
+
+ Returns:
+ Callable[[~.SearchOrganizationsRequest],
+ ~.SearchOrganizationsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "search_organizations" not in self._stubs:
+ self._stubs["search_organizations"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Organizations/SearchOrganizations",
+ request_serializer=organizations.SearchOrganizationsRequest.serialize,
+ response_deserializer=organizations.SearchOrganizationsResponse.deserialize,
+ )
+ return self._stubs["search_organizations"]
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.GetIamPolicyRequest], policy_pb2.Policy]:
+ r"""Return a callable for the get iam policy method over gRPC.
+
+ Gets the access control policy for an organization resource. The
+ policy may be empty if no such policy or resource exists. The
+ ``resource`` field should be the organization's resource name,
+ for example: "organizations/123".
+
+ Authorization requires the IAM permission
+ ``resourcemanager.organizations.getIamPolicy`` on the specified
+ organization.
+
+ Returns:
+ Callable[[~.GetIamPolicyRequest],
+ ~.Policy]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_iam_policy" not in self._stubs:
+ self._stubs["get_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Organizations/GetIamPolicy",
+ request_serializer=iam_policy_pb2.GetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["get_iam_policy"]
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.SetIamPolicyRequest], policy_pb2.Policy]:
+ r"""Return a callable for the set iam policy method over gRPC.
+
+ Sets the access control policy on an organization resource.
+ Replaces any existing policy. The ``resource`` field should be
+ the organization's resource name, for example:
+ "organizations/123".
+
+ Authorization requires the IAM permission
+ ``resourcemanager.organizations.setIamPolicy`` on the specified
+ organization.
+
+ Returns:
+ Callable[[~.SetIamPolicyRequest],
+ ~.Policy]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "set_iam_policy" not in self._stubs:
+ self._stubs["set_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Organizations/SetIamPolicy",
+ request_serializer=iam_policy_pb2.SetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["set_iam_policy"]
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ iam_policy_pb2.TestIamPermissionsResponse,
+ ]:
+ r"""Return a callable for the test iam permissions method over gRPC.
+
+ Returns the permissions that a caller has on the specified
+ organization. The ``resource`` field should be the
+ organization's resource name, for example: "organizations/123".
+
+ There are no permissions required for making this API call.
+
+ Returns:
+ Callable[[~.TestIamPermissionsRequest],
+ ~.TestIamPermissionsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "test_iam_permissions" not in self._stubs:
+ self._stubs["test_iam_permissions"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Organizations/TestIamPermissions",
+ request_serializer=iam_policy_pb2.TestIamPermissionsRequest.SerializeToString,
+ response_deserializer=iam_policy_pb2.TestIamPermissionsResponse.FromString,
+ )
+ return self._stubs["test_iam_permissions"]
+
+ def close(self):
+ self.grpc_channel.close()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]:
+ r"""Return a callable for the get_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_operation" not in self._stubs:
+ self._stubs["get_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/GetOperation",
+ request_serializer=operations_pb2.GetOperationRequest.SerializeToString,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["get_operation"]
+
+ @property
+ def kind(self) -> str:
+ return "grpc"
+
+
+__all__ = ("OrganizationsGrpcTransport",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/transports/grpc_asyncio.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/transports/grpc_asyncio.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/transports/grpc_asyncio.py
@@ -0,0 +1,422 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers_async
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2
+import grpc # type: ignore
+from grpc.experimental import aio # type: ignore
+
+from google.cloud.resourcemanager_v3.types import organizations
+
+from .base import DEFAULT_CLIENT_INFO, OrganizationsTransport
+from .grpc import OrganizationsGrpcTransport
+
+
+class OrganizationsGrpcAsyncIOTransport(OrganizationsTransport):
+ """gRPC AsyncIO backend transport for Organizations.
+
+ Allows users to manage their organization resources.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _grpc_channel: aio.Channel
+ _stubs: Dict[str, Callable] = {}
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> aio.Channel:
+ """Create and return a gRPC AsyncIO channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ aio.Channel: A gRPC AsyncIO channel object.
+ """
+
+ return grpc_helpers_async.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[aio.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ channel (Optional[aio.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+            always_use_jwt_access (Optional[bool]): Whether self-signed JWT should
+ be used for service account credentials.
+
+ Raises:
+            google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def grpc_channel(self) -> aio.Channel:
+ """Create the channel designed to connect to this service.
+
+ This property caches on the instance; repeated calls return
+ the same channel.
+ """
+ # Return the channel from cache.
+ return self._grpc_channel
+
+ @property
+ def get_organization(
+ self,
+ ) -> Callable[
+ [organizations.GetOrganizationRequest], Awaitable[organizations.Organization]
+ ]:
+ r"""Return a callable for the get organization method over gRPC.
+
+ Fetches an organization resource identified by the
+ specified resource name.
+
+ Returns:
+ Callable[[~.GetOrganizationRequest],
+ Awaitable[~.Organization]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_organization" not in self._stubs:
+ self._stubs["get_organization"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Organizations/GetOrganization",
+ request_serializer=organizations.GetOrganizationRequest.serialize,
+ response_deserializer=organizations.Organization.deserialize,
+ )
+ return self._stubs["get_organization"]
+
+ @property
+ def search_organizations(
+ self,
+ ) -> Callable[
+ [organizations.SearchOrganizationsRequest],
+ Awaitable[organizations.SearchOrganizationsResponse],
+ ]:
+ r"""Return a callable for the search organizations method over gRPC.
+
+ Searches organization resources that are visible to the user and
+ satisfy the specified filter. This method returns organizations
+ in an unspecified order. New organizations do not necessarily
+ appear at the end of the results, and may take a small amount of
+ time to appear.
+
+ Search will only return organizations on which the user has the
+ permission ``resourcemanager.organizations.get``
+
+ Returns:
+ Callable[[~.SearchOrganizationsRequest],
+ Awaitable[~.SearchOrganizationsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "search_organizations" not in self._stubs:
+ self._stubs["search_organizations"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Organizations/SearchOrganizations",
+ request_serializer=organizations.SearchOrganizationsRequest.serialize,
+ response_deserializer=organizations.SearchOrganizationsResponse.deserialize,
+ )
+ return self._stubs["search_organizations"]
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.GetIamPolicyRequest], Awaitable[policy_pb2.Policy]]:
+ r"""Return a callable for the get iam policy method over gRPC.
+
+ Gets the access control policy for an organization resource. The
+ policy may be empty if no such policy or resource exists. The
+ ``resource`` field should be the organization's resource name,
+ for example: "organizations/123".
+
+ Authorization requires the IAM permission
+ ``resourcemanager.organizations.getIamPolicy`` on the specified
+ organization.
+
+ Returns:
+ Callable[[~.GetIamPolicyRequest],
+ Awaitable[~.Policy]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_iam_policy" not in self._stubs:
+ self._stubs["get_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Organizations/GetIamPolicy",
+ request_serializer=iam_policy_pb2.GetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["get_iam_policy"]
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.SetIamPolicyRequest], Awaitable[policy_pb2.Policy]]:
+ r"""Return a callable for the set iam policy method over gRPC.
+
+ Sets the access control policy on an organization resource.
+ Replaces any existing policy. The ``resource`` field should be
+ the organization's resource name, for example:
+ "organizations/123".
+
+ Authorization requires the IAM permission
+ ``resourcemanager.organizations.setIamPolicy`` on the specified
+ organization.
+
+ Returns:
+ Callable[[~.SetIamPolicyRequest],
+ Awaitable[~.Policy]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "set_iam_policy" not in self._stubs:
+ self._stubs["set_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Organizations/SetIamPolicy",
+ request_serializer=iam_policy_pb2.SetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["set_iam_policy"]
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ Awaitable[iam_policy_pb2.TestIamPermissionsResponse],
+ ]:
+ r"""Return a callable for the test iam permissions method over gRPC.
+
+ Returns the permissions that a caller has on the specified
+ organization. The ``resource`` field should be the
+ organization's resource name, for example: "organizations/123".
+
+ There are no permissions required for making this API call.
+
+ Returns:
+ Callable[[~.TestIamPermissionsRequest],
+ Awaitable[~.TestIamPermissionsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "test_iam_permissions" not in self._stubs:
+ self._stubs["test_iam_permissions"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Organizations/TestIamPermissions",
+ request_serializer=iam_policy_pb2.TestIamPermissionsRequest.SerializeToString,
+ response_deserializer=iam_policy_pb2.TestIamPermissionsResponse.FromString,
+ )
+ return self._stubs["test_iam_permissions"]
+
+ def close(self):
+ return self.grpc_channel.close()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]:
+ r"""Return a callable for the get_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_operation" not in self._stubs:
+ self._stubs["get_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/GetOperation",
+ request_serializer=operations_pb2.GetOperationRequest.SerializeToString,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["get_operation"]
+
+
+__all__ = ("OrganizationsGrpcAsyncIOTransport",)
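The gRPC transport properties above all follow the same lazy stub-caching pattern: the first access builds the stub on the channel and stores it in `self._stubs`, and every later access returns the cached callable. A minimal, stdlib-only sketch of that pattern (the `FakeChannel` class is illustrative, not part of the generated library):

```python
class FakeChannel:
    """Stand-in for a gRPC channel: creating a stub is just building a callable."""

    def unary_unary(self, path, request_serializer=None, response_deserializer=None):
        def stub(request):
            # A real stub would serialize the request, send it over the wire,
            # and deserialize the response; here we just echo the call.
            return f"called {path} with {request!r}"

        return stub


class LazyStubTransport:
    def __init__(self, channel):
        self.grpc_channel = channel
        self._stubs = {}

    @property
    def get_organization(self):
        # Create the stub on first access and cache it, mirroring the
        # generated transport's "if name not in self._stubs" pattern.
        if "get_organization" not in self._stubs:
            self._stubs["get_organization"] = self.grpc_channel.unary_unary(
                "/google.cloud.resourcemanager.v3.Organizations/GetOrganization"
            )
        return self._stubs["get_organization"]


transport = LazyStubTransport(FakeChannel())
assert transport.get_organization is transport.get_organization  # cached
```

Caching matters because stub creation touches the channel; repeated property access must not allocate a new stub per call.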
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/transports/rest.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/transports/rest.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/organizations/transports/rest.py
@@ -0,0 +1,1078 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import dataclasses
+import json # type: ignore
+import re
+from typing import Any, Callable, Dict, List, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, path_template, rest_helpers, rest_streaming
+from google.api_core import exceptions as core_exceptions
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.auth.transport.requests import AuthorizedSession # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import json_format
+import grpc # type: ignore
+from requests import __version__ as requests_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.types import organizations
+
+from .base import DEFAULT_CLIENT_INFO as BASE_DEFAULT_CLIENT_INFO
+from .base import OrganizationsTransport
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=BASE_DEFAULT_CLIENT_INFO.gapic_version,
+ grpc_version=None,
+ rest_version=requests_version,
+)
+
+
+class OrganizationsRestInterceptor:
+ """Interceptor for Organizations.
+
+ Interceptors are used to manipulate requests, request metadata, and responses
+ in arbitrary ways.
+ Example use cases include:
+ * Logging
+ * Verifying requests according to service or custom semantics
+ * Stripping extraneous information from responses
+
+ These use cases and more can be enabled by injecting an
+ instance of a custom subclass when constructing the OrganizationsRestTransport.
+
+ .. code-block:: python
+ class MyCustomOrganizationsInterceptor(OrganizationsRestInterceptor):
+ def pre_get_iam_policy(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_iam_policy(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_organization(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_organization(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_search_organizations(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_search_organizations(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_set_iam_policy(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_set_iam_policy(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_test_iam_permissions(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_test_iam_permissions(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ transport = OrganizationsRestTransport(interceptor=MyCustomOrganizationsInterceptor())
+ client = OrganizationsClient(transport=transport)
+
+
+ """
+
+ def pre_get_iam_policy(
+ self,
+ request: iam_policy_pb2.GetIamPolicyRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[iam_policy_pb2.GetIamPolicyRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_iam_policy
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Organizations server.
+ """
+ return request, metadata
+
+ def post_get_iam_policy(self, response: policy_pb2.Policy) -> policy_pb2.Policy:
+ """Post-rpc interceptor for get_iam_policy
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Organizations server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_organization(
+ self,
+ request: organizations.GetOrganizationRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[organizations.GetOrganizationRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_organization
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Organizations server.
+ """
+ return request, metadata
+
+ def post_get_organization(
+ self, response: organizations.Organization
+ ) -> organizations.Organization:
+ """Post-rpc interceptor for get_organization
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Organizations server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_search_organizations(
+ self,
+ request: organizations.SearchOrganizationsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[organizations.SearchOrganizationsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for search_organizations
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Organizations server.
+ """
+ return request, metadata
+
+ def post_search_organizations(
+ self, response: organizations.SearchOrganizationsResponse
+ ) -> organizations.SearchOrganizationsResponse:
+ """Post-rpc interceptor for search_organizations
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Organizations server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_set_iam_policy(
+ self,
+ request: iam_policy_pb2.SetIamPolicyRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[iam_policy_pb2.SetIamPolicyRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for set_iam_policy
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Organizations server.
+ """
+ return request, metadata
+
+ def post_set_iam_policy(self, response: policy_pb2.Policy) -> policy_pb2.Policy:
+ """Post-rpc interceptor for set_iam_policy
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Organizations server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_test_iam_permissions(
+ self,
+ request: iam_policy_pb2.TestIamPermissionsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[iam_policy_pb2.TestIamPermissionsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for test_iam_permissions
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Organizations server.
+ """
+ return request, metadata
+
+ def post_test_iam_permissions(
+ self, response: iam_policy_pb2.TestIamPermissionsResponse
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ """Post-rpc interceptor for test_iam_permissions
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Organizations server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_operation(
+ self,
+ request: operations_pb2.GetOperationRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[operations_pb2.GetOperationRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_operation
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Organizations server.
+ """
+ return request, metadata
+
+ def post_get_operation(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for get_operation
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Organizations server but before
+ it is returned to user code.
+ """
+ return response
+
+
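The interceptor contract above is small: each `pre_*` hook receives `(request, metadata)` and returns both, possibly modified, before the request is sent; each `post_*` hook receives the response before it reaches user code. A self-contained sketch of a hypothetical subclass that stamps outgoing requests with an extra metadata header (all names here are illustrative):

```python
class BaseInterceptor:
    """Default hooks: pass everything through unchanged."""

    def pre_get_organization(self, request, metadata):
        return request, metadata

    def post_get_organization(self, response):
        return response


class HeaderInterceptor(BaseInterceptor):
    def pre_get_organization(self, request, metadata):
        # Append a custom header without mutating the caller's sequence.
        return request, tuple(metadata) + (("x-request-source", "batch-job"),)


def call_get_organization(interceptor, request, metadata=()):
    # Simplified stand-in for the REST stub's call flow:
    # pre-hook -> send -> post-hook.
    request, metadata = interceptor.pre_get_organization(request, metadata)
    response = {"echo": request, "metadata": metadata}  # fake RPC result
    return interceptor.post_get_organization(response)


resp = call_get_organization(HeaderInterceptor(), {"name": "organizations/123"})
assert ("x-request-source", "batch-job") in resp["metadata"]
```

Because the hooks return their inputs, a subclass can override only the methods it cares about and inherit pass-through behavior for the rest.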
+@dataclasses.dataclass
+class OrganizationsRestStub:
+ _session: AuthorizedSession
+ _host: str
+ _interceptor: OrganizationsRestInterceptor
+
+
+class OrganizationsRestTransport(OrganizationsTransport):
+ """REST backend transport for Organizations.
+
+ Allows users to manage their organization resources.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends JSON representations of protocol buffers over HTTP/1.1
+
+ """
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ url_scheme: str = "https",
+ interceptor: Optional[OrganizationsRestInterceptor] = None,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Callable[[], Tuple[bytes, bytes]]): Client
+ certificate to configure mutual TLS HTTP channel. It is ignored
+ if ``channel`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you are developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ url_scheme: the protocol scheme for the API endpoint. Normally
+ "https", but for testing or local servers,
+ "http" can be specified.
+ """
+ # Run the base constructor
+ # TODO(yon-mg): resolve other ctor params i.e. scopes, quota, etc.
+ # TODO: When custom host (api_endpoint) is set, `scopes` must *also* be set on the
+ # credentials object
+ maybe_url_match = re.match("^(?P<scheme>http(?:s)?://)?(?P<host>.*)$", host)
+ if maybe_url_match is None:
+ raise ValueError(
+ f"Unexpected hostname structure: {host}"
+ ) # pragma: NO COVER
+
+ url_match_items = maybe_url_match.groupdict()
+
+ host = f"{url_scheme}://{host}" if not url_match_items["scheme"] else host
+
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+ self._session = AuthorizedSession(
+ self._credentials, default_host=self.DEFAULT_HOST
+ )
+ if client_cert_source_for_mtls:
+ self._session.configure_mtls_channel(client_cert_source_for_mtls)
+ self._interceptor = interceptor or OrganizationsRestInterceptor()
+ self._prep_wrapped_messages(client_info)
+
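The constructor normalizes its `host` argument with a small regex: if the caller already supplied an `http://` or `https://` scheme it is kept, otherwise `url_scheme` is prepended. The check can be reproduced in isolation using the same regular expression:

```python
import re


def normalize_host(host: str, url_scheme: str = "https") -> str:
    # Same regex as the generated constructor: an optional http/https
    # scheme, then the rest of the hostname.
    maybe_url_match = re.match("^(?P<scheme>http(?:s)?://)?(?P<host>.*)$", host)
    if maybe_url_match is None:
        raise ValueError(f"Unexpected hostname structure: {host}")
    items = maybe_url_match.groupdict()
    return f"{url_scheme}://{host}" if not items["scheme"] else host


assert normalize_host("cloudresourcemanager.googleapis.com") == (
    "https://cloudresourcemanager.googleapis.com"
)
assert normalize_host("http://localhost:8080") == "http://localhost:8080"
```

This is why `url_scheme="http"` only takes effect for schemeless hosts, which is useful when pointing the transport at a local test server.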
+ class _GetIamPolicy(OrganizationsRestStub):
+ def __hash__(self):
+ return hash("GetIamPolicy")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: iam_policy_pb2.GetIamPolicyRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Call the get iam policy method over HTTP.
+
+ Args:
+ request (~.iam_policy_pb2.GetIamPolicyRequest):
+ The request object. Request message for ``GetIamPolicy`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which
+ specifies access controls for Google Cloud resources.
+
+ A ``Policy`` is a collection of ``bindings``. A
+ ``binding`` binds one or more ``members``, or
+ principals, to a single ``role``. Principals can be user
+ accounts, service accounts, Google groups, and domains
+ (such as G Suite). A ``role`` is a named list of
+ permissions; each ``role`` can be an IAM predefined role
+ or a user-created custom role.
+
+ For some types of Google Cloud resources, a ``binding``
+ can also specify a ``condition``, which is a logical
+ expression that allows access to a resource only if the
+ expression evaluates to ``true``. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the `IAM
+ documentation <https://cloud.google.com/iam/help/conditions/resource-policies>`__.
+
+ **JSON example:**
+
+ ::
+
+ {
+ "bindings": [
+ {
+ "role": "roles/resourcemanager.organizationAdmin",
+ "members": [
+ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+ },
+ {
+ "role": "roles/resourcemanager.organizationViewer",
+ "members": [
+ "user:eve@example.com"
+ ],
+ "condition": {
+ "title": "expirable access",
+ "description": "Does not grant access after Sep 2020",
+ "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')",
+ }
+ }
+ ],
+ "etag": "BwWWja0YfJA=",
+ "version": 3
+ }
+
+ **YAML example:**
+
+ ::
+
+ bindings:
+ - members:
+ - user:mike@example.com
+ - group:admins@example.com
+ - domain:google.com
+ - serviceAccount:my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin
+ - members:
+ - user:eve@example.com
+ role: roles/resourcemanager.organizationViewer
+ condition:
+ title: expirable access
+ description: Does not grant access after Sep 2020
+ expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+ etag: BwWWja0YfJA=
+ version: 3
+
+ For a description of IAM and its features, see the `IAM
+ documentation <https://cloud.google.com/iam/docs/>`__.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{resource=organizations/*}:getIamPolicy",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_iam_policy(request, metadata)
+ pb_request = request
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = policy_pb2.Policy()
+ pb_resp = resp
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_iam_policy(resp)
+ return resp
+
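Every REST stub above follows the same pipeline: run the `pre_*` interceptor hook, transcode the request against the URI template in `http_options`, JSON-encode the body and query params, send via the authorized session, raise on status >= 400, then parse the response and run `post_*`. The transcoding step, reduced to a single `{field=pattern}` substitution, might be sketched as follows; this is a simplified stand-in that skips the pattern validation the real `path_template.transcode` performs:

```python
import re


def transcode_uri(uri_template: str, request: dict) -> str:
    # Replace each "{field=pattern}" segment with the request's field value.
    def substitute(match):
        field = match.group(1)
        return str(request[field])

    return re.sub(r"\{(\w+)=[^}]*\}", substitute, uri_template)


uri = transcode_uri(
    "/v3/{resource=organizations/*}:getIamPolicy",
    {"resource": "organizations/123"},
)
assert uri == "/v3/organizations/123:getIamPolicy"
```

The fields consumed by the URI are removed from the body/query in the real implementation, which is why the generated code reads `transcoded_request["uri"]`, `["method"]`, and `["query_params"]` as separate pieces.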
+ class _GetOrganization(OrganizationsRestStub):
+ def __hash__(self):
+ return hash("GetOrganization")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: organizations.GetOrganizationRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> organizations.Organization:
+ r"""Call the get organization method over HTTP.
+
+ Args:
+ request (~.organizations.GetOrganizationRequest):
+ The request object. The request sent to the ``GetOrganization`` method. The
+ ``name`` field is required. ``organization_id`` is no
+ longer accepted.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.organizations.Organization:
+ The root node in the resource
+ hierarchy to which a particular entity's
+ (a company, for example) resources
+ belong.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/{name=organizations/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_organization(
+ request, metadata
+ )
+ pb_request = organizations.GetOrganizationRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = organizations.Organization()
+ pb_resp = organizations.Organization.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_organization(resp)
+ return resp
+
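The `_get_unset_required_fields` classmethod that each stub carries fills in defaults for required query parameters the caller left unset. With the empty `__REQUIRED_FIELDS_DEFAULT_VALUES` maps generated here it is a no-op, but the mechanics are easy to see with a hypothetical non-empty map (the field names and defaults below are invented for illustration):

```python
# Hypothetical defaults; the generated stubs in this file use an empty dict.
REQUIRED_FIELDS_DEFAULT_VALUES = {"pageSize": 50, "view": "BASIC"}


def get_unset_required_fields(message_dict):
    # Keep only the defaults for keys the request did not already set.
    return {
        k: v
        for k, v in REQUIRED_FIELDS_DEFAULT_VALUES.items()
        if k not in message_dict
    }


query_params = {"pageSize": 10}
query_params.update(get_unset_required_fields(query_params))
assert query_params == {"pageSize": 10, "view": "BASIC"}
```

Applying the update after JSON-encoding the request means explicit caller values always win over generated defaults.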
+ class _SearchOrganizations(OrganizationsRestStub):
+ def __hash__(self):
+ return hash("SearchOrganizations")
+
+ def __call__(
+ self,
+ request: organizations.SearchOrganizationsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> organizations.SearchOrganizationsResponse:
+ r"""Call the search organizations method over HTTP.
+
+ Args:
+ request (~.organizations.SearchOrganizationsRequest):
+ The request object. The request sent to the ``SearchOrganizations`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.organizations.SearchOrganizationsResponse:
+ The response returned from the ``SearchOrganizations``
+ method.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/organizations:search",
+ },
+ ]
+ request, metadata = self._interceptor.pre_search_organizations(
+ request, metadata
+ )
+ pb_request = organizations.SearchOrganizationsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = organizations.SearchOrganizationsResponse()
+ pb_resp = organizations.SearchOrganizationsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_search_organizations(resp)
+ return resp
+
+ class _SetIamPolicy(OrganizationsRestStub):
+ def __hash__(self):
+ return hash("SetIamPolicy")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: iam_policy_pb2.SetIamPolicyRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Call the set iam policy method over HTTP.
+
+ Args:
+ request (~.iam_policy_pb2.SetIamPolicyRequest):
+ The request object. Request message for ``SetIamPolicy`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which
+ specifies access controls for Google Cloud resources.
+
+ A ``Policy`` is a collection of ``bindings``. A
+ ``binding`` binds one or more ``members``, or
+ principals, to a single ``role``. Principals can be user
+ accounts, service accounts, Google groups, and domains
+ (such as G Suite). A ``role`` is a named list of
+ permissions; each ``role`` can be an IAM predefined role
+ or a user-created custom role.
+
+ For some types of Google Cloud resources, a ``binding``
+ can also specify a ``condition``, which is a logical
+ expression that allows access to a resource only if the
+ expression evaluates to ``true``. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the `IAM
+ documentation <https://cloud.google.com/iam/help/conditions/resource-policies>`__.
+
+ **JSON example:**
+
+ ::
+
+ {
+ "bindings": [
+ {
+ "role": "roles/resourcemanager.organizationAdmin",
+ "members": [
+ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+ },
+ {
+ "role": "roles/resourcemanager.organizationViewer",
+ "members": [
+ "user:eve@example.com"
+ ],
+ "condition": {
+ "title": "expirable access",
+ "description": "Does not grant access after Sep 2020",
+ "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')",
+ }
+ }
+ ],
+ "etag": "BwWWja0YfJA=",
+ "version": 3
+ }
+
+ **YAML example:**
+
+ ::
+
+ bindings:
+ - members:
+ - user:mike@example.com
+ - group:admins@example.com
+ - domain:google.com
+ - serviceAccount:my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin
+ - members:
+ - user:eve@example.com
+ role: roles/resourcemanager.organizationViewer
+ condition:
+ title: expirable access
+ description: Does not grant access after Sep 2020
+ expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+ etag: BwWWja0YfJA=
+ version: 3
+
+ For a description of IAM and its features, see the `IAM
+ documentation <https://cloud.google.com/iam/docs/>`__.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{resource=organizations/*}:setIamPolicy",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_set_iam_policy(request, metadata)
+ pb_request = request
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = policy_pb2.Policy()
+ pb_resp = resp
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_set_iam_policy(resp)
+ return resp
+
+ class _TestIamPermissions(OrganizationsRestStub):
+ def __hash__(self):
+ return hash("TestIamPermissions")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: iam_policy_pb2.TestIamPermissionsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ r"""Call the test iam permissions method over HTTP.
+
+ Args:
+ request (~.iam_policy_pb2.TestIamPermissionsRequest):
+ The request object. Request message for ``TestIamPermissions`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.iam_policy_pb2.TestIamPermissionsResponse:
+ Response message for ``TestIamPermissions`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{resource=organizations/*}:testIamPermissions",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_test_iam_permissions(
+ request, metadata
+ )
+ pb_request = request
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = iam_policy_pb2.TestIamPermissionsResponse()
+ pb_resp = resp
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_test_iam_permissions(resp)
+ return resp
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.GetIamPolicyRequest], policy_pb2.Policy]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetIamPolicy(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_organization(
+ self,
+ ) -> Callable[[organizations.GetOrganizationRequest], organizations.Organization]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetOrganization(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def search_organizations(
+ self,
+ ) -> Callable[
+ [organizations.SearchOrganizationsRequest],
+ organizations.SearchOrganizationsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._SearchOrganizations(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.SetIamPolicyRequest], policy_pb2.Policy]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._SetIamPolicy(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ iam_policy_pb2.TestIamPermissionsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._TestIamPermissions(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_operation(self):
+ return self._GetOperation(self._session, self._host, self._interceptor) # type: ignore
+
+ class _GetOperation(OrganizationsRestStub):
+ def __call__(
+ self,
+ request: operations_pb2.GetOperationRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+
+ r"""Call the get operation method over HTTP.
+
+ Args:
+ request (operations_pb2.GetOperationRequest):
+ The request object for GetOperation method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ operations_pb2.Operation: Response from GetOperation method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/{name=operations/**}",
+ },
+ ]
+
+ request, metadata = self._interceptor.pre_get_operation(request, metadata)
+ request_kwargs = json_format.MessageToDict(request)
+ transcoded_request = path_template.transcode(http_options, **request_kwargs)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(json.dumps(transcoded_request["query_params"]))
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ resp = operations_pb2.Operation()
+ resp = json_format.Parse(response.content.decode("utf-8"), resp)
+ resp = self._interceptor.post_get_operation(resp)
+ return resp
+
+ @property
+ def kind(self) -> str:
+ return "rest"
+
+ def close(self):
+ self._session.close()
+
+
+__all__ = ("OrganizationsRestTransport",)
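Every REST stub in the transport above repeats the same query-param handling: transcode the request, fill in any required fields the caller left unset, then pin the response encoding with `$alt`. A minimal standalone sketch of that merge follows; the `pageSize` default is a hypothetical example for illustration (in the generated stubs the defaults come from `__REQUIRED_FIELDS_DEFAULT_VALUES`, which is empty for these methods):

```python
# Hypothetical defaults dict; the generated stubs read
# __REQUIRED_FIELDS_DEFAULT_VALUES, which is {} for the Organizations methods.
REQUIRED_FIELDS_DEFAULT_VALUES = {"pageSize": 0}

def get_unset_required_fields(query_params):
    # Keep only defaults for required fields the caller did not set.
    return {
        k: v
        for k, v in REQUIRED_FIELDS_DEFAULT_VALUES.items()
        if k not in query_params
    }

# Params recovered from the transcoded request...
query_params = {"query": "labels.color:red"}
# ...are topped up with defaults, then the enum/JSON encoding selector.
query_params.update(get_unset_required_fields(query_params))
query_params["$alt"] = "json;enum-encoding=int"
```

Fields already present in the transcoded request always win; only genuinely unset required fields are backfilled.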
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/__init__.py
@@ -0,0 +1,22 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .async_client import ProjectsAsyncClient
+from .client import ProjectsClient
+
+__all__ = (
+ "ProjectsClient",
+ "ProjectsAsyncClient",
+)
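The async client below wires a default retry of `initial=0.1`, `multiplier=1.3`, `maximum=60.0` onto `get_project` and `list_projects`. A simplified sketch of the delay schedule that policy implies (the real `google.api_core` retry also applies jitter and a 60s overall deadline, which this omits):

```python
def backoff_delays(initial=0.1, multiplier=1.3, maximum=60.0, attempts=6):
    """Return the nominal sleep before each retry attempt."""
    delays = []
    delay = initial
    for _ in range(attempts):
        delays.append(min(delay, maximum))  # cap each sleep at `maximum`
        delay *= multiplier                 # exponential growth between attempts
    return delays

# Each retry waits ~1.3x longer than the last, capped at 60 seconds.
schedule = backoff_delays()
```

With these parameters the first few sleeps are roughly 0.1s, 0.13s, 0.169s, growing geometrically until the cap.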
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/async_client.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/async_client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/async_client.py
@@ -0,0 +1,1899 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import functools
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+)
+
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.api_core.client_options import ClientOptions
+from google.auth import credentials as ga_credentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.api_core import operation # type: ignore
+from google.api_core import operation_async # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.services.projects import pagers
+from google.cloud.resourcemanager_v3.types import projects
+
+from .client import ProjectsClient
+from .transports.base import DEFAULT_CLIENT_INFO, ProjectsTransport
+from .transports.grpc_asyncio import ProjectsGrpcAsyncIOTransport
+
+
+class ProjectsAsyncClient:
+ """Manages Google Cloud Projects."""
+
+ _client: ProjectsClient
+
+ DEFAULT_ENDPOINT = ProjectsClient.DEFAULT_ENDPOINT
+ DEFAULT_MTLS_ENDPOINT = ProjectsClient.DEFAULT_MTLS_ENDPOINT
+
+ project_path = staticmethod(ProjectsClient.project_path)
+ parse_project_path = staticmethod(ProjectsClient.parse_project_path)
+ common_billing_account_path = staticmethod(
+ ProjectsClient.common_billing_account_path
+ )
+ parse_common_billing_account_path = staticmethod(
+ ProjectsClient.parse_common_billing_account_path
+ )
+ common_folder_path = staticmethod(ProjectsClient.common_folder_path)
+ parse_common_folder_path = staticmethod(ProjectsClient.parse_common_folder_path)
+ common_organization_path = staticmethod(ProjectsClient.common_organization_path)
+ parse_common_organization_path = staticmethod(
+ ProjectsClient.parse_common_organization_path
+ )
+ common_project_path = staticmethod(ProjectsClient.common_project_path)
+ parse_common_project_path = staticmethod(ProjectsClient.parse_common_project_path)
+ common_location_path = staticmethod(ProjectsClient.common_location_path)
+ parse_common_location_path = staticmethod(ProjectsClient.parse_common_location_path)
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ ProjectsAsyncClient: The constructed client.
+ """
+ return ProjectsClient.from_service_account_info.__func__(ProjectsAsyncClient, info, *args, **kwargs) # type: ignore
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ ProjectsAsyncClient: The constructed client.
+ """
+ return ProjectsClient.from_service_account_file.__func__(ProjectsAsyncClient, filename, *args, **kwargs) # type: ignore
+
+ from_service_account_json = from_service_account_file
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+        (2) if `GOOGLE_API_USE_MTLS_ENDPOINT` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ return ProjectsClient.get_mtls_endpoint_and_cert_source(client_options) # type: ignore
+
+ @property
+ def transport(self) -> ProjectsTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ ProjectsTransport: The transport used by the client instance.
+ """
+ return self._client.transport
+
+ get_transport_class = functools.partial(
+ type(ProjectsClient).get_transport_class, type(ProjectsClient)
+ )
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Union[str, ProjectsTransport] = "grpc_asyncio",
+ client_options: Optional[ClientOptions] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the projects client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, ~.ProjectsTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (ClientOptions): Custom options for the client. It
+ won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+
+ Raises:
+            google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ self._client = ProjectsClient(
+ credentials=credentials,
+ transport=transport,
+ client_options=client_options,
+ client_info=client_info,
+ )
+
+ async def get_project(
+ self,
+ request: Optional[Union[projects.GetProjectRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> projects.Project:
+ r"""Retrieves the project identified by the specified ``name`` (for
+ example, ``projects/415104041262``).
+
+ The caller must have ``resourcemanager.projects.get`` permission
+ for this project.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_get_project():
+ # Create a client
+ client = resourcemanager_v3.ProjectsAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.GetProjectRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_project(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.GetProjectRequest, dict]]):
+ The request object. The request sent to the
+ [GetProject][google.cloud.resourcemanager.v3.Projects.GetProject]
+ method.
+ name (:class:`str`):
+ Required. The name of the project (for example,
+ ``projects/415104041262``).
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.types.Project:
+ A project is a high-level Google
+ Cloud entity. It is a container for
+ ACLs, APIs, App Engine Apps, VMs, and
+ other Google Cloud Platform resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = projects.GetProjectRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_project,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_projects(
+ self,
+ request: Optional[Union[projects.ListProjectsRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListProjectsAsyncPager:
+ r"""Lists projects that are direct children of the specified folder
+ or organization resource. ``list()`` provides a strongly
+ consistent view of the projects underneath the specified parent
+ resource. ``list()`` returns projects sorted based upon the
+ (ascending) lexical ordering of their ``display_name``. The
+ caller must have ``resourcemanager.projects.list`` permission on
+ the identified parent.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_list_projects():
+ # Create a client
+ client = resourcemanager_v3.ProjectsAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.ListProjectsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_projects(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.ListProjectsRequest, dict]]):
+ The request object. The request sent to the
+ [ListProjects][google.cloud.resourcemanager.v3.Projects.ListProjects]
+ method.
+ parent (:class:`str`):
+ Required. The name of the parent resource whose projects
+ are being listed. Only children of this parent resource
+ are listed; descendants are not listed.
+
+ If the parent is a folder, use the value
+ ``folders/{folder_id}``. If the parent is an
+ organization, use the value ``organizations/{org_id}``.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.projects.pagers.ListProjectsAsyncPager:
+ A page of the response received from the
+ [ListProjects][google.cloud.resourcemanager.v3.Projects.ListProjects]
+ method.
+
+ A paginated response where more pages are available
+ has next_page_token set. This token can be used in a
+                subsequent request to retrieve the next page.
+
+ NOTE: A response may contain fewer elements than the
+ request page_size and still have a next_page_token.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = projects.ListProjectsRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_projects,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListProjectsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def search_projects(
+ self,
+ request: Optional[Union[projects.SearchProjectsRequest, dict]] = None,
+ *,
+ query: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.SearchProjectsAsyncPager:
+ r"""Search for projects that the caller has both
+ ``resourcemanager.projects.get`` permission on, and also satisfy
+ the specified query.
+
+ This method returns projects in an unspecified order.
+
+ This method is eventually consistent with project mutations;
+ this means that a newly created project may not appear in the
+ results or recent updates to an existing project may not be
+ reflected in the results. To retrieve the latest state of a
+ project, use the
+ [GetProject][google.cloud.resourcemanager.v3.Projects.GetProject]
+ method.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_search_projects():
+ # Create a client
+ client = resourcemanager_v3.ProjectsAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.SearchProjectsRequest(
+ )
+
+ # Make the request
+ page_result = client.search_projects(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.SearchProjectsRequest, dict]]):
+ The request object. The request sent to the
+ [SearchProjects][google.cloud.resourcemanager.v3.Projects.SearchProjects]
+ method.
+ query (:class:`str`):
+ Optional. A query string for searching for projects that
+ the caller has ``resourcemanager.projects.get``
+ permission to. If multiple fields are included in the
+ query, then it will return results that match any of the
+ fields. Some eligible fields are:
+
+ - **``displayName``, ``name``**: Filters by
+ displayName.
+ - **``parent``**: Project's parent (for example:
+ ``folders/123``, ``organizations/*``). Prefer
+ ``parent`` field over ``parent.type`` and
+ ``parent.id``.
+ - **``parent.type``**: Parent's type: ``folder`` or
+ ``organization``.
+ - **``parent.id``**: Parent's id number (for example:
+ ``123``).
+ - **``id``, ``projectId``**: Filters by projectId.
+ - **``state``, ``lifecycleState``**: Filters by state.
+ - **``labels``**: Filters by label name or value.
+ - **``labels.<key>`` (where ``<key>`` is the name of a
+ label)**: Filters by label name.
+
+ Search expressions are case insensitive.
+
+                Some example queries:
+
+ - **``name:how*``**: The project's name starts with
+ "how".
+ - **``name:Howl``**: The project's name is ``Howl`` or
+ ``howl``.
+ - **``name:HOWL``**: Equivalent to above.
+ - **``NAME:howl``**: Equivalent to above.
+ - **``labels.color:*``**: The project has the label
+ ``color``.
+ - **``labels.color:red``**: The project's label
+ ``color`` has the value ``red``.
+ - **``labels.color:red labels.size:big``**: The
+ project's label ``color`` has the value ``red`` or
+ its label ``size`` has the value ``big``.
+
+ If no query is specified, the call will return projects
+ for which the user has the
+ ``resourcemanager.projects.get`` permission.
+
+ This corresponds to the ``query`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.projects.pagers.SearchProjectsAsyncPager:
+ A page of the response received from the
+ [SearchProjects][google.cloud.resourcemanager.v3.Projects.SearchProjects]
+ method.
+
+ A paginated response where more pages are available
+ has next_page_token set. This token can be used in a
+                subsequent request to retrieve the next page.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([query])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = projects.SearchProjectsRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if query is not None:
+ request.query = query
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.search_projects,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.SearchProjectsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def create_project(
+ self,
+ request: Optional[Union[projects.CreateProjectRequest, dict]] = None,
+ *,
+ project: Optional[projects.Project] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Request that a new project be created. The result is an
+ ``Operation`` which can be used to track the creation process.
+ This process usually takes a few seconds, but can sometimes take
+ much longer. The tracking ``Operation`` is automatically deleted
+ after a few hours, so there is no need to call
+ ``DeleteOperation``.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_create_project():
+ # Create a client
+ client = resourcemanager_v3.ProjectsAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.CreateProjectRequest(
+ )
+
+ # Make the request
+ operation = client.create_project(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.CreateProjectRequest, dict]]):
+ The request object. The request sent to the
+ [CreateProject][google.cloud.resourcemanager.v3.Projects.CreateProject]
+ method.
+ project (:class:`google.cloud.resourcemanager_v3.types.Project`):
+ Required. The Project to create.
+
+ Project ID is required. If the requested ID is
+ unavailable, the request fails.
+
+ If the ``parent`` field is set, the
+ ``resourcemanager.projects.create`` permission is
+ checked on the parent resource. If no parent is set and
+ the authorization credentials belong to an Organization,
+ the parent will be set to that Organization.
+
+ This corresponds to the ``project`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Project` A project is a high-level Google Cloud entity. It is a
+ container for ACLs, APIs, App Engine Apps, VMs, and
+ other Google Cloud Platform resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([project])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = projects.CreateProjectRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if project is not None:
+ request.project = project
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.create_project,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ projects.Project,
+ metadata_type=projects.CreateProjectMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def update_project(
+ self,
+ request: Optional[Union[projects.UpdateProjectRequest, dict]] = None,
+ *,
+ project: Optional[projects.Project] = None,
+ update_mask: Optional[field_mask_pb2.FieldMask] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Updates the ``display_name`` and labels of the project
+ identified by the specified ``name`` (for example,
+ ``projects/415104041262``). Deleting all labels requires an
+ update mask for labels field.
+
+ The caller must have ``resourcemanager.projects.update``
+ permission for this project.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_update_project():
+ # Create a client
+ client = resourcemanager_v3.ProjectsAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.UpdateProjectRequest(
+ )
+
+ # Make the request
+ operation = client.update_project(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.UpdateProjectRequest, dict]]):
+ The request object. The request sent to the
+ [UpdateProject][google.cloud.resourcemanager.v3.Projects.UpdateProject]
+ method.
+
+                Only the ``display_name`` and ``labels`` fields can be
+                changed. Use the
+ [MoveProject][google.cloud.resourcemanager.v3.Projects.MoveProject]
+ method to change the ``parent`` field.
+ project (:class:`google.cloud.resourcemanager_v3.types.Project`):
+ Required. The new definition of the
+ project.
+
+ This corresponds to the ``project`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`):
+ Optional. An update mask to
+ selectively update fields.
+
+ This corresponds to the ``update_mask`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Project` A project is a high-level Google Cloud entity. It is a
+ container for ACLs, APIs, App Engine Apps, VMs, and
+ other Google Cloud Platform resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([project, update_mask])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = projects.UpdateProjectRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if project is not None:
+ request.project = project
+ if update_mask is not None:
+ request.update_mask = update_mask
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.update_project,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("project.name", request.project.name),)
+ ),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ projects.Project,
+ metadata_type=projects.UpdateProjectMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def move_project(
+ self,
+ request: Optional[Union[projects.MoveProjectRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ destination_parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Move a project to another place in your resource hierarchy,
+ under a new resource parent.
+
+ Returns an operation which can be used to track the process of
+ the project move workflow. Upon success, the
+ ``Operation.response`` field will be populated with the moved
+ project.
+
+ The caller must have ``resourcemanager.projects.move``
+        permission on the project, and on the project's current and
+        proposed new parent.
+
+        If the project has no current parent, or it currently does not have
+ an associated organization resource, you will also need the
+ ``resourcemanager.projects.setIamPolicy`` permission in the
+ project.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_move_project():
+ # Create a client
+ client = resourcemanager_v3.ProjectsAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.MoveProjectRequest(
+ name="name_value",
+ destination_parent="destination_parent_value",
+ )
+
+ # Make the request
+ operation = client.move_project(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.MoveProjectRequest, dict]]):
+                The request object. The request sent to the
+ [MoveProject][google.cloud.resourcemanager.v3.Projects.MoveProject]
+ method.
+ name (:class:`str`):
+ Required. The name of the project to
+ move.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ destination_parent (:class:`str`):
+ Required. The new parent to move the
+ Project under.
+
+ This corresponds to the ``destination_parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Project` A project is a high-level Google Cloud entity. It is a
+ container for ACLs, APIs, App Engine Apps, VMs, and
+ other Google Cloud Platform resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, destination_parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = projects.MoveProjectRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if destination_parent is not None:
+ request.destination_parent = destination_parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.move_project,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ projects.Project,
+ metadata_type=projects.MoveProjectMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def delete_project(
+ self,
+ request: Optional[Union[projects.DeleteProjectRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Marks the project identified by the specified ``name`` (for
+ example, ``projects/415104041262``) for deletion.
+
+ This method will only affect the project if it has a lifecycle
+ state of
+ [ACTIVE][google.cloud.resourcemanager.v3.Project.State.ACTIVE].
+
+ This method changes the Project's lifecycle state from
+ [ACTIVE][google.cloud.resourcemanager.v3.Project.State.ACTIVE]
+ to
+ [DELETE_REQUESTED][google.cloud.resourcemanager.v3.Project.State.DELETE_REQUESTED].
+ The deletion starts at an unspecified time, at which point the
+ Project is no longer accessible.
+
+        Until the deletion completes, you can check the lifecycle
+        state by retrieving the project with [GetProject]
+ [google.cloud.resourcemanager.v3.Projects.GetProject], and the
+ project remains visible to [ListProjects]
+ [google.cloud.resourcemanager.v3.Projects.ListProjects].
+ However, you cannot update the project.
+
+ After the deletion completes, the project is not retrievable by
+ the [GetProject]
+ [google.cloud.resourcemanager.v3.Projects.GetProject],
+ [ListProjects]
+ [google.cloud.resourcemanager.v3.Projects.ListProjects], and
+ [SearchProjects][google.cloud.resourcemanager.v3.Projects.SearchProjects]
+ methods.
+
+ This method behaves idempotently, such that deleting a
+ ``DELETE_REQUESTED`` project will not cause an error, but also
+ won't do anything.
+
+ The caller must have ``resourcemanager.projects.delete``
+ permissions for this project.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_delete_project():
+ # Create a client
+ client = resourcemanager_v3.ProjectsAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.DeleteProjectRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.delete_project(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.DeleteProjectRequest, dict]]):
+                The request object. The request sent to the
+                [DeleteProject][google.cloud.resourcemanager.v3.Projects.DeleteProject]
+                method.
+ name (:class:`str`):
+ Required. The name of the Project (for example,
+ ``projects/415104041262``).
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Project` A project is a high-level Google Cloud entity. It is a
+ container for ACLs, APIs, App Engine Apps, VMs, and
+ other Google Cloud Platform resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = projects.DeleteProjectRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.delete_project,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ projects.Project,
+ metadata_type=projects.DeleteProjectMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def undelete_project(
+ self,
+ request: Optional[Union[projects.UndeleteProjectRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Restores the project identified by the specified ``name`` (for
+ example, ``projects/415104041262``). You can only use this
+ method for a project that has a lifecycle state of
+ [DELETE_REQUESTED] [Projects.State.DELETE_REQUESTED]. After
+ deletion starts, the project cannot be restored.
+
+ The caller must have ``resourcemanager.projects.undelete``
+ permission for this project.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_undelete_project():
+ # Create a client
+ client = resourcemanager_v3.ProjectsAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.UndeleteProjectRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.undelete_project(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.UndeleteProjectRequest, dict]]):
+ The request object. The request sent to the [UndeleteProject]
+ [google.cloud.resourcemanager.v3.Projects.UndeleteProject]
+ method.
+ name (:class:`str`):
+ Required. The name of the project (for example,
+ ``projects/415104041262``).
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Project` A project is a high-level Google Cloud entity. It is a
+ container for ACLs, APIs, App Engine Apps, VMs, and
+ other Google Cloud Platform resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = projects.UndeleteProjectRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.undelete_project,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ projects.Project,
+ metadata_type=projects.UndeleteProjectMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.GetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Returns the IAM access control policy for the specified project,
+ in the format ``projects/{ProjectIdOrNumber}`` e.g.
+ projects/123. Permission is denied if the policy or the resource
+        does not exist.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ async def sample_get_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.ProjectsAsyncClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.GetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = await client.get_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.iam.v1.iam_policy_pb2.GetIamPolicyRequest, dict]]):
+ The request object. Request message for ``GetIamPolicy`` method.
+ resource (:class:`str`):
+ REQUIRED: The resource for which the
+ policy is being requested. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+ **JSON example:**
+
+                    {
+                      "bindings": [
+                        {
+                          "role": "roles/resourcemanager.organizationAdmin",
+                          "members": [
+                            "user:mike@example.com",
+                            "group:admins@example.com",
+                            "domain:google.com",
+                            "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+                          ]
+                        },
+                        {
+                          "role": "roles/resourcemanager.organizationViewer",
+                          "members": [ "user:eve@example.com" ],
+                          "condition": {
+                            "title": "expirable access",
+                            "description": "Does not grant access after Sep 2020",
+                            "expression": "request.time < timestamp('2020-10-01T00:00:00.000Z')"
+                          }
+                        }
+                      ],
+                      "etag": "BwWWja0YfJA=",
+                      "version": 3
+                    }
+
+                    **YAML example:**
+
+                    bindings:
+                    - members:
+                      - user:mike@example.com
+                      - group:admins@example.com
+                      - domain:google.com
+                      - serviceAccount:my-project-id@appspot.gserviceaccount.com
+                      role: roles/resourcemanager.organizationAdmin
+                    - members:
+                      - user:eve@example.com
+                      role: roles/resourcemanager.organizationViewer
+                      condition:
+                        title: expirable access
+                        description: Does not grant access after Sep 2020
+                        expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+                    etag: BwWWja0YfJA=
+                    version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = iam_policy_pb2.GetIamPolicyRequest(**request)
+ elif not request:
+ request = iam_policy_pb2.GetIamPolicyRequest(
+ resource=resource,
+ )
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_iam_policy,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def set_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.SetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Sets the IAM access control policy for the specified project, in
+ the format ``projects/{ProjectIdOrNumber}`` e.g. projects/123.
+
+ CAUTION: This method will replace the existing policy, and
+ cannot be used to append additional IAM settings.
+
+ Note: Removing service accounts from policies or changing their
+ roles can render services completely inoperable. It is important
+ to understand how the service account is being used before
+ removing or updating its roles.
+
+ The following constraints apply when using ``setIamPolicy()``:
+
+ - Project does not support ``allUsers`` and
+ ``allAuthenticatedUsers`` as ``members`` in a ``Binding`` of
+ a ``Policy``.
+
+ - The owner role can be granted to a ``user``,
+ ``serviceAccount``, or a group that is part of an
+ organization. For example, group@myownpersonaldomain.com
+ could be added as an owner to a project in the
+ myownpersonaldomain.com organization, but not the
+ examplepetstore.com organization.
+
+ - Service accounts can be made owners of a project directly
+ without any restrictions. However, to be added as an owner, a
+ user must be invited using the Cloud Platform console and
+ must accept the invitation.
+
+ - A user cannot be granted the owner role using
+ ``setIamPolicy()``. The user must be granted the owner role
+ using the Cloud Platform Console and must explicitly accept
+ the invitation.
+
+ - Invitations to grant the owner role cannot be sent using
+ ``setIamPolicy()``; they must be sent only using the Cloud
+ Platform Console.
+
+ - If the project is not part of an organization, there must be
+ at least one owner who has accepted the Terms of Service
+ (ToS) agreement in the policy. Calling ``setIamPolicy()`` to
+ remove the last ToS-accepted owner from the policy will fail.
+ This restriction also applies to legacy projects that no
+ longer have owners who have accepted the ToS. Edits to IAM
+ policies will be rejected until the lack of a ToS-accepting
+ owner is rectified. If the project is part of an
+ organization, you can remove all owners, potentially making
+ the organization inaccessible.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ async def sample_set_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.ProjectsAsyncClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.SetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = await client.set_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.iam.v1.iam_policy_pb2.SetIamPolicyRequest, dict]]):
+ The request object. Request message for ``SetIamPolicy`` method.
+ resource (:class:`str`):
+ REQUIRED: The resource for which the
+ policy is being specified. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+                **JSON example:**
+
+                .. code-block:: json
+
+                   {
+                     "bindings": [
+                       {
+                         "role": "roles/resourcemanager.organizationAdmin",
+                         "members": [
+                           "user:mike@example.com",
+                           "group:admins@example.com",
+                           "domain:google.com",
+                           "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+                         ]
+                       },
+                       {
+                         "role": "roles/resourcemanager.organizationViewer",
+                         "members": ["user:eve@example.com"],
+                         "condition": {
+                           "title": "expirable access",
+                           "description": "Does not grant access after Sep 2020",
+                           "expression": "request.time < timestamp('2020-10-01T00:00:00.000Z')"
+                         }
+                       }
+                     ],
+                     "etag": "BwWWja0YfJA=",
+                     "version": 3
+                   }
+
+                **YAML example:**
+
+                .. code-block:: yaml
+
+                   bindings:
+                   - members:
+                     - user:mike@example.com
+                     - group:admins@example.com
+                     - domain:google.com
+                     - serviceAccount:my-project-id@appspot.gserviceaccount.com
+                     role: roles/resourcemanager.organizationAdmin
+                   - members:
+                     - user:eve@example.com
+                     role: roles/resourcemanager.organizationViewer
+                     condition:
+                       title: expirable access
+                       description: Does not grant access after Sep 2020
+                       expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+                   etag: BwWWja0YfJA=
+                   version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = iam_policy_pb2.SetIamPolicyRequest(**request)
+ elif not request:
+ request = iam_policy_pb2.SetIamPolicyRequest(
+ resource=resource,
+ )
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.set_iam_policy,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def test_iam_permissions(
+ self,
+ request: Optional[Union[iam_policy_pb2.TestIamPermissionsRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ permissions: Optional[MutableSequence[str]] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+        r"""Returns permissions that a caller has on the specified project,
+        in the format ``projects/{ProjectIdOrNumber}``, e.g.
+        ``projects/123``.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ async def sample_test_iam_permissions():
+ # Create a client
+ client = resourcemanager_v3.ProjectsAsyncClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.TestIamPermissionsRequest(
+ resource="resource_value",
+ permissions=['permissions_value1', 'permissions_value2'],
+ )
+
+ # Make the request
+ response = await client.test_iam_permissions(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.iam.v1.iam_policy_pb2.TestIamPermissionsRequest, dict]]):
+ The request object. Request message for ``TestIamPermissions`` method.
+ resource (:class:`str`):
+ REQUIRED: The resource for which the
+ policy detail is being requested. See
+ the operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ permissions (:class:`MutableSequence[str]`):
+ The set of permissions to check for the ``resource``.
+ Permissions with wildcards (such as '*' or 'storage.*')
+ are not allowed. For more information see `IAM
+ Overview <https://cloud.google.com/iam/docs/overview#permissions>`__.
+
+ This corresponds to the ``permissions`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.iam_policy_pb2.TestIamPermissionsResponse:
+ Response message for TestIamPermissions method.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource, permissions])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = iam_policy_pb2.TestIamPermissionsRequest(**request)
+ elif not request:
+ request = iam_policy_pb2.TestIamPermissionsRequest(
+ resource=resource,
+ permissions=permissions,
+ )
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.test_iam_permissions,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_operation(
+ self,
+ request: Optional[operations_pb2.GetOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Gets the latest state of a long-running operation.
+
+ Args:
+ request (:class:`~.operations_pb2.GetOperationRequest`):
+ The request object. Request message for
+ `GetOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+                sent along with the request as metadata.
+
+        Returns:
+ ~.operations_pb2.Operation:
+ An ``Operation`` object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.GetOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._client._transport.get_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def __aenter__(self):
+ return self
+
+ async def __aexit__(self, exc_type, exc, tb):
+ await self.transport.close()
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("ProjectsAsyncClient",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/client.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/client.py
@@ -0,0 +1,2099 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import os
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+ cast,
+)
+
+from google.api_core import client_options as client_options_lib
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.exceptions import MutualTLSChannelError # type: ignore
+from google.auth.transport import mtls # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.api_core import operation # type: ignore
+from google.api_core import operation_async # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.services.projects import pagers
+from google.cloud.resourcemanager_v3.types import projects
+
+from .transports.base import DEFAULT_CLIENT_INFO, ProjectsTransport
+from .transports.grpc import ProjectsGrpcTransport
+from .transports.grpc_asyncio import ProjectsGrpcAsyncIOTransport
+from .transports.rest import ProjectsRestTransport
+
+
+class ProjectsClientMeta(type):
+ """Metaclass for the Projects client.
+
+ This provides class-level methods for building and retrieving
+ support objects (e.g. transport) without polluting the client instance
+ objects.
+ """
+
+ _transport_registry = OrderedDict() # type: Dict[str, Type[ProjectsTransport]]
+ _transport_registry["grpc"] = ProjectsGrpcTransport
+ _transport_registry["grpc_asyncio"] = ProjectsGrpcAsyncIOTransport
+ _transport_registry["rest"] = ProjectsRestTransport
+
+ def get_transport_class(
+ cls,
+ label: Optional[str] = None,
+ ) -> Type[ProjectsTransport]:
+ """Returns an appropriate transport class.
+
+ Args:
+ label: The name of the desired transport. If none is
+ provided, then the first transport in the registry is used.
+
+ Returns:
+ The transport class to use.
+ """
+ # If a specific transport is requested, return that one.
+ if label:
+ return cls._transport_registry[label]
+
+ # No transport is requested; return the default (that is, the first one
+ # in the dictionary).
+ return next(iter(cls._transport_registry.values()))
+
+
+class ProjectsClient(metaclass=ProjectsClientMeta):
+ """Manages Google Cloud Projects."""
+
+ @staticmethod
+ def _get_default_mtls_endpoint(api_endpoint):
+ """Converts api endpoint to mTLS endpoint.
+
+ Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to
+        "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively.
+
+        Args:
+            api_endpoint (Optional[str]): the api endpoint to convert.
+
+        Returns:
+ str: converted mTLS api endpoint.
+ """
+ if not api_endpoint:
+ return api_endpoint
+
+ mtls_endpoint_re = re.compile(
+ r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?"
+ )
+
+ m = mtls_endpoint_re.match(api_endpoint)
+ name, mtls, sandbox, googledomain = m.groups()
+ if mtls or not googledomain:
+ return api_endpoint
+
+ if sandbox:
+ return api_endpoint.replace(
+ "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
+ )
+
+ return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")
+
+ DEFAULT_ENDPOINT = "cloudresourcemanager.googleapis.com"
+ DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__( # type: ignore
+ DEFAULT_ENDPOINT
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ ProjectsClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_info(info)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ ProjectsClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_file(filename)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ from_service_account_json = from_service_account_file
+
+ @property
+ def transport(self) -> ProjectsTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ ProjectsTransport: The transport used by the client
+ instance.
+ """
+ return self._transport
+
+ @staticmethod
+ def project_path(
+ project: str,
+ ) -> str:
+ """Returns a fully-qualified project string."""
+ return "projects/{project}".format(
+ project=project,
+ )
+
+ @staticmethod
+ def parse_project_path(path: str) -> Dict[str, str]:
+ """Parses a project path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_billing_account_path(
+ billing_account: str,
+ ) -> str:
+ """Returns a fully-qualified billing_account string."""
+ return "billingAccounts/{billing_account}".format(
+ billing_account=billing_account,
+ )
+
+ @staticmethod
+ def parse_common_billing_account_path(path: str) -> Dict[str, str]:
+ """Parse a billing_account path into its component segments."""
+ m = re.match(r"^billingAccounts/(?P<billing_account>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_folder_path(
+ folder: str,
+ ) -> str:
+ """Returns a fully-qualified folder string."""
+ return "folders/{folder}".format(
+ folder=folder,
+ )
+
+ @staticmethod
+ def parse_common_folder_path(path: str) -> Dict[str, str]:
+ """Parse a folder path into its component segments."""
+ m = re.match(r"^folders/(?P<folder>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_organization_path(
+ organization: str,
+ ) -> str:
+ """Returns a fully-qualified organization string."""
+ return "organizations/{organization}".format(
+ organization=organization,
+ )
+
+ @staticmethod
+ def parse_common_organization_path(path: str) -> Dict[str, str]:
+        """Parse an organization path into its component segments."""
+ m = re.match(r"^organizations/(?P<organization>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_project_path(
+ project: str,
+ ) -> str:
+ """Returns a fully-qualified project string."""
+ return "projects/{project}".format(
+ project=project,
+ )
+
+ @staticmethod
+ def parse_common_project_path(path: str) -> Dict[str, str]:
+ """Parse a project path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_location_path(
+ project: str,
+ location: str,
+ ) -> str:
+ """Returns a fully-qualified location string."""
+ return "projects/{project}/locations/{location}".format(
+ project=project,
+ location=location,
+ )
+
+ @staticmethod
+ def parse_common_location_path(path: str) -> Dict[str, str]:
+ """Parse a location path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[client_options_lib.ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ use_client_cert = os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false")
+ use_mtls_endpoint = os.getenv("GOOGLE_API_USE_MTLS_ENDPOINT", "auto")
+ if use_client_cert not in ("true", "false"):
+ raise ValueError(
+ "Environment variable `GOOGLE_API_USE_CLIENT_CERTIFICATE` must be either `true` or `false`"
+ )
+ if use_mtls_endpoint not in ("auto", "never", "always"):
+ raise MutualTLSChannelError(
+ "Environment variable `GOOGLE_API_USE_MTLS_ENDPOINT` must be `never`, `auto` or `always`"
+ )
+
+ # Figure out the client cert source to use.
+ client_cert_source = None
+ if use_client_cert == "true":
+ if client_options.client_cert_source:
+ client_cert_source = client_options.client_cert_source
+ elif mtls.has_default_client_cert_source():
+ client_cert_source = mtls.default_client_cert_source()
+
+ # Figure out which api endpoint to use.
+ if client_options.api_endpoint is not None:
+ api_endpoint = client_options.api_endpoint
+ elif use_mtls_endpoint == "always" or (
+ use_mtls_endpoint == "auto" and client_cert_source
+ ):
+ api_endpoint = cls.DEFAULT_MTLS_ENDPOINT
+ else:
+ api_endpoint = cls.DEFAULT_ENDPOINT
+
+ return api_endpoint, client_cert_source
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Optional[Union[str, ProjectsTransport]] = None,
+ client_options: Optional[Union[client_options_lib.ClientOptions, dict]] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the projects client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, ProjectsTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (Optional[Union[google.api_core.client_options.ClientOptions, dict]]): Custom options for the
+ client. It won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ if isinstance(client_options, dict):
+ client_options = client_options_lib.from_dict(client_options)
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ client_options = cast(client_options_lib.ClientOptions, client_options)
+
+ api_endpoint, client_cert_source_func = self.get_mtls_endpoint_and_cert_source(
+ client_options
+ )
+
+ api_key_value = getattr(client_options, "api_key", None)
+ if api_key_value and credentials:
+ raise ValueError(
+ "client_options.api_key and credentials are mutually exclusive"
+ )
+
+ # Save or instantiate the transport.
+ # Ordinarily, we provide the transport, but allowing a custom transport
+ # instance provides an extensibility point for unusual situations.
+ if isinstance(transport, ProjectsTransport):
+ # transport is a ProjectsTransport instance.
+ if credentials or client_options.credentials_file or api_key_value:
+ raise ValueError(
+ "When providing a transport instance, "
+ "provide its credentials directly."
+ )
+ if client_options.scopes:
+ raise ValueError(
+ "When providing a transport instance, provide its scopes "
+ "directly."
+ )
+ self._transport = transport
+ else:
+ import google.auth._default # type: ignore
+
+ if api_key_value and hasattr(
+ google.auth._default, "get_api_key_credentials"
+ ):
+ credentials = google.auth._default.get_api_key_credentials(
+ api_key_value
+ )
+
+ Transport = type(self).get_transport_class(transport)
+ self._transport = Transport(
+ credentials=credentials,
+ credentials_file=client_options.credentials_file,
+ host=api_endpoint,
+ scopes=client_options.scopes,
+ client_cert_source_for_mtls=client_cert_source_func,
+ quota_project_id=client_options.quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=True,
+ api_audience=client_options.api_audience,
+ )
+
+ def get_project(
+ self,
+ request: Optional[Union[projects.GetProjectRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> projects.Project:
+ r"""Retrieves the project identified by the specified ``name`` (for
+ example, ``projects/415104041262``).
+
+ The caller must have ``resourcemanager.projects.get`` permission
+ for this project.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_get_project():
+ # Create a client
+ client = resourcemanager_v3.ProjectsClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.GetProjectRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_project(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.GetProjectRequest, dict]):
+ The request object. The request sent to the
+ [GetProject][google.cloud.resourcemanager.v3.Projects.GetProject]
+ method.
+ name (str):
+ Required. The name of the project (for example,
+ ``projects/415104041262``).
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.types.Project:
+ A project is a high-level Google
+ Cloud entity. It is a container for
+ ACLs, APIs, App Engine Apps, VMs, and
+ other Google Cloud Platform resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a projects.GetProjectRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, projects.GetProjectRequest):
+ request = projects.GetProjectRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_project]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_projects(
+ self,
+ request: Optional[Union[projects.ListProjectsRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListProjectsPager:
+ r"""Lists projects that are direct children of the specified folder
+ or organization resource. ``list()`` provides a strongly
+ consistent view of the projects underneath the specified parent
+ resource. ``list()`` returns projects sorted based upon the
+ (ascending) lexical ordering of their ``display_name``. The
+ caller must have ``resourcemanager.projects.list`` permission on
+ the identified parent.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_list_projects():
+ # Create a client
+ client = resourcemanager_v3.ProjectsClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.ListProjectsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_projects(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.ListProjectsRequest, dict]):
+ The request object. The request sent to the
+ [ListProjects][google.cloud.resourcemanager.v3.Projects.ListProjects]
+ method.
+ parent (str):
+ Required. The name of the parent resource whose projects
+ are being listed. Only children of this parent resource
+ are listed; descendants are not listed.
+
+ If the parent is a folder, use the value
+ ``folders/{folder_id}``. If the parent is an
+ organization, use the value ``organizations/{org_id}``.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.projects.pagers.ListProjectsPager:
+ A page of the response received from the
+ [ListProjects][google.cloud.resourcemanager.v3.Projects.ListProjects]
+ method.
+
+                A paginated response where more pages are available
+                has next_page_token set. This token can be used in a
+                subsequent request to retrieve the next page of results.
+
+ NOTE: A response may contain fewer elements than the
+ request page_size and still have a next_page_token.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a projects.ListProjectsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, projects.ListProjectsRequest):
+ request = projects.ListProjectsRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_projects]
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListProjectsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def search_projects(
+ self,
+ request: Optional[Union[projects.SearchProjectsRequest, dict]] = None,
+ *,
+ query: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.SearchProjectsPager:
+ r"""Search for projects that the caller has both
+ ``resourcemanager.projects.get`` permission on, and also satisfy
+ the specified query.
+
+ This method returns projects in an unspecified order.
+
+ This method is eventually consistent with project mutations;
+ this means that a newly created project may not appear in the
+ results or recent updates to an existing project may not be
+ reflected in the results. To retrieve the latest state of a
+ project, use the
+ [GetProject][google.cloud.resourcemanager.v3.Projects.GetProject]
+ method.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_search_projects():
+ # Create a client
+ client = resourcemanager_v3.ProjectsClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.SearchProjectsRequest(
+ )
+
+ # Make the request
+ page_result = client.search_projects(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.SearchProjectsRequest, dict]):
+ The request object. The request sent to the
+ [SearchProjects][google.cloud.resourcemanager.v3.Projects.SearchProjects]
+ method.
+ query (str):
+ Optional. A query string for searching for projects that
+ the caller has ``resourcemanager.projects.get``
+ permission to. If multiple fields are included in the
+ query, then it will return results that match any of the
+ fields. Some eligible fields are:
+
+ - **``displayName``, ``name``**: Filters by
+ displayName.
+ - **``parent``**: Project's parent (for example:
+ ``folders/123``, ``organizations/*``). Prefer
+ ``parent`` field over ``parent.type`` and
+ ``parent.id``.
+ - **``parent.type``**: Parent's type: ``folder`` or
+ ``organization``.
+ - **``parent.id``**: Parent's id number (for example:
+ ``123``).
+ - **``id``, ``projectId``**: Filters by projectId.
+ - **``state``, ``lifecycleState``**: Filters by state.
+ - **``labels``**: Filters by label name or value.
+ - **``labels.<key>`` (where ``<key>`` is the name of a
+ label)**: Filters by label name.
+
+ Search expressions are case insensitive.
+
+                Some example queries:
+
+ - **``name:how*``**: The project's name starts with
+ "how".
+ - **``name:Howl``**: The project's name is ``Howl`` or
+ ``howl``.
+ - **``name:HOWL``**: Equivalent to above.
+ - **``NAME:howl``**: Equivalent to above.
+ - **``labels.color:*``**: The project has the label
+ ``color``.
+ - **``labels.color:red``**: The project's label
+ ``color`` has the value ``red``.
+ - **``labels.color:red labels.size:big``**: The
+ project's label ``color`` has the value ``red`` or
+ its label ``size`` has the value ``big``.
+
+ If no query is specified, the call will return projects
+ for which the user has the
+ ``resourcemanager.projects.get`` permission.
+
+ This corresponds to the ``query`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.projects.pagers.SearchProjectsPager:
+ A page of the response received from the
+ [SearchProjects][google.cloud.resourcemanager.v3.Projects.SearchProjects]
+ method.
+
+ A paginated response where more pages are available
+ has next_page_token set. This token can be used in a
+ subsequent request to retrieve the next request page.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([query])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a projects.SearchProjectsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, projects.SearchProjectsRequest):
+ request = projects.SearchProjectsRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if query is not None:
+ request.query = query
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.search_projects]
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.SearchProjectsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def create_project(
+ self,
+ request: Optional[Union[projects.CreateProjectRequest, dict]] = None,
+ *,
+ project: Optional[projects.Project] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Request that a new project be created. The result is an
+ ``Operation`` which can be used to track the creation process.
+ This process usually takes a few seconds, but can sometimes take
+ much longer. The tracking ``Operation`` is automatically deleted
+ after a few hours, so there is no need to call
+ ``DeleteOperation``.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_create_project():
+ # Create a client
+ client = resourcemanager_v3.ProjectsClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.CreateProjectRequest(
+ )
+
+ # Make the request
+ operation = client.create_project(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.CreateProjectRequest, dict]):
+ The request object. The request sent to the
+ [CreateProject][google.cloud.resourcemanager.v3.Projects.CreateProject]
+ method.
+ project (google.cloud.resourcemanager_v3.types.Project):
+ Required. The Project to create.
+
+ Project ID is required. If the requested ID is
+ unavailable, the request fails.
+
+ If the ``parent`` field is set, the
+ ``resourcemanager.projects.create`` permission is
+ checked on the parent resource. If no parent is set and
+ the authorization credentials belong to an Organization,
+ the parent will be set to that Organization.
+
+ This corresponds to the ``project`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Project` A project is a high-level Google Cloud entity. It is a
+ container for ACLs, APIs, App Engine Apps, VMs, and
+ other Google Cloud Platform resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([project])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a projects.CreateProjectRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, projects.CreateProjectRequest):
+ request = projects.CreateProjectRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if project is not None:
+ request.project = project
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.create_project]
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ projects.Project,
+ metadata_type=projects.CreateProjectMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def update_project(
+ self,
+ request: Optional[Union[projects.UpdateProjectRequest, dict]] = None,
+ *,
+ project: Optional[projects.Project] = None,
+ update_mask: Optional[field_mask_pb2.FieldMask] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Updates the ``display_name`` and labels of the project
+ identified by the specified ``name`` (for example,
+ ``projects/415104041262``). Deleting all labels requires an
+ update mask for labels field.
+
+ The caller must have ``resourcemanager.projects.update``
+ permission for this project.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_update_project():
+ # Create a client
+ client = resourcemanager_v3.ProjectsClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.UpdateProjectRequest(
+ )
+
+ # Make the request
+ operation = client.update_project(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.UpdateProjectRequest, dict]):
+ The request object. The request sent to the
+ [UpdateProject][google.cloud.resourcemanager.v3.Projects.UpdateProject]
+ method.
+
+ Only the ``display_name`` and ``labels`` fields can be
+                changed. Use the
+ [MoveProject][google.cloud.resourcemanager.v3.Projects.MoveProject]
+ method to change the ``parent`` field.
+ project (google.cloud.resourcemanager_v3.types.Project):
+ Required. The new definition of the
+ project.
+
+ This corresponds to the ``project`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Optional. An update mask to
+ selectively update fields.
+
+ This corresponds to the ``update_mask`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Project` A project is a high-level Google Cloud entity. It is a
+ container for ACLs, APIs, App Engine Apps, VMs, and
+ other Google Cloud Platform resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([project, update_mask])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a projects.UpdateProjectRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, projects.UpdateProjectRequest):
+ request = projects.UpdateProjectRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if project is not None:
+ request.project = project
+ if update_mask is not None:
+ request.update_mask = update_mask
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.update_project]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("project.name", request.project.name),)
+ ),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ projects.Project,
+ metadata_type=projects.UpdateProjectMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def move_project(
+ self,
+ request: Optional[Union[projects.MoveProjectRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ destination_parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Move a project to another place in your resource hierarchy,
+ under a new resource parent.
+
+ Returns an operation which can be used to track the process of
+ the project move workflow. Upon success, the
+ ``Operation.response`` field will be populated with the moved
+ project.
+
+ The caller must have ``resourcemanager.projects.move``
+ permission on the project, on the project's current and proposed
+ new parent.
+
+        If the project has no current parent, or it currently does not have
+ an associated organization resource, you will also need the
+ ``resourcemanager.projects.setIamPolicy`` permission in the
+ project.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_move_project():
+ # Create a client
+ client = resourcemanager_v3.ProjectsClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.MoveProjectRequest(
+ name="name_value",
+ destination_parent="destination_parent_value",
+ )
+
+ # Make the request
+ operation = client.move_project(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.MoveProjectRequest, dict]):
+ The request object. The request sent to
+ [MoveProject][google.cloud.resourcemanager.v3.Projects.MoveProject]
+ method.
+ name (str):
+ Required. The name of the project to
+ move.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ destination_parent (str):
+ Required. The new parent to move the
+ Project under.
+
+ This corresponds to the ``destination_parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Project` A project is a high-level Google Cloud entity. It is a
+ container for ACLs, APIs, App Engine Apps, VMs, and
+ other Google Cloud Platform resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name, destination_parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a projects.MoveProjectRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, projects.MoveProjectRequest):
+ request = projects.MoveProjectRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+ if destination_parent is not None:
+ request.destination_parent = destination_parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.move_project]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ projects.Project,
+ metadata_type=projects.MoveProjectMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def delete_project(
+ self,
+ request: Optional[Union[projects.DeleteProjectRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Marks the project identified by the specified ``name`` (for
+ example, ``projects/415104041262``) for deletion.
+
+ This method will only affect the project if it has a lifecycle
+ state of
+ [ACTIVE][google.cloud.resourcemanager.v3.Project.State.ACTIVE].
+
+ This method changes the Project's lifecycle state from
+ [ACTIVE][google.cloud.resourcemanager.v3.Project.State.ACTIVE]
+ to
+ [DELETE_REQUESTED][google.cloud.resourcemanager.v3.Project.State.DELETE_REQUESTED].
+ The deletion starts at an unspecified time, at which point the
+ Project is no longer accessible.
+
+ Until the deletion completes, you can check the lifecycle state
+        by retrieving the project with [GetProject]
+ [google.cloud.resourcemanager.v3.Projects.GetProject], and the
+ project remains visible to [ListProjects]
+ [google.cloud.resourcemanager.v3.Projects.ListProjects].
+ However, you cannot update the project.
+
+ After the deletion completes, the project is not retrievable by
+ the [GetProject]
+ [google.cloud.resourcemanager.v3.Projects.GetProject],
+ [ListProjects]
+ [google.cloud.resourcemanager.v3.Projects.ListProjects], and
+ [SearchProjects][google.cloud.resourcemanager.v3.Projects.SearchProjects]
+ methods.
+
+ This method behaves idempotently, such that deleting a
+ ``DELETE_REQUESTED`` project will not cause an error, but also
+ won't do anything.
+
+ The caller must have ``resourcemanager.projects.delete``
+ permissions for this project.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_delete_project():
+ # Create a client
+ client = resourcemanager_v3.ProjectsClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.DeleteProjectRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.delete_project(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.DeleteProjectRequest, dict]):
+                The request object. The request sent to the
+                [DeleteProject][google.cloud.resourcemanager.v3.Projects.DeleteProject]
+ method.
+ name (str):
+ Required. The name of the Project (for example,
+ ``projects/415104041262``).
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Project` A project is a high-level Google Cloud entity. It is a
+ container for ACLs, APIs, App Engine Apps, VMs, and
+ other Google Cloud Platform resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a projects.DeleteProjectRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, projects.DeleteProjectRequest):
+ request = projects.DeleteProjectRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.delete_project]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ projects.Project,
+ metadata_type=projects.DeleteProjectMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def undelete_project(
+ self,
+ request: Optional[Union[projects.UndeleteProjectRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Restores the project identified by the specified ``name`` (for
+ example, ``projects/415104041262``). You can only use this
+ method for a project that has a lifecycle state of
+ [DELETE_REQUESTED] [Projects.State.DELETE_REQUESTED]. After
+ deletion starts, the project cannot be restored.
+
+ The caller must have ``resourcemanager.projects.undelete``
+ permission for this project.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_undelete_project():
+ # Create a client
+ client = resourcemanager_v3.ProjectsClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.UndeleteProjectRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.undelete_project(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.UndeleteProjectRequest, dict]):
+ The request object. The request sent to the [UndeleteProject]
+ [google.cloud.resourcemanager.v3.Projects.UndeleteProject]
+ method.
+ name (str):
+ Required. The name of the project (for example,
+ ``projects/415104041262``).
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.Project` A project is a high-level Google Cloud entity. It is a
+ container for ACLs, APIs, App Engine Apps, VMs, and
+ other Google Cloud Platform resources.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a projects.UndeleteProjectRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, projects.UndeleteProjectRequest):
+ request = projects.UndeleteProjectRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.undelete_project]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ projects.Project,
+ metadata_type=projects.UndeleteProjectMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.GetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Returns the IAM access control policy for the specified project,
+ in the format ``projects/{ProjectIdOrNumber}`` e.g.
+ projects/123. Permission is denied if the policy or the resource
+ do not exist.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ def sample_get_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.ProjectsClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.GetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = client.get_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.iam.v1.iam_policy_pb2.GetIamPolicyRequest, dict]):
+ The request object. Request message for ``GetIamPolicy`` method.
+ resource (str):
+ REQUIRED: The resource for which the
+ policy is being requested. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+ **JSON example:**
+
+ {
+ "bindings": [
+ {
+ "role":
+ "roles/resourcemanager.organizationAdmin",
+ "members": [ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+
+ }, { "role":
+ "roles/resourcemanager.organizationViewer",
+ "members": [ "user:eve@example.com" ],
+ "condition": { "title": "expirable access",
+ "description": "Does not grant access after
+ Sep 2020", "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')", } }
+
+ ], "etag": "BwWWja0YfJA=", "version": 3
+
+ }
+
+ **YAML example:**
+
+ bindings: - members: - user:\ mike@example.com -
+ group:\ admins@example.com - domain:google.com -
+ serviceAccount:\ my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin -
+ members: - user:\ eve@example.com role:
+ roles/resourcemanager.organizationViewer
+ condition: title: expirable access description:
+ Does not grant access after Sep 2020 expression:
+ request.time <
+ timestamp('2020-10-01T00:00:00.000Z') etag:
+ BwWWja0YfJA= version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ if isinstance(request, dict):
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ request = iam_policy_pb2.GetIamPolicyRequest(**request)
+ elif not request:
+ # Null request, just make one.
+ request = iam_policy_pb2.GetIamPolicyRequest()
+ if resource is not None:
+ request.resource = resource
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_iam_policy]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def set_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.SetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Sets the IAM access control policy for the specified project, in
+ the format ``projects/{ProjectIdOrNumber}`` e.g. projects/123.
+
+ CAUTION: This method will replace the existing policy, and
+ cannot be used to append additional IAM settings.
+
+ Note: Removing service accounts from policies or changing their
+ roles can render services completely inoperable. It is important
+ to understand how the service account is being used before
+ removing or updating its roles.
+
+ The following constraints apply when using ``setIamPolicy()``:
+
+ - Project does not support ``allUsers`` and
+ ``allAuthenticatedUsers`` as ``members`` in a ``Binding`` of
+ a ``Policy``.
+
+ - The owner role can be granted to a ``user``,
+ ``serviceAccount``, or a group that is part of an
+ organization. For example, group@myownpersonaldomain.com
+ could be added as an owner to a project in the
+ myownpersonaldomain.com organization, but not the
+ examplepetstore.com organization.
+
+ - Service accounts can be made owners of a project directly
+ without any restrictions. However, to be added as an owner, a
+ user must be invited using the Cloud Platform console and
+ must accept the invitation.
+
+ - A user cannot be granted the owner role using
+ ``setIamPolicy()``. The user must be granted the owner role
+ using the Cloud Platform Console and must explicitly accept
+ the invitation.
+
+ - Invitations to grant the owner role cannot be sent using
+ ``setIamPolicy()``; they must be sent only using the Cloud
+ Platform Console.
+
+ - If the project is not part of an organization, there must be
+ at least one owner who has accepted the Terms of Service
+ (ToS) agreement in the policy. Calling ``setIamPolicy()`` to
+ remove the last ToS-accepted owner from the policy will fail.
+ This restriction also applies to legacy projects that no
+ longer have owners who have accepted the ToS. Edits to IAM
+ policies will be rejected until the lack of a ToS-accepting
+ owner is rectified. If the project is part of an
+ organization, you can remove all owners, potentially making
+ the organization inaccessible.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ def sample_set_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.ProjectsClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.SetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = client.set_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.iam.v1.iam_policy_pb2.SetIamPolicyRequest, dict]):
+ The request object. Request message for ``SetIamPolicy`` method.
+ resource (str):
+ REQUIRED: The resource for which the
+ policy is being specified. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+ **JSON example:**
+
+ {
+ "bindings": [
+ {
+ "role":
+ "roles/resourcemanager.organizationAdmin",
+ "members": [ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+
+ }, { "role":
+ "roles/resourcemanager.organizationViewer",
+ "members": [ "user:eve@example.com" ],
+ "condition": { "title": "expirable access",
+ "description": "Does not grant access after
+ Sep 2020", "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')", } }
+
+ ], "etag": "BwWWja0YfJA=", "version": 3
+
+ }
+
+ **YAML example:**
+
+ bindings: - members: - user:\ mike@example.com -
+ group:\ admins@example.com - domain:google.com -
+ serviceAccount:\ my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin -
+ members: - user:\ eve@example.com role:
+ roles/resourcemanager.organizationViewer
+ condition: title: expirable access description:
+ Does not grant access after Sep 2020 expression:
+ request.time <
+ timestamp('2020-10-01T00:00:00.000Z') etag:
+ BwWWja0YfJA= version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ if isinstance(request, dict):
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ request = iam_policy_pb2.SetIamPolicyRequest(**request)
+ elif not request:
+ # Null request, just make one.
+ request = iam_policy_pb2.SetIamPolicyRequest()
+ if resource is not None:
+ request.resource = resource
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.set_iam_policy]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def test_iam_permissions(
+ self,
+ request: Optional[Union[iam_policy_pb2.TestIamPermissionsRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ permissions: Optional[MutableSequence[str]] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ r"""Returns permissions that a caller has on the specified project,
+ in the format ``projects/{ProjectIdOrNumber}`` e.g.
+        projects/123.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ def sample_test_iam_permissions():
+ # Create a client
+ client = resourcemanager_v3.ProjectsClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.TestIamPermissionsRequest(
+ resource="resource_value",
+ permissions=['permissions_value1', 'permissions_value2'],
+ )
+
+ # Make the request
+ response = client.test_iam_permissions(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.iam.v1.iam_policy_pb2.TestIamPermissionsRequest, dict]):
+ The request object. Request message for ``TestIamPermissions`` method.
+ resource (str):
+ REQUIRED: The resource for which the
+ policy detail is being requested. See
+ the operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ permissions (MutableSequence[str]):
+ The set of permissions to check for the ``resource``.
+ Permissions with wildcards (such as '*' or 'storage.*')
+ are not allowed. For more information see `IAM
+ Overview <https://cloud.google.com/iam/docs/overview#permissions>`__.
+
+ This corresponds to the ``permissions`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.iam_policy_pb2.TestIamPermissionsResponse:
+ Response message for TestIamPermissions method.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource, permissions])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ if isinstance(request, dict):
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ request = iam_policy_pb2.TestIamPermissionsRequest(**request)
+ elif not request:
+ # Null request, just make one.
+ request = iam_policy_pb2.TestIamPermissionsRequest()
+ if resource is not None:
+ request.resource = resource
+ if permissions:
+ request.permissions.extend(permissions)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.test_iam_permissions]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def __enter__(self) -> "ProjectsClient":
+ return self
+
+ def __exit__(self, type, value, traceback):
+ """Releases underlying transport's resources.
+
+ .. warning::
+ ONLY use as a context manager if the transport is NOT shared
+ with other clients! Exiting the with block will CLOSE the transport
+ and may cause errors in other clients!
+ """
+ self.transport.close()
+
+ def get_operation(
+ self,
+ request: Optional[operations_pb2.GetOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Gets the latest state of a long-running operation.
+
+ Args:
+ request (:class:`~.operations_pb2.GetOperationRequest`):
+ The request object. Request message for
+ `GetOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.Operation:
+ An ``Operation`` object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.GetOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._transport.get_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("ProjectsClient",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/pagers.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/pagers.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/pagers.py
@@ -0,0 +1,283 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import (
+ Any,
+ AsyncIterator,
+ Awaitable,
+ Callable,
+ Iterator,
+ Optional,
+ Sequence,
+ Tuple,
+)
+
+from google.cloud.resourcemanager_v3.types import projects
+
+
+class ListProjectsPager:
+ """A pager for iterating through ``list_projects`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.ListProjectsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``projects`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListProjects`` requests and continue to iterate
+ through the ``projects`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.ListProjectsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., projects.ListProjectsResponse],
+ request: projects.ListProjectsRequest,
+ response: projects.ListProjectsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.ListProjectsRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.ListProjectsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = projects.ListProjectsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[projects.ListProjectsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[projects.Project]:
+ for page in self.pages:
+ yield from page.projects
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListProjectsAsyncPager:
+ """A pager for iterating through ``list_projects`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.ListProjectsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``projects`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListProjects`` requests and continue to iterate
+ through the ``projects`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.ListProjectsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[projects.ListProjectsResponse]],
+ request: projects.ListProjectsRequest,
+ response: projects.ListProjectsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.ListProjectsRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.ListProjectsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = projects.ListProjectsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[projects.ListProjectsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[projects.Project]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.projects:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class SearchProjectsPager:
+ """A pager for iterating through ``search_projects`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.SearchProjectsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``projects`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``SearchProjects`` requests and continue to iterate
+ through the ``projects`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.SearchProjectsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., projects.SearchProjectsResponse],
+ request: projects.SearchProjectsRequest,
+ response: projects.SearchProjectsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.SearchProjectsRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.SearchProjectsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = projects.SearchProjectsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[projects.SearchProjectsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[projects.Project]:
+ for page in self.pages:
+ yield from page.projects
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class SearchProjectsAsyncPager:
+ """A pager for iterating through ``search_projects`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.SearchProjectsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``projects`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``SearchProjects`` requests and continue to iterate
+ through the ``projects`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.SearchProjectsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[projects.SearchProjectsResponse]],
+ request: projects.SearchProjectsRequest,
+ response: projects.SearchProjectsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.SearchProjectsRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.SearchProjectsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = projects.SearchProjectsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[projects.SearchProjectsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[projects.Project]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.projects:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/transports/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/transports/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/transports/__init__.py
@@ -0,0 +1,36 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+from typing import Dict, Type
+
+from .base import ProjectsTransport
+from .grpc import ProjectsGrpcTransport
+from .grpc_asyncio import ProjectsGrpcAsyncIOTransport
+from .rest import ProjectsRestInterceptor, ProjectsRestTransport
+
+# Compile a registry of transports.
+_transport_registry = OrderedDict() # type: Dict[str, Type[ProjectsTransport]]
+_transport_registry["grpc"] = ProjectsGrpcTransport
+_transport_registry["grpc_asyncio"] = ProjectsGrpcAsyncIOTransport
+_transport_registry["rest"] = ProjectsRestTransport
+
+__all__ = (
+ "ProjectsTransport",
+ "ProjectsGrpcTransport",
+ "ProjectsGrpcAsyncIOTransport",
+ "ProjectsRestTransport",
+ "ProjectsRestInterceptor",
+)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/transports/base.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/transports/base.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/transports/base.py
@@ -0,0 +1,347 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import abc
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Union
+
+import google.api_core
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1, operations_v1
+from google.api_core import retry as retries
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+from google.cloud.resourcemanager_v3.types import projects
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+class ProjectsTransport(abc.ABC):
+ """Abstract transport class for Projects."""
+
+ AUTH_SCOPES = (
+ "https://www.googleapis.com/auth/cloud-platform",
+ "https://www.googleapis.com/auth/cloud-platform.read-only",
+ )
+
+ DEFAULT_HOST: str = "cloudresourcemanager.googleapis.com"
+
+ def __init__(
+ self,
+ *,
+ host: str = DEFAULT_HOST,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ **kwargs,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+ scopes (Optional[Sequence[str]]): A list of scopes.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ """
+
+ scopes_kwargs = {"scopes": scopes, "default_scopes": self.AUTH_SCOPES}
+
+ # Save the scopes.
+ self._scopes = scopes
+
+ # If no credentials are provided, then determine the appropriate
+ # defaults.
+ if credentials and credentials_file:
+ raise core_exceptions.DuplicateCredentialArgs(
+ "'credentials_file' and 'credentials' are mutually exclusive"
+ )
+
+ if credentials_file is not None:
+ credentials, _ = google.auth.load_credentials_from_file(
+ credentials_file, **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ elif credentials is None:
+ credentials, _ = google.auth.default(
+ **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ # Don't apply audience if the credentials file passed from user.
+ if hasattr(credentials, "with_gdch_audience"):
+ credentials = credentials.with_gdch_audience(
+ api_audience if api_audience else host
+ )
+
+ # If the credentials are service account credentials, then always try to use self signed JWT.
+ if (
+ always_use_jwt_access
+ and isinstance(credentials, service_account.Credentials)
+ and hasattr(service_account.Credentials, "with_always_use_jwt_access")
+ ):
+ credentials = credentials.with_always_use_jwt_access(True)
+
+ # Save the credentials.
+ self._credentials = credentials
+
+ # Save the hostname. Default to port 443 (HTTPS) if none is specified.
+ if ":" not in host:
+ host += ":443"
+ self._host = host
+
+ def _prep_wrapped_messages(self, client_info):
+ # Precompute the wrapped methods.
+ self._wrapped_methods = {
+ self.get_project: gapic_v1.method.wrap_method(
+ self.get_project,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.list_projects: gapic_v1.method.wrap_method(
+ self.list_projects,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.search_projects: gapic_v1.method.wrap_method(
+ self.search_projects,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.create_project: gapic_v1.method.wrap_method(
+ self.create_project,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.update_project: gapic_v1.method.wrap_method(
+ self.update_project,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.move_project: gapic_v1.method.wrap_method(
+ self.move_project,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.delete_project: gapic_v1.method.wrap_method(
+ self.delete_project,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.undelete_project: gapic_v1.method.wrap_method(
+ self.undelete_project,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.get_iam_policy: gapic_v1.method.wrap_method(
+ self.get_iam_policy,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.set_iam_policy: gapic_v1.method.wrap_method(
+ self.set_iam_policy,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.test_iam_permissions: gapic_v1.method.wrap_method(
+ self.test_iam_permissions,
+ default_timeout=None,
+ client_info=client_info,
+ ),
+ }
+
+ def close(self):
+ """Closes resources associated with the transport.
+
+ .. warning::
+ Only call this method if the transport is NOT shared
+ with other clients - this may cause errors in other clients!
+ """
+ raise NotImplementedError()
+
+ @property
+ def operations_client(self):
+ """Return the client designed to process long-running operations."""
+ raise NotImplementedError()
+
+ @property
+ def get_project(
+ self,
+ ) -> Callable[
+ [projects.GetProjectRequest],
+ Union[projects.Project, Awaitable[projects.Project]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_projects(
+ self,
+ ) -> Callable[
+ [projects.ListProjectsRequest],
+ Union[projects.ListProjectsResponse, Awaitable[projects.ListProjectsResponse]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def search_projects(
+ self,
+ ) -> Callable[
+ [projects.SearchProjectsRequest],
+ Union[
+ projects.SearchProjectsResponse, Awaitable[projects.SearchProjectsResponse]
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def create_project(
+ self,
+ ) -> Callable[
+ [projects.CreateProjectRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def update_project(
+ self,
+ ) -> Callable[
+ [projects.UpdateProjectRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def move_project(
+ self,
+ ) -> Callable[
+ [projects.MoveProjectRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def delete_project(
+ self,
+ ) -> Callable[
+ [projects.DeleteProjectRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def undelete_project(
+ self,
+ ) -> Callable[
+ [projects.UndeleteProjectRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.GetIamPolicyRequest],
+ Union[policy_pb2.Policy, Awaitable[policy_pb2.Policy]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.SetIamPolicyRequest],
+ Union[policy_pb2.Policy, Awaitable[policy_pb2.Policy]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ Union[
+ iam_policy_pb2.TestIamPermissionsResponse,
+ Awaitable[iam_policy_pb2.TestIamPermissionsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[
+ [operations_pb2.GetOperationRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def kind(self) -> str:
+ raise NotImplementedError()
+
+
+__all__ = ("ProjectsTransport",)
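The abstract base above declares each RPC as a property raising `NotImplementedError`; the concrete gRPC transport that follows overrides them with a lazy stub-caching pattern, building each channel stub on first access and reusing it afterwards. A minimal stand-alone sketch of that pattern (hypothetical names, a fake channel instead of a real `grpc.Channel`) looks like:

```python
# Simplified, hypothetical sketch of the lazy stub-caching pattern used
# by the generated transport: an RPC property creates its stub once,
# stores it in ``_stubs``, and returns the cached callable thereafter.
from typing import Callable, Dict


class FakeChannel:
    """Stands in for grpc.Channel; counts how many stubs it creates."""

    def __init__(self) -> None:
        self.created = 0

    def unary_unary(self, path: str) -> Callable[[str], str]:
        self.created += 1
        return lambda request: f"{path} <- {request}"


class SketchTransport:
    def __init__(self, channel: FakeChannel) -> None:
        self._channel = channel
        self._stubs: Dict[str, Callable] = {}

    @property
    def get_project(self) -> Callable[[str], str]:
        # Build the stub only on first use, mirroring the generated code.
        if "get_project" not in self._stubs:
            self._stubs["get_project"] = self._channel.unary_unary(
                "/google.cloud.resourcemanager.v3.Projects/GetProject"
            )
        return self._stubs["get_project"]


channel = FakeChannel()
transport = SketchTransport(channel)
transport.get_project("projects/123")
transport.get_project("projects/456")
print(channel.created)  # -> 1: the stub is built once and cached
```

Because the property caches per transport instance, repeated calls through the same client add no per-call stub-construction cost; the real transport applies the same pattern to every RPC on the channel.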
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/transports/grpc.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/transports/grpc.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/transports/grpc.py
@@ -0,0 +1,702 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers, operations_v1
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+import grpc # type: ignore
+
+from google.cloud.resourcemanager_v3.types import projects
+
+from .base import DEFAULT_CLIENT_INFO, ProjectsTransport
+
+
+class ProjectsGrpcTransport(ProjectsTransport):
+ """gRPC backend transport for Projects.
+
+ Manages Google Cloud Projects.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _stubs: Dict[str, Callable]
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[grpc.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ channel (Optional[grpc.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+ self._operations_client: Optional[operations_v1.OperationsClient] = None
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> grpc.Channel:
+ """Create and return a gRPC channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ grpc.Channel: A gRPC channel object.
+
+ Raises:
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+
+ return grpc_helpers.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ @property
+ def grpc_channel(self) -> grpc.Channel:
+ """Return the channel designed to connect to this service."""
+ return self._grpc_channel
+
+ @property
+ def operations_client(self) -> operations_v1.OperationsClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Quick check: Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ self._operations_client = operations_v1.OperationsClient(self.grpc_channel)
+
+ # Return the client from cache.
+ return self._operations_client
+
+ @property
+ def get_project(self) -> Callable[[projects.GetProjectRequest], projects.Project]:
+ r"""Return a callable for the get project method over gRPC.
+
+ Retrieves the project identified by the specified ``name`` (for
+ example, ``projects/415104041262``).
+
+ The caller must have ``resourcemanager.projects.get`` permission
+ for this project.
+
+ Returns:
+ Callable[[~.GetProjectRequest],
+ ~.Project]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_project" not in self._stubs:
+ self._stubs["get_project"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/GetProject",
+ request_serializer=projects.GetProjectRequest.serialize,
+ response_deserializer=projects.Project.deserialize,
+ )
+ return self._stubs["get_project"]
+
+ @property
+ def list_projects(
+ self,
+ ) -> Callable[[projects.ListProjectsRequest], projects.ListProjectsResponse]:
+ r"""Return a callable for the list projects method over gRPC.
+
+ Lists projects that are direct children of the specified folder
+ or organization resource. ``list()`` provides a strongly
+ consistent view of the projects underneath the specified parent
+ resource. ``list()`` returns projects sorted based upon the
+ (ascending) lexical ordering of their ``display_name``. The
+ caller must have ``resourcemanager.projects.list`` permission on
+ the identified parent.
+
+ Returns:
+ Callable[[~.ListProjectsRequest],
+ ~.ListProjectsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_projects" not in self._stubs:
+ self._stubs["list_projects"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/ListProjects",
+ request_serializer=projects.ListProjectsRequest.serialize,
+ response_deserializer=projects.ListProjectsResponse.deserialize,
+ )
+ return self._stubs["list_projects"]
+
+ @property
+ def search_projects(
+ self,
+ ) -> Callable[[projects.SearchProjectsRequest], projects.SearchProjectsResponse]:
+ r"""Return a callable for the search projects method over gRPC.
+
+ Search for projects that the caller has both
+ ``resourcemanager.projects.get`` permission on, and also satisfy
+ the specified query.
+
+ This method returns projects in an unspecified order.
+
+ This method is eventually consistent with project mutations;
+ this means that a newly created project may not appear in the
+ results or recent updates to an existing project may not be
+ reflected in the results. To retrieve the latest state of a
+ project, use the
+ [GetProject][google.cloud.resourcemanager.v3.Projects.GetProject]
+ method.
+
+ Returns:
+ Callable[[~.SearchProjectsRequest],
+ ~.SearchProjectsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "search_projects" not in self._stubs:
+ self._stubs["search_projects"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/SearchProjects",
+ request_serializer=projects.SearchProjectsRequest.serialize,
+ response_deserializer=projects.SearchProjectsResponse.deserialize,
+ )
+ return self._stubs["search_projects"]
+
+ @property
+ def create_project(
+ self,
+ ) -> Callable[[projects.CreateProjectRequest], operations_pb2.Operation]:
+ r"""Return a callable for the create project method over gRPC.
+
+ Request that a new project be created. The result is an
+ ``Operation`` which can be used to track the creation process.
+ This process usually takes a few seconds, but can sometimes take
+ much longer. The tracking ``Operation`` is automatically deleted
+ after a few hours, so there is no need to call
+ ``DeleteOperation``.
+
+ Returns:
+ Callable[[~.CreateProjectRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_project" not in self._stubs:
+ self._stubs["create_project"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/CreateProject",
+ request_serializer=projects.CreateProjectRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["create_project"]
+
+ @property
+ def update_project(
+ self,
+ ) -> Callable[[projects.UpdateProjectRequest], operations_pb2.Operation]:
+ r"""Return a callable for the update project method over gRPC.
+
+ Updates the ``display_name`` and labels of the project
+ identified by the specified ``name`` (for example,
+ ``projects/415104041262``). Deleting all labels requires an
+ update mask for labels field.
+
+ The caller must have ``resourcemanager.projects.update``
+ permission for this project.
+
+ Returns:
+ Callable[[~.UpdateProjectRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_project" not in self._stubs:
+ self._stubs["update_project"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/UpdateProject",
+ request_serializer=projects.UpdateProjectRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["update_project"]
+
+ @property
+ def move_project(
+ self,
+ ) -> Callable[[projects.MoveProjectRequest], operations_pb2.Operation]:
+ r"""Return a callable for the move project method over gRPC.
+
+ Move a project to another place in your resource hierarchy,
+ under a new resource parent.
+
+ Returns an operation which can be used to track the process of
+ the project move workflow. Upon success, the
+ ``Operation.response`` field will be populated with the moved
+ project.
+
+ The caller must have ``resourcemanager.projects.move``
+ permission on the project, on the project's current and proposed
+ new parent.
+
+        If the project has no current parent, or it currently does not have
+ an associated organization resource, you will also need the
+ ``resourcemanager.projects.setIamPolicy`` permission in the
+ project.
+
+ Returns:
+ Callable[[~.MoveProjectRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "move_project" not in self._stubs:
+ self._stubs["move_project"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/MoveProject",
+ request_serializer=projects.MoveProjectRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["move_project"]
+
+ @property
+ def delete_project(
+ self,
+ ) -> Callable[[projects.DeleteProjectRequest], operations_pb2.Operation]:
+ r"""Return a callable for the delete project method over gRPC.
+
+ Marks the project identified by the specified ``name`` (for
+ example, ``projects/415104041262``) for deletion.
+
+ This method will only affect the project if it has a lifecycle
+ state of
+ [ACTIVE][google.cloud.resourcemanager.v3.Project.State.ACTIVE].
+
+ This method changes the Project's lifecycle state from
+ [ACTIVE][google.cloud.resourcemanager.v3.Project.State.ACTIVE]
+ to
+ [DELETE_REQUESTED][google.cloud.resourcemanager.v3.Project.State.DELETE_REQUESTED].
+ The deletion starts at an unspecified time, at which point the
+ Project is no longer accessible.
+
+        Until the deletion completes, you can check the lifecycle state
+        by retrieving the project with [GetProject]
+ [google.cloud.resourcemanager.v3.Projects.GetProject], and the
+ project remains visible to [ListProjects]
+ [google.cloud.resourcemanager.v3.Projects.ListProjects].
+ However, you cannot update the project.
+
+ After the deletion completes, the project is not retrievable by
+ the [GetProject]
+ [google.cloud.resourcemanager.v3.Projects.GetProject],
+ [ListProjects]
+ [google.cloud.resourcemanager.v3.Projects.ListProjects], and
+ [SearchProjects][google.cloud.resourcemanager.v3.Projects.SearchProjects]
+ methods.
+
+ This method behaves idempotently, such that deleting a
+ ``DELETE_REQUESTED`` project will not cause an error, but also
+ won't do anything.
+
+ The caller must have ``resourcemanager.projects.delete``
+ permissions for this project.
+
+ Returns:
+ Callable[[~.DeleteProjectRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_project" not in self._stubs:
+ self._stubs["delete_project"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/DeleteProject",
+ request_serializer=projects.DeleteProjectRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["delete_project"]
+
+ @property
+ def undelete_project(
+ self,
+ ) -> Callable[[projects.UndeleteProjectRequest], operations_pb2.Operation]:
+ r"""Return a callable for the undelete project method over gRPC.
+
+ Restores the project identified by the specified ``name`` (for
+ example, ``projects/415104041262``). You can only use this
+ method for a project that has a lifecycle state of
+ [DELETE_REQUESTED] [Projects.State.DELETE_REQUESTED]. After
+ deletion starts, the project cannot be restored.
+
+ The caller must have ``resourcemanager.projects.undelete``
+ permission for this project.
+
+ Returns:
+ Callable[[~.UndeleteProjectRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "undelete_project" not in self._stubs:
+ self._stubs["undelete_project"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/UndeleteProject",
+ request_serializer=projects.UndeleteProjectRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["undelete_project"]
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.GetIamPolicyRequest], policy_pb2.Policy]:
+ r"""Return a callable for the get iam policy method over gRPC.
+
+ Returns the IAM access control policy for the specified project,
+ in the format ``projects/{ProjectIdOrNumber}`` e.g.
+ projects/123. Permission is denied if the policy or the resource
+ do not exist.
+
+ Returns:
+ Callable[[~.GetIamPolicyRequest],
+ ~.Policy]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_iam_policy" not in self._stubs:
+ self._stubs["get_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/GetIamPolicy",
+ request_serializer=iam_policy_pb2.GetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["get_iam_policy"]
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.SetIamPolicyRequest], policy_pb2.Policy]:
+ r"""Return a callable for the set iam policy method over gRPC.
+
+ Sets the IAM access control policy for the specified project, in
+ the format ``projects/{ProjectIdOrNumber}`` e.g. projects/123.
+
+ CAUTION: This method will replace the existing policy, and
+ cannot be used to append additional IAM settings.
+
+ Note: Removing service accounts from policies or changing their
+ roles can render services completely inoperable. It is important
+ to understand how the service account is being used before
+ removing or updating its roles.
+
+ The following constraints apply when using ``setIamPolicy()``:
+
+ - Project does not support ``allUsers`` and
+ ``allAuthenticatedUsers`` as ``members`` in a ``Binding`` of
+ a ``Policy``.
+
+ - The owner role can be granted to a ``user``,
+ ``serviceAccount``, or a group that is part of an
+ organization. For example, group@myownpersonaldomain.com
+ could be added as an owner to a project in the
+ myownpersonaldomain.com organization, but not the
+ examplepetstore.com organization.
+
+ - Service accounts can be made owners of a project directly
+ without any restrictions. However, to be added as an owner, a
+ user must be invited using the Cloud Platform console and
+ must accept the invitation.
+
+ - A user cannot be granted the owner role using
+ ``setIamPolicy()``. The user must be granted the owner role
+ using the Cloud Platform Console and must explicitly accept
+ the invitation.
+
+ - Invitations to grant the owner role cannot be sent using
+ ``setIamPolicy()``; they must be sent only using the Cloud
+ Platform Console.
+
+ - If the project is not part of an organization, there must be
+ at least one owner who has accepted the Terms of Service
+ (ToS) agreement in the policy. Calling ``setIamPolicy()`` to
+ remove the last ToS-accepted owner from the policy will fail.
+ This restriction also applies to legacy projects that no
+ longer have owners who have accepted the ToS. Edits to IAM
+ policies will be rejected until the lack of a ToS-accepting
+ owner is rectified. If the project is part of an
+ organization, you can remove all owners, potentially making
+ the organization inaccessible.
+
+ Returns:
+ Callable[[~.SetIamPolicyRequest],
+ ~.Policy]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "set_iam_policy" not in self._stubs:
+ self._stubs["set_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/SetIamPolicy",
+ request_serializer=iam_policy_pb2.SetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["set_iam_policy"]
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ iam_policy_pb2.TestIamPermissionsResponse,
+ ]:
+ r"""Return a callable for the test iam permissions method over gRPC.
+
+ Returns permissions that a caller has on the specified project,
+ in the format ``projects/{ProjectIdOrNumber}`` e.g.
+        projects/123.
+
+ Returns:
+ Callable[[~.TestIamPermissionsRequest],
+ ~.TestIamPermissionsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "test_iam_permissions" not in self._stubs:
+ self._stubs["test_iam_permissions"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/TestIamPermissions",
+ request_serializer=iam_policy_pb2.TestIamPermissionsRequest.SerializeToString,
+ response_deserializer=iam_policy_pb2.TestIamPermissionsResponse.FromString,
+ )
+ return self._stubs["test_iam_permissions"]
+
+ def close(self):
+ self.grpc_channel.close()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]:
+ r"""Return a callable for the get_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_operation" not in self._stubs:
+ self._stubs["get_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/GetOperation",
+ request_serializer=operations_pb2.GetOperationRequest.SerializeToString,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["get_operation"]
+
+ @property
+ def kind(self) -> str:
+ return "grpc"
+
+
+__all__ = ("ProjectsGrpcTransport",)
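
Every RPC property in the transport above follows the same lazy stub-caching pattern: the first access creates a `unary_unary` stub on the channel, and subsequent accesses return the cached callable. The mechanics can be sketched in isolation; the `FakeChannel` and `MiniTransport` names here are hypothetical stand-ins for illustration, not part of the library:

```python
from typing import Callable, Dict


class FakeChannel:
    """Stand-in for a grpc.Channel that records each stub creation."""

    def __init__(self):
        self.created = []

    def unary_unary(self, path, request_serializer=None, response_deserializer=None):
        self.created.append(path)
        # Return a trivial callable in place of a real gRPC stub.
        return lambda request: ("called", path, request)


class MiniTransport:
    def __init__(self, channel):
        self.grpc_channel = channel
        self._stubs: Dict[str, Callable] = {}

    @property
    def get_project(self) -> Callable:
        # Create the stub on first access, then serve it from the cache.
        if "get_project" not in self._stubs:
            self._stubs["get_project"] = self.grpc_channel.unary_unary(
                "/google.cloud.resourcemanager.v3.Projects/GetProject",
            )
        return self._stubs["get_project"]


channel = FakeChannel()
transport = MiniTransport(channel)
transport.get_project  # first access creates the stub
transport.get_project  # second access reuses the cached one
print(len(channel.created))  # 1
```

The cache keys are method names, so each RPC path is registered on the channel at most once per transport instance.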
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/transports/grpc_asyncio.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/transports/grpc_asyncio.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/transports/grpc_asyncio.py
@@ -0,0 +1,711 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers_async, operations_v1
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+import grpc # type: ignore
+from grpc.experimental import aio # type: ignore
+
+from google.cloud.resourcemanager_v3.types import projects
+
+from .base import DEFAULT_CLIENT_INFO, ProjectsTransport
+from .grpc import ProjectsGrpcTransport
+
+
+class ProjectsGrpcAsyncIOTransport(ProjectsTransport):
+ """gRPC AsyncIO backend transport for Projects.
+
+ Manages Google Cloud Projects.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _grpc_channel: aio.Channel
+ _stubs: Dict[str, Callable] = {}
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> aio.Channel:
+ """Create and return a gRPC AsyncIO channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ aio.Channel: A gRPC AsyncIO channel object.
+ """
+
+ return grpc_helpers_async.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[aio.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ channel (Optional[aio.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+ self._operations_client: Optional[operations_v1.OperationsAsyncClient] = None
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def grpc_channel(self) -> aio.Channel:
+ """Create the channel designed to connect to this service.
+
+ This property caches on the instance; repeated calls return
+ the same channel.
+ """
+ # Return the channel from cache.
+ return self._grpc_channel
+
+ @property
+ def operations_client(self) -> operations_v1.OperationsAsyncClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Quick check: Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ self._operations_client = operations_v1.OperationsAsyncClient(
+ self.grpc_channel
+ )
+
+ # Return the client from cache.
+ return self._operations_client
+
+ @property
+ def get_project(
+ self,
+ ) -> Callable[[projects.GetProjectRequest], Awaitable[projects.Project]]:
+ r"""Return a callable for the get project method over gRPC.
+
+ Retrieves the project identified by the specified ``name`` (for
+ example, ``projects/415104041262``).
+
+ The caller must have ``resourcemanager.projects.get`` permission
+ for this project.
+
+ Returns:
+ Callable[[~.GetProjectRequest],
+ Awaitable[~.Project]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_project" not in self._stubs:
+ self._stubs["get_project"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/GetProject",
+ request_serializer=projects.GetProjectRequest.serialize,
+ response_deserializer=projects.Project.deserialize,
+ )
+ return self._stubs["get_project"]
+
+ @property
+ def list_projects(
+ self,
+ ) -> Callable[
+ [projects.ListProjectsRequest], Awaitable[projects.ListProjectsResponse]
+ ]:
+ r"""Return a callable for the list projects method over gRPC.
+
+ Lists projects that are direct children of the specified folder
+ or organization resource. ``list()`` provides a strongly
+ consistent view of the projects underneath the specified parent
+ resource. ``list()`` returns projects sorted based upon the
+ (ascending) lexical ordering of their ``display_name``. The
+ caller must have ``resourcemanager.projects.list`` permission on
+ the identified parent.
+
+ Returns:
+ Callable[[~.ListProjectsRequest],
+ Awaitable[~.ListProjectsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_projects" not in self._stubs:
+ self._stubs["list_projects"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/ListProjects",
+ request_serializer=projects.ListProjectsRequest.serialize,
+ response_deserializer=projects.ListProjectsResponse.deserialize,
+ )
+ return self._stubs["list_projects"]
+
+ @property
+ def search_projects(
+ self,
+ ) -> Callable[
+ [projects.SearchProjectsRequest], Awaitable[projects.SearchProjectsResponse]
+ ]:
+ r"""Return a callable for the search projects method over gRPC.
+
+ Search for projects that the caller has both
+ ``resourcemanager.projects.get`` permission on, and also satisfy
+ the specified query.
+
+ This method returns projects in an unspecified order.
+
+ This method is eventually consistent with project mutations;
+ this means that a newly created project may not appear in the
+ results or recent updates to an existing project may not be
+ reflected in the results. To retrieve the latest state of a
+ project, use the
+ [GetProject][google.cloud.resourcemanager.v3.Projects.GetProject]
+ method.
+
+ Returns:
+ Callable[[~.SearchProjectsRequest],
+ Awaitable[~.SearchProjectsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "search_projects" not in self._stubs:
+ self._stubs["search_projects"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/SearchProjects",
+ request_serializer=projects.SearchProjectsRequest.serialize,
+ response_deserializer=projects.SearchProjectsResponse.deserialize,
+ )
+ return self._stubs["search_projects"]
+
+ @property
+ def create_project(
+ self,
+ ) -> Callable[[projects.CreateProjectRequest], Awaitable[operations_pb2.Operation]]:
+ r"""Return a callable for the create project method over gRPC.
+
+ Request that a new project be created. The result is an
+ ``Operation`` which can be used to track the creation process.
+ This process usually takes a few seconds, but can sometimes take
+ much longer. The tracking ``Operation`` is automatically deleted
+ after a few hours, so there is no need to call
+ ``DeleteOperation``.
+
+ Returns:
+ Callable[[~.CreateProjectRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_project" not in self._stubs:
+ self._stubs["create_project"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/CreateProject",
+ request_serializer=projects.CreateProjectRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["create_project"]
+
+ @property
+ def update_project(
+ self,
+ ) -> Callable[[projects.UpdateProjectRequest], Awaitable[operations_pb2.Operation]]:
+ r"""Return a callable for the update project method over gRPC.
+
+ Updates the ``display_name`` and labels of the project
+ identified by the specified ``name`` (for example,
+ ``projects/415104041262``). Deleting all labels requires an
+ update mask for labels field.
+
+ The caller must have ``resourcemanager.projects.update``
+ permission for this project.
+
+ Returns:
+ Callable[[~.UpdateProjectRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_project" not in self._stubs:
+ self._stubs["update_project"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/UpdateProject",
+ request_serializer=projects.UpdateProjectRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["update_project"]
+
+ @property
+ def move_project(
+ self,
+ ) -> Callable[[projects.MoveProjectRequest], Awaitable[operations_pb2.Operation]]:
+ r"""Return a callable for the move project method over gRPC.
+
+ Move a project to another place in your resource hierarchy,
+ under a new resource parent.
+
+ Returns an operation which can be used to track the process of
+ the project move workflow. Upon success, the
+ ``Operation.response`` field will be populated with the moved
+ project.
+
+ The caller must have ``resourcemanager.projects.move``
+ permission on the project, on the project's current and proposed
+ new parent.
+
+        If the project has no current parent, or it currently does not have
+ an associated organization resource, you will also need the
+ ``resourcemanager.projects.setIamPolicy`` permission in the
+ project.
+
+ Returns:
+ Callable[[~.MoveProjectRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "move_project" not in self._stubs:
+ self._stubs["move_project"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/MoveProject",
+ request_serializer=projects.MoveProjectRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["move_project"]
+
+ @property
+ def delete_project(
+ self,
+ ) -> Callable[[projects.DeleteProjectRequest], Awaitable[operations_pb2.Operation]]:
+ r"""Return a callable for the delete project method over gRPC.
+
+ Marks the project identified by the specified ``name`` (for
+ example, ``projects/415104041262``) for deletion.
+
+ This method will only affect the project if it has a lifecycle
+ state of
+ [ACTIVE][google.cloud.resourcemanager.v3.Project.State.ACTIVE].
+
+ This method changes the Project's lifecycle state from
+ [ACTIVE][google.cloud.resourcemanager.v3.Project.State.ACTIVE]
+ to
+ [DELETE_REQUESTED][google.cloud.resourcemanager.v3.Project.State.DELETE_REQUESTED].
+ The deletion starts at an unspecified time, at which point the
+ Project is no longer accessible.
+
+        Until the deletion completes, you can check the lifecycle state
+        by retrieving the project with [GetProject]
+ [google.cloud.resourcemanager.v3.Projects.GetProject], and the
+ project remains visible to [ListProjects]
+ [google.cloud.resourcemanager.v3.Projects.ListProjects].
+ However, you cannot update the project.
+
+ After the deletion completes, the project is not retrievable by
+ the [GetProject]
+ [google.cloud.resourcemanager.v3.Projects.GetProject],
+ [ListProjects]
+ [google.cloud.resourcemanager.v3.Projects.ListProjects], and
+ [SearchProjects][google.cloud.resourcemanager.v3.Projects.SearchProjects]
+ methods.
+
+ This method behaves idempotently, such that deleting a
+ ``DELETE_REQUESTED`` project will not cause an error, but also
+ won't do anything.
+
+ The caller must have ``resourcemanager.projects.delete``
+ permissions for this project.
+
+ Returns:
+ Callable[[~.DeleteProjectRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_project" not in self._stubs:
+ self._stubs["delete_project"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/DeleteProject",
+ request_serializer=projects.DeleteProjectRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["delete_project"]
+
+ @property
+ def undelete_project(
+ self,
+ ) -> Callable[
+ [projects.UndeleteProjectRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the undelete project method over gRPC.
+
+ Restores the project identified by the specified ``name`` (for
+ example, ``projects/415104041262``). You can only use this
+ method for a project that has a lifecycle state of
+ [DELETE_REQUESTED] [Projects.State.DELETE_REQUESTED]. After
+ deletion starts, the project cannot be restored.
+
+ The caller must have ``resourcemanager.projects.undelete``
+ permission for this project.
+
+ Returns:
+ Callable[[~.UndeleteProjectRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "undelete_project" not in self._stubs:
+ self._stubs["undelete_project"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/UndeleteProject",
+ request_serializer=projects.UndeleteProjectRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["undelete_project"]
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.GetIamPolicyRequest], Awaitable[policy_pb2.Policy]]:
+ r"""Return a callable for the get iam policy method over gRPC.
+
+ Returns the IAM access control policy for the specified project,
+ in the format ``projects/{ProjectIdOrNumber}`` e.g.
+ projects/123. Permission is denied if the policy or the resource
+ do not exist.
+
+ Returns:
+ Callable[[~.GetIamPolicyRequest],
+ Awaitable[~.Policy]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_iam_policy" not in self._stubs:
+ self._stubs["get_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/GetIamPolicy",
+ request_serializer=iam_policy_pb2.GetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["get_iam_policy"]
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.SetIamPolicyRequest], Awaitable[policy_pb2.Policy]]:
+ r"""Return a callable for the set iam policy method over gRPC.
+
+ Sets the IAM access control policy for the specified project, in
+ the format ``projects/{ProjectIdOrNumber}`` e.g. projects/123.
+
+ CAUTION: This method will replace the existing policy, and
+ cannot be used to append additional IAM settings.
+
+ Note: Removing service accounts from policies or changing their
+ roles can render services completely inoperable. It is important
+ to understand how the service account is being used before
+ removing or updating its roles.
+
+ The following constraints apply when using ``setIamPolicy()``:
+
+ - Project does not support ``allUsers`` and
+ ``allAuthenticatedUsers`` as ``members`` in a ``Binding`` of
+ a ``Policy``.
+
+ - The owner role can be granted to a ``user``,
+ ``serviceAccount``, or a group that is part of an
+ organization. For example, group@myownpersonaldomain.com
+ could be added as an owner to a project in the
+ myownpersonaldomain.com organization, but not the
+ examplepetstore.com organization.
+
+ - Service accounts can be made owners of a project directly
+ without any restrictions. However, to be added as an owner, a
+ user must be invited using the Cloud Platform console and
+ must accept the invitation.
+
+ - A user cannot be granted the owner role using
+ ``setIamPolicy()``. The user must be granted the owner role
+ using the Cloud Platform Console and must explicitly accept
+ the invitation.
+
+ - Invitations to grant the owner role cannot be sent using
+ ``setIamPolicy()``; they must be sent only using the Cloud
+ Platform Console.
+
+ - If the project is not part of an organization, there must be
+ at least one owner who has accepted the Terms of Service
+ (ToS) agreement in the policy. Calling ``setIamPolicy()`` to
+ remove the last ToS-accepted owner from the policy will fail.
+ This restriction also applies to legacy projects that no
+ longer have owners who have accepted the ToS. Edits to IAM
+ policies will be rejected until the lack of a ToS-accepting
+ owner is rectified. If the project is part of an
+ organization, you can remove all owners, potentially making
+ the organization inaccessible.
+
+ Returns:
+ Callable[[~.SetIamPolicyRequest],
+ Awaitable[~.Policy]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "set_iam_policy" not in self._stubs:
+ self._stubs["set_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/SetIamPolicy",
+ request_serializer=iam_policy_pb2.SetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["set_iam_policy"]
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ Awaitable[iam_policy_pb2.TestIamPermissionsResponse],
+ ]:
+ r"""Return a callable for the test iam permissions method over gRPC.
+
+ Returns permissions that a caller has on the specified project,
+ in the format ``projects/{ProjectIdOrNumber}`` e.g.
+        projects/123.
+
+ Returns:
+ Callable[[~.TestIamPermissionsRequest],
+ Awaitable[~.TestIamPermissionsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "test_iam_permissions" not in self._stubs:
+ self._stubs["test_iam_permissions"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.Projects/TestIamPermissions",
+ request_serializer=iam_policy_pb2.TestIamPermissionsRequest.SerializeToString,
+ response_deserializer=iam_policy_pb2.TestIamPermissionsResponse.FromString,
+ )
+ return self._stubs["test_iam_permissions"]
+
+ def close(self):
+ return self.grpc_channel.close()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]:
+ r"""Return a callable for the get_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_operation" not in self._stubs:
+ self._stubs["get_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/GetOperation",
+ request_serializer=operations_pb2.GetOperationRequest.SerializeToString,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["get_operation"]
+
+
+__all__ = ("ProjectsGrpcAsyncIOTransport",)
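
The AsyncIO transport's only signature difference from the synchronous one above is that every callable returns an `Awaitable`, so callers must `await` the stub's result. A minimal sketch of that contract, using placeholder string types rather than the real protobuf messages:

```python
import asyncio
from typing import Awaitable, Callable


# Hypothetical stub standing in for an aio.Channel unary_unary callable.
async def _async_stub(request: str) -> str:
    return f"project:{request}"


def make_get_project() -> Callable[[str], Awaitable[str]]:
    # The AsyncIO transport exposes callables whose results must be awaited.
    return _async_stub


async def main() -> str:
    get_project = make_get_project()
    return await get_project("projects/123")


result = asyncio.run(main())
print(result)  # project:projects/123
```

Calling `get_project(...)` without `await` would yield a coroutine object rather than a response, which is why the async transport's type hints wrap every response type in `Awaitable[...]`.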
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/transports/rest.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/transports/rest.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/projects/transports/rest.py
@@ -0,0 +1,1932 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import dataclasses
+import json # type: ignore
+import re
+from typing import Any, Callable, Dict, List, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import (
+ gapic_v1,
+ operations_v1,
+ path_template,
+ rest_helpers,
+ rest_streaming,
+)
+from google.api_core import exceptions as core_exceptions
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.auth.transport.requests import AuthorizedSession # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import json_format
+import grpc # type: ignore
+from requests import __version__ as requests_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.types import projects
+
+from .base import DEFAULT_CLIENT_INFO as BASE_DEFAULT_CLIENT_INFO
+from .base import ProjectsTransport
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=BASE_DEFAULT_CLIENT_INFO.gapic_version,
+ grpc_version=None,
+ rest_version=requests_version,
+)
+
+
+class ProjectsRestInterceptor:
+ """Interceptor for Projects.
+
+ Interceptors are used to manipulate requests, request metadata, and responses
+ in arbitrary ways.
+ Example use cases include:
+ * Logging
+ * Verifying requests according to service or custom semantics
+ * Stripping extraneous information from responses
+
+ These use cases and more can be enabled by injecting an
+ instance of a custom subclass when constructing the ProjectsRestTransport.
+
+ .. code-block:: python
+ class MyCustomProjectsInterceptor(ProjectsRestInterceptor):
+ def pre_create_project(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_create_project(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_delete_project(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_delete_project(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_iam_policy(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_iam_policy(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_project(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_project(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_projects(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_projects(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_move_project(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_move_project(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_search_projects(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_search_projects(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_set_iam_policy(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_set_iam_policy(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_test_iam_permissions(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_test_iam_permissions(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_undelete_project(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_undelete_project(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_update_project(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_update_project(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ transport = ProjectsRestTransport(interceptor=MyCustomProjectsInterceptor())
+ client = ProjectsClient(transport=transport)
+
+
+ """
+
+ def pre_create_project(
+ self,
+ request: projects.CreateProjectRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[projects.CreateProjectRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for create_project
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Projects server.
+ """
+ return request, metadata
+
+ def post_create_project(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for create_project
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Projects server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_delete_project(
+ self,
+ request: projects.DeleteProjectRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[projects.DeleteProjectRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for delete_project
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Projects server.
+ """
+ return request, metadata
+
+ def post_delete_project(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for delete_project
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Projects server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_iam_policy(
+ self,
+ request: iam_policy_pb2.GetIamPolicyRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[iam_policy_pb2.GetIamPolicyRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_iam_policy
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Projects server.
+ """
+ return request, metadata
+
+ def post_get_iam_policy(self, response: policy_pb2.Policy) -> policy_pb2.Policy:
+ """Post-rpc interceptor for get_iam_policy
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Projects server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_project(
+ self, request: projects.GetProjectRequest, metadata: Sequence[Tuple[str, str]]
+ ) -> Tuple[projects.GetProjectRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_project
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Projects server.
+ """
+ return request, metadata
+
+ def post_get_project(self, response: projects.Project) -> projects.Project:
+ """Post-rpc interceptor for get_project
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Projects server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_projects(
+ self, request: projects.ListProjectsRequest, metadata: Sequence[Tuple[str, str]]
+ ) -> Tuple[projects.ListProjectsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_projects
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Projects server.
+ """
+ return request, metadata
+
+ def post_list_projects(
+ self, response: projects.ListProjectsResponse
+ ) -> projects.ListProjectsResponse:
+ """Post-rpc interceptor for list_projects
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Projects server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_move_project(
+ self, request: projects.MoveProjectRequest, metadata: Sequence[Tuple[str, str]]
+ ) -> Tuple[projects.MoveProjectRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for move_project
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Projects server.
+ """
+ return request, metadata
+
+ def post_move_project(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for move_project
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Projects server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_search_projects(
+ self,
+ request: projects.SearchProjectsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[projects.SearchProjectsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for search_projects
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Projects server.
+ """
+ return request, metadata
+
+ def post_search_projects(
+ self, response: projects.SearchProjectsResponse
+ ) -> projects.SearchProjectsResponse:
+ """Post-rpc interceptor for search_projects
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Projects server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_set_iam_policy(
+ self,
+ request: iam_policy_pb2.SetIamPolicyRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[iam_policy_pb2.SetIamPolicyRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for set_iam_policy
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Projects server.
+ """
+ return request, metadata
+
+ def post_set_iam_policy(self, response: policy_pb2.Policy) -> policy_pb2.Policy:
+ """Post-rpc interceptor for set_iam_policy
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Projects server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_test_iam_permissions(
+ self,
+ request: iam_policy_pb2.TestIamPermissionsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[iam_policy_pb2.TestIamPermissionsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for test_iam_permissions
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Projects server.
+ """
+ return request, metadata
+
+ def post_test_iam_permissions(
+ self, response: iam_policy_pb2.TestIamPermissionsResponse
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ """Post-rpc interceptor for test_iam_permissions
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Projects server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_undelete_project(
+ self,
+ request: projects.UndeleteProjectRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[projects.UndeleteProjectRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for undelete_project
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Projects server.
+ """
+ return request, metadata
+
+ def post_undelete_project(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for undelete_project
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Projects server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_update_project(
+ self,
+ request: projects.UpdateProjectRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[projects.UpdateProjectRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for update_project
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Projects server.
+ """
+ return request, metadata
+
+ def post_update_project(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for update_project
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Projects server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_operation(
+ self,
+ request: operations_pb2.GetOperationRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[operations_pb2.GetOperationRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_operation
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the Projects server.
+ """
+ return request, metadata
+
+ def post_get_operation(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for get_operation
+
+ Override in a subclass to manipulate the response
+ after it is returned by the Projects server but before
+ it is returned to user code.
+ """
+ return response
+
+
+@dataclasses.dataclass
+class ProjectsRestStub:
+ _session: AuthorizedSession
+ _host: str
+ _interceptor: ProjectsRestInterceptor
+
+
+class ProjectsRestTransport(ProjectsTransport):
+ """REST backend transport for Projects.
+
+ Manages Google Cloud Projects.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends JSON representations of protocol buffers over HTTP/1.1
+
+ """
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ url_scheme: str = "https",
+ interceptor: Optional[ProjectsRestInterceptor] = None,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+ scopes (Optional(Sequence[str])): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Callable[[], Tuple[bytes, bytes]]): Client
+ certificate to configure mutual TLS HTTP channel. It is ignored
+ if ``channel`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you are developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ url_scheme: the protocol scheme for the API endpoint. Normally
+ "https", but for testing or local servers,
+ "http" can be specified.
+ """
+ # Run the base constructor
+ # TODO(yon-mg): resolve other ctor params i.e. scopes, quota, etc.
+ # TODO: When custom host (api_endpoint) is set, `scopes` must *also* be set on the
+ # credentials object
+ maybe_url_match = re.match("^(?P<scheme>http(?:s)?://)?(?P<host>.*)$", host)
+ if maybe_url_match is None:
+ raise ValueError(
+ f"Unexpected hostname structure: {host}"
+ ) # pragma: NO COVER
+
+ url_match_items = maybe_url_match.groupdict()
+
+ host = f"{url_scheme}://{host}" if not url_match_items["scheme"] else host
+
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+ self._session = AuthorizedSession(
+ self._credentials, default_host=self.DEFAULT_HOST
+ )
+ self._operations_client: Optional[operations_v1.AbstractOperationsClient] = None
+ if client_cert_source_for_mtls:
+ self._session.configure_mtls_channel(client_cert_source_for_mtls)
+ self._interceptor = interceptor or ProjectsRestInterceptor()
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def operations_client(self) -> operations_v1.AbstractOperationsClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ http_options: Dict[str, List[Dict[str, str]]] = {
+ "google.longrunning.Operations.GetOperation": [
+ {
+ "method": "get",
+ "uri": "/v3/{name=operations/**}",
+ },
+ ],
+ }
+
+ rest_transport = operations_v1.OperationsRestTransport(
+ host=self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ scopes=self._scopes,
+ http_options=http_options,
+ path_prefix="v3",
+ )
+
+ self._operations_client = operations_v1.AbstractOperationsClient(
+ transport=rest_transport
+ )
+
+ # Return the client from cache.
+ return self._operations_client
+
+ class _CreateProject(ProjectsRestStub):
+ def __hash__(self):
+ return hash("CreateProject")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: projects.CreateProjectRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the create project method over HTTP.
+
+ Args:
+ request (~.projects.CreateProjectRequest):
+ The request object. The request sent to the
+ [CreateProject][google.cloud.resourcemanager.v3.Projects.CreateProject]
+ method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/projects",
+ "body": "project",
+ },
+ ]
+ request, metadata = self._interceptor.pre_create_project(request, metadata)
+ pb_request = projects.CreateProjectRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_create_project(resp)
+ return resp
+
+ class _DeleteProject(ProjectsRestStub):
+ def __hash__(self):
+ return hash("DeleteProject")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: projects.DeleteProjectRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the delete project method over HTTP.
+
+ Args:
+ request (~.projects.DeleteProjectRequest):
+ The request object. [DeleteProject][google.cloud.resourcemanager.v3.Projects.DeleteProject]
+ method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "delete",
+ "uri": "/v3/{name=projects/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_delete_project(request, metadata)
+ pb_request = projects.DeleteProjectRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_delete_project(resp)
+ return resp
+
+ class _GetIamPolicy(ProjectsRestStub):
+ def __hash__(self):
+ return hash("GetIamPolicy")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: iam_policy_pb2.GetIamPolicyRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Call the get iam policy method over HTTP.
+
+ Args:
+ request (~.iam_policy_pb2.GetIamPolicyRequest):
+ The request object. Request message for ``GetIamPolicy`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which
+ specifies access controls for Google Cloud resources.
+
+ A ``Policy`` is a collection of ``bindings``. A
+ ``binding`` binds one or more ``members``, or
+ principals, to a single ``role``. Principals can be user
+ accounts, service accounts, Google groups, and domains
+ (such as G Suite). A ``role`` is a named list of
+ permissions; each ``role`` can be an IAM predefined role
+ or a user-created custom role.
+
+ For some types of Google Cloud resources, a ``binding``
+ can also specify a ``condition``, which is a logical
+ expression that allows access to a resource only if the
+ expression evaluates to ``true``. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the `IAM
+ documentation <https://cloud.google.com/iam/help/conditions/resource-policies>`__.
+
+ **JSON example:**
+
+ ::
+
+ {
+ "bindings": [
+ {
+ "role": "roles/resourcemanager.organizationAdmin",
+ "members": [
+ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+ },
+ {
+ "role": "roles/resourcemanager.organizationViewer",
+ "members": [
+ "user:eve@example.com"
+ ],
+ "condition": {
+ "title": "expirable access",
+ "description": "Does not grant access after Sep 2020",
+ "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')",
+ }
+ }
+ ],
+ "etag": "BwWWja0YfJA=",
+ "version": 3
+ }
+
+ **YAML example:**
+
+ ::
+
+ bindings:
+ - members:
+ - user:mike@example.com
+ - group:admins@example.com
+ - domain:google.com
+ - serviceAccount:my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin
+ - members:
+ - user:eve@example.com
+ role: roles/resourcemanager.organizationViewer
+ condition:
+ title: expirable access
+ description: Does not grant access after Sep 2020
+ expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+ etag: BwWWja0YfJA=
+ version: 3
+
+ For a description of IAM and its features, see the `IAM
+ documentation <https://cloud.google.com/iam/docs/>`__.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{resource=projects/*}:getIamPolicy",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_iam_policy(request, metadata)
+ pb_request = request
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = policy_pb2.Policy()
+ pb_resp = resp
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_iam_policy(resp)
+ return resp
+
+ class _GetProject(ProjectsRestStub):
+ def __hash__(self):
+ return hash("GetProject")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: projects.GetProjectRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> projects.Project:
+ r"""Call the get project method over HTTP.
+
+ Args:
+ request (~.projects.GetProjectRequest):
+ The request object. The request sent to the
+ [GetProject][google.cloud.resourcemanager.v3.Projects.GetProject]
+ method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.projects.Project:
+ A project is a high-level Google
+ Cloud entity. It is a container for
+ ACLs, APIs, App Engine Apps, VMs, and
+ other Google Cloud Platform resources.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/{name=projects/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_project(request, metadata)
+ pb_request = projects.GetProjectRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = projects.Project()
+ pb_resp = projects.Project.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_project(resp)
+ return resp
+
+ class _ListProjects(ProjectsRestStub):
+ def __hash__(self):
+ return hash("ListProjects")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "parent": "",
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: projects.ListProjectsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> projects.ListProjectsResponse:
+ r"""Call the list projects method over HTTP.
+
+ Args:
+ request (~.projects.ListProjectsRequest):
+ The request object. The request sent to the
+ [ListProjects][google.cloud.resourcemanager.v3.Projects.ListProjects]
+ method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.projects.ListProjectsResponse:
+ A page of the response received from the
+ [ListProjects][google.cloud.resourcemanager.v3.Projects.ListProjects]
+ method.
+
+ A paginated response where more pages are available has
+ ``next_page_token`` set. This token can be used in a
+ subsequent request to retrieve the next request page.
+
+ NOTE: A response may contain fewer elements than the
+ request ``page_size`` and still have a
+ ``next_page_token``.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/projects",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_projects(request, metadata)
+ pb_request = projects.ListProjectsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = projects.ListProjectsResponse()
+ pb_resp = projects.ListProjectsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_projects(resp)
+ return resp
+
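Each stub above repeats the same flow: transcode the request, serialize it, send it, raise on an error status, then parse the JSON payload into a proto message. A condensed, hedged sketch of that shared flow (the session and response objects below are stand-ins, not the real `requests` session the transport uses):

```python
import json

def rest_call(session, host, method, uri, query_params, body=None):
    # Mirror of the per-stub flow: issue the HTTP request, fail fast on
    # any 4xx/5xx status, and hand back the decoded JSON payload (which
    # the real stubs then parse into a proto via json_format.Parse).
    response = getattr(session, method)(
        "{host}{uri}".format(host=host, uri=uri),
        headers={"Content-Type": "application/json"},
        params=query_params,
        data=body,
    )
    if response.status_code >= 400:
        raise RuntimeError("HTTP %d" % response.status_code)
    return json.loads(response.content)

# Tiny fakes so the sketch runs without a network.
class _FakeResponse:
    status_code = 200
    content = b'{"projects": []}'

class _FakeSession:
    def get(self, url, **kwargs):
        return _FakeResponse()

page = rest_call(_FakeSession(), "https://example.com", "get", "/v3/projects", {"pageSize": 10})
```

The generated stubs differ from this sketch only in their URI templates, request bodies, and response message types.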
+ class _MoveProject(ProjectsRestStub):
+ def __hash__(self):
+ return hash("MoveProject")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: projects.MoveProjectRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the move project method over HTTP.
+
+ Args:
+ request (~.projects.MoveProjectRequest):
+                The request object. The request sent to the
+ [MoveProject][google.cloud.resourcemanager.v3.Projects.MoveProject]
+ method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{name=projects/*}:move",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_move_project(request, metadata)
+ pb_request = projects.MoveProjectRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_move_project(resp)
+ return resp
+
+ class _SearchProjects(ProjectsRestStub):
+ def __hash__(self):
+ return hash("SearchProjects")
+
+ def __call__(
+ self,
+ request: projects.SearchProjectsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> projects.SearchProjectsResponse:
+ r"""Call the search projects method over HTTP.
+
+ Args:
+ request (~.projects.SearchProjectsRequest):
+ The request object. The request sent to the
+ [SearchProjects][google.cloud.resourcemanager.v3.Projects.SearchProjects]
+ method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.projects.SearchProjectsResponse:
+ A page of the response received from the
+ [SearchProjects][google.cloud.resourcemanager.v3.Projects.SearchProjects]
+ method.
+
+ A paginated response where more pages are available has
+ ``next_page_token`` set. This token can be used in a
+                subsequent request to retrieve the next page of results.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/projects:search",
+ },
+ ]
+ request, metadata = self._interceptor.pre_search_projects(request, metadata)
+ pb_request = projects.SearchProjectsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = projects.SearchProjectsResponse()
+ pb_resp = projects.SearchProjectsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_search_projects(resp)
+ return resp
+
+ class _SetIamPolicy(ProjectsRestStub):
+ def __hash__(self):
+ return hash("SetIamPolicy")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: iam_policy_pb2.SetIamPolicyRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Call the set iam policy method over HTTP.
+
+ Args:
+ request (~.iam_policy_pb2.SetIamPolicyRequest):
+ The request object. Request message for ``SetIamPolicy`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which
+ specifies access controls for Google Cloud resources.
+
+ A ``Policy`` is a collection of ``bindings``. A
+ ``binding`` binds one or more ``members``, or
+ principals, to a single ``role``. Principals can be user
+ accounts, service accounts, Google groups, and domains
+ (such as G Suite). A ``role`` is a named list of
+ permissions; each ``role`` can be an IAM predefined role
+ or a user-created custom role.
+
+ For some types of Google Cloud resources, a ``binding``
+ can also specify a ``condition``, which is a logical
+ expression that allows access to a resource only if the
+ expression evaluates to ``true``. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the `IAM
+ documentation <https://cloud.google.com/iam/help/conditions/resource-policies>`__.
+
+ **JSON example:**
+
+ ::
+
+ {
+ "bindings": [
+ {
+ "role": "roles/resourcemanager.organizationAdmin",
+ "members": [
+ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+ },
+ {
+ "role": "roles/resourcemanager.organizationViewer",
+ "members": [
+ "user:eve@example.com"
+ ],
+ "condition": {
+ "title": "expirable access",
+ "description": "Does not grant access after Sep 2020",
+ "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')",
+ }
+ }
+ ],
+ "etag": "BwWWja0YfJA=",
+ "version": 3
+ }
+
+ **YAML example:**
+
+ ::
+
+ bindings:
+ - members:
+ - user:mike@example.com
+ - group:admins@example.com
+ - domain:google.com
+ - serviceAccount:my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin
+ - members:
+ - user:eve@example.com
+ role: roles/resourcemanager.organizationViewer
+ condition:
+ title: expirable access
+ description: Does not grant access after Sep 2020
+ expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+ etag: BwWWja0YfJA=
+ version: 3
+
+ For a description of IAM and its features, see the `IAM
+ documentation <https://cloud.google.com/iam/docs/>`__.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{resource=projects/*}:setIamPolicy",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_set_iam_policy(request, metadata)
+ pb_request = request
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = policy_pb2.Policy()
+ pb_resp = resp
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_set_iam_policy(resp)
+ return resp
+
+ class _TestIamPermissions(ProjectsRestStub):
+ def __hash__(self):
+ return hash("TestIamPermissions")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: iam_policy_pb2.TestIamPermissionsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ r"""Call the test iam permissions method over HTTP.
+
+ Args:
+ request (~.iam_policy_pb2.TestIamPermissionsRequest):
+ The request object. Request message for ``TestIamPermissions`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.iam_policy_pb2.TestIamPermissionsResponse:
+ Response message for ``TestIamPermissions`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{resource=projects/*}:testIamPermissions",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_test_iam_permissions(
+ request, metadata
+ )
+ pb_request = request
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = iam_policy_pb2.TestIamPermissionsResponse()
+ pb_resp = resp
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_test_iam_permissions(resp)
+ return resp
+
+ class _UndeleteProject(ProjectsRestStub):
+ def __hash__(self):
+ return hash("UndeleteProject")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: projects.UndeleteProjectRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the undelete project method over HTTP.
+
+ Args:
+ request (~.projects.UndeleteProjectRequest):
+ The request object. The request sent to the [UndeleteProject]
+ [google.cloud.resourcemanager.v3.Projects.UndeleteProject]
+ method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{name=projects/*}:undelete",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_undelete_project(
+ request, metadata
+ )
+ pb_request = projects.UndeleteProjectRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_undelete_project(resp)
+ return resp
+
+ class _UpdateProject(ProjectsRestStub):
+ def __hash__(self):
+ return hash("UpdateProject")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: projects.UpdateProjectRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the update project method over HTTP.
+
+ Args:
+ request (~.projects.UpdateProjectRequest):
+ The request object. The request sent to the
+ [UpdateProject][google.cloud.resourcemanager.v3.Projects.UpdateProject]
+ method.
+
+                Only the ``display_name`` and ``labels`` fields can be
+                changed. Use the
+ [MoveProject][google.cloud.resourcemanager.v3.Projects.MoveProject]
+ method to change the ``parent`` field.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "patch",
+ "uri": "/v3/{project.name=projects/*}",
+ "body": "project",
+ },
+ ]
+ request, metadata = self._interceptor.pre_update_project(request, metadata)
+ pb_request = projects.UpdateProjectRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_update_project(resp)
+ return resp
+
+ @property
+ def create_project(
+ self,
+ ) -> Callable[[projects.CreateProjectRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._CreateProject(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def delete_project(
+ self,
+ ) -> Callable[[projects.DeleteProjectRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._DeleteProject(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.GetIamPolicyRequest], policy_pb2.Policy]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetIamPolicy(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_project(self) -> Callable[[projects.GetProjectRequest], projects.Project]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetProject(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_projects(
+ self,
+ ) -> Callable[[projects.ListProjectsRequest], projects.ListProjectsResponse]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListProjects(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def move_project(
+ self,
+ ) -> Callable[[projects.MoveProjectRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._MoveProject(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def search_projects(
+ self,
+ ) -> Callable[[projects.SearchProjectsRequest], projects.SearchProjectsResponse]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._SearchProjects(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.SetIamPolicyRequest], policy_pb2.Policy]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._SetIamPolicy(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ iam_policy_pb2.TestIamPermissionsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._TestIamPermissions(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def undelete_project(
+ self,
+ ) -> Callable[[projects.UndeleteProjectRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._UndeleteProject(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def update_project(
+ self,
+ ) -> Callable[[projects.UpdateProjectRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._UpdateProject(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_operation(self):
+ return self._GetOperation(self._session, self._host, self._interceptor) # type: ignore
+
+ class _GetOperation(ProjectsRestStub):
+ def __call__(
+ self,
+ request: operations_pb2.GetOperationRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+
+ r"""Call the get operation method over HTTP.
+
+ Args:
+ request (operations_pb2.GetOperationRequest):
+ The request object for GetOperation method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ operations_pb2.Operation: Response from GetOperation method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/{name=operations/**}",
+ },
+ ]
+
+ request, metadata = self._interceptor.pre_get_operation(request, metadata)
+ request_kwargs = json_format.MessageToDict(request)
+ transcoded_request = path_template.transcode(http_options, **request_kwargs)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(json.dumps(transcoded_request["query_params"]))
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ resp = operations_pb2.Operation()
+ resp = json_format.Parse(response.content.decode("utf-8"), resp)
+ resp = self._interceptor.post_get_operation(resp)
+ return resp
+
+ @property
+ def kind(self) -> str:
+ return "rest"
+
+ def close(self):
+ self._session.close()
+
+
+__all__ = ("ProjectsRestTransport",)
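The `_get_unset_required_fields` classmethod duplicated across the stubs above exists because proto3 JSON serialization omits fields whose value equals the default, yet required query parameters must still appear on the wire. A standalone sketch of that merge, under the assumption of a single required `parent` field (as in `_ListProjects`):

```python
from typing import Any, Dict

# Defaults for required fields; proto3 JSON serialization drops a field
# whose value equals its default, so these are re-added by hand.
REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {"parent": ""}

def get_unset_required_fields(message_dict: Dict[str, Any]) -> Dict[str, Any]:
    # Keep only the required fields that the serialized request omitted.
    return {
        k: v
        for k, v in REQUIRED_FIELDS_DEFAULT_VALUES.items()
        if k not in message_dict
    }

query_params = {"page_size": 10}
query_params.update(get_unset_required_fields(query_params))
```

After the update, `query_params` carries an explicit empty `parent`, matching what each stub does just before appending `$alt` and sending the request.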
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/__init__.py
@@ -0,0 +1,22 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .async_client import TagBindingsAsyncClient
+from .client import TagBindingsClient
+
+__all__ = (
+ "TagBindingsClient",
+ "TagBindingsAsyncClient",
+)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/async_client.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/async_client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/async_client.py
@@ -0,0 +1,763 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import functools
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+)
+
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.api_core.client_options import ClientOptions
+from google.auth import credentials as ga_credentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.api_core import operation # type: ignore
+from google.api_core import operation_async # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import empty_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.services.tag_bindings import pagers
+from google.cloud.resourcemanager_v3.types import tag_bindings
+
+from .client import TagBindingsClient
+from .transports.base import DEFAULT_CLIENT_INFO, TagBindingsTransport
+from .transports.grpc_asyncio import TagBindingsGrpcAsyncIOTransport
+
+
+class TagBindingsAsyncClient:
+ """Allow users to create and manage TagBindings between
+ TagValues and different Google Cloud resources throughout the
+ GCP resource hierarchy.
+ """
+
+ _client: TagBindingsClient
+
+ DEFAULT_ENDPOINT = TagBindingsClient.DEFAULT_ENDPOINT
+ DEFAULT_MTLS_ENDPOINT = TagBindingsClient.DEFAULT_MTLS_ENDPOINT
+
+ tag_binding_path = staticmethod(TagBindingsClient.tag_binding_path)
+ parse_tag_binding_path = staticmethod(TagBindingsClient.parse_tag_binding_path)
+ tag_key_path = staticmethod(TagBindingsClient.tag_key_path)
+ parse_tag_key_path = staticmethod(TagBindingsClient.parse_tag_key_path)
+ tag_value_path = staticmethod(TagBindingsClient.tag_value_path)
+ parse_tag_value_path = staticmethod(TagBindingsClient.parse_tag_value_path)
+ common_billing_account_path = staticmethod(
+ TagBindingsClient.common_billing_account_path
+ )
+ parse_common_billing_account_path = staticmethod(
+ TagBindingsClient.parse_common_billing_account_path
+ )
+ common_folder_path = staticmethod(TagBindingsClient.common_folder_path)
+ parse_common_folder_path = staticmethod(TagBindingsClient.parse_common_folder_path)
+ common_organization_path = staticmethod(TagBindingsClient.common_organization_path)
+ parse_common_organization_path = staticmethod(
+ TagBindingsClient.parse_common_organization_path
+ )
+ common_project_path = staticmethod(TagBindingsClient.common_project_path)
+ parse_common_project_path = staticmethod(
+ TagBindingsClient.parse_common_project_path
+ )
+ common_location_path = staticmethod(TagBindingsClient.common_location_path)
+ parse_common_location_path = staticmethod(
+ TagBindingsClient.parse_common_location_path
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ TagBindingsAsyncClient: The constructed client.
+ """
+ return TagBindingsClient.from_service_account_info.__func__(TagBindingsAsyncClient, info, *args, **kwargs) # type: ignore
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ TagBindingsAsyncClient: The constructed client.
+ """
+ return TagBindingsClient.from_service_account_file.__func__(TagBindingsAsyncClient, filename, *args, **kwargs) # type: ignore
+
+ from_service_account_json = from_service_account_file
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ return TagBindingsClient.get_mtls_endpoint_and_cert_source(client_options) # type: ignore
+
+ @property
+ def transport(self) -> TagBindingsTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ TagBindingsTransport: The transport used by the client instance.
+ """
+ return self._client.transport
+
+ get_transport_class = functools.partial(
+ type(TagBindingsClient).get_transport_class, type(TagBindingsClient)
+ )
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Union[str, TagBindingsTransport] = "grpc_asyncio",
+ client_options: Optional[ClientOptions] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the tag bindings client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, ~.TagBindingsTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (ClientOptions): Custom options for the client. It
+ won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+
+ Raises:
+            google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ self._client = TagBindingsClient(
+ credentials=credentials,
+ transport=transport,
+ client_options=client_options,
+ client_info=client_info,
+ )
+
+ async def list_tag_bindings(
+ self,
+ request: Optional[Union[tag_bindings.ListTagBindingsRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListTagBindingsAsyncPager:
+ r"""Lists the TagBindings for the given Google Cloud resource, as
+ specified with ``parent``.
+
+ NOTE: The ``parent`` field is expected to be a full resource
+ name:
+ https://cloud.google.com/apis/design/resource_names#full_resource_name
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_list_tag_bindings():
+ # Create a client
+ client = resourcemanager_v3.TagBindingsAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.ListTagBindingsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_tag_bindings(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.ListTagBindingsRequest, dict]]):
+ The request object. The request message to list all
+ TagBindings for a parent.
+ parent (:class:`str`):
+ Required. The full resource name of a
+ resource for which you want to list
+ existing TagBindings. E.g.
+ "//cloudresourcemanager.googleapis.com/projects/123"
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.tag_bindings.pagers.ListTagBindingsAsyncPager:
+ The ListTagBindings response.
+ Iterating over this object will yield
+ results and resolve additional pages
+ automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_bindings.ListTagBindingsRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_tag_bindings,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListTagBindingsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def create_tag_binding(
+ self,
+ request: Optional[Union[tag_bindings.CreateTagBindingRequest, dict]] = None,
+ *,
+ tag_binding: Optional[tag_bindings.TagBinding] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Creates a TagBinding between a TagValue and a Google
+ Cloud resource.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_create_tag_binding():
+ # Create a client
+ client = resourcemanager_v3.TagBindingsAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.CreateTagBindingRequest(
+ )
+
+ # Make the request
+ operation = client.create_tag_binding(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.CreateTagBindingRequest, dict]]):
+ The request object. The request message to create a
+ TagBinding.
+ tag_binding (:class:`google.cloud.resourcemanager_v3.types.TagBinding`):
+ Required. The TagBinding to be
+ created.
+
+ This corresponds to the ``tag_binding`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.TagBinding` A TagBinding represents a connection between a TagValue and a cloud
+                resource. Once a TagBinding is created, the TagValue
+ is applied to all the descendants of the Google Cloud
+ resource.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([tag_binding])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_bindings.CreateTagBindingRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if tag_binding is not None:
+ request.tag_binding = tag_binding
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.create_tag_binding,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ tag_bindings.TagBinding,
+ metadata_type=tag_bindings.CreateTagBindingMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def delete_tag_binding(
+ self,
+ request: Optional[Union[tag_bindings.DeleteTagBindingRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Deletes a TagBinding.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_delete_tag_binding():
+ # Create a client
+ client = resourcemanager_v3.TagBindingsAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.DeleteTagBindingRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.delete_tag_binding(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.DeleteTagBindingRequest, dict]]):
+ The request object. The request message to delete a
+ TagBinding.
+ name (:class:`str`):
+ Required. The name of the TagBinding. This is a String
+ of the form: ``tagBindings/{id}`` (e.g.
+ ``tagBindings/%2F%2Fcloudresourcemanager.googleapis.com%2Fprojects%2F123/tagValues/456``).
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.protobuf.empty_pb2.Empty` A generic empty message that you can re-use to avoid defining duplicated
+ empty messages in your APIs. A typical example is to
+ use it as the request or the response type of an API
+ method. For instance:
+
+ service Foo {
+ rpc Bar(google.protobuf.Empty) returns
+ (google.protobuf.Empty);
+
+ }
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_bindings.DeleteTagBindingRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.delete_tag_binding,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ empty_pb2.Empty,
+ metadata_type=tag_bindings.DeleteTagBindingMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_effective_tags(
+ self,
+ request: Optional[Union[tag_bindings.ListEffectiveTagsRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListEffectiveTagsAsyncPager:
+ r"""Return a list of effective tags for the given Google Cloud
+ resource, as specified in ``parent``.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_list_effective_tags():
+ # Create a client
+ client = resourcemanager_v3.TagBindingsAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.ListEffectiveTagsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_effective_tags(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.ListEffectiveTagsRequest, dict]]):
+                The request object. The request message for
+                ListEffectiveTags.
+ parent (:class:`str`):
+ Required. The full resource name of a
+ resource for which you want to list the
+ effective tags. E.g.
+ "//cloudresourcemanager.googleapis.com/projects/123"
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.tag_bindings.pagers.ListEffectiveTagsAsyncPager:
+ The response of ListEffectiveTags.
+ Iterating over this object will yield
+ results and resolve additional pages
+ automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_bindings.ListEffectiveTagsRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_effective_tags,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListEffectiveTagsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_operation(
+ self,
+ request: Optional[operations_pb2.GetOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Gets the latest state of a long-running operation.
+
+ Args:
+ request (:class:`~.operations_pb2.GetOperationRequest`):
+ The request object. Request message for
+ `GetOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.Operation:
+ An ``Operation`` object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.GetOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._client._transport.get_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def __aenter__(self):
+ return self
+
+ async def __aexit__(self, exc_type, exc, tb):
+ await self.transport.close()
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("TagBindingsAsyncClient",)
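The `from_service_account_info` and `from_service_account_file` overrides above reuse the sync client's classmethods via `__func__`, so the constructed instance is the async class rather than `TagBindingsClient`. A minimal, self-contained sketch of that delegation pattern (the `SyncClient`/`AsyncClient` names are hypothetical, not part of this patch):

```python
class SyncClient:
    def __init__(self, tag="sync"):
        self.tag = tag

    @classmethod
    def make(cls, value, *args, **kwargs):
        # Constructs whichever class it is invoked against.
        inst = cls(*args, **kwargs)
        inst.value = value
        return inst


class AsyncClient:
    def __init__(self, tag="async"):
        self.tag = tag

    @classmethod
    def make(cls, value, *args, **kwargs):
        # __func__ unwraps the bound classmethod so it can be re-invoked
        # with AsyncClient as cls, mirroring the generated GAPIC pattern.
        return SyncClient.make.__func__(AsyncClient, value, *args, **kwargs)


client = AsyncClient.make(42)
print(type(client).__name__, client.value)  # AsyncClient 42
```

The sync class keeps a single implementation, and the async facade gets identical behavior while still returning instances of itself.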
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/client.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/client.py
@@ -0,0 +1,1011 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import os
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+ cast,
+)
+
+from google.api_core import client_options as client_options_lib
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.exceptions import MutualTLSChannelError # type: ignore
+from google.auth.transport import mtls # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.api_core import operation # type: ignore
+from google.api_core import operation_async # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import empty_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.services.tag_bindings import pagers
+from google.cloud.resourcemanager_v3.types import tag_bindings
+
+from .transports.base import DEFAULT_CLIENT_INFO, TagBindingsTransport
+from .transports.grpc import TagBindingsGrpcTransport
+from .transports.grpc_asyncio import TagBindingsGrpcAsyncIOTransport
+from .transports.rest import TagBindingsRestTransport
+
+
+class TagBindingsClientMeta(type):
+ """Metaclass for the TagBindings client.
+
+ This provides class-level methods for building and retrieving
+ support objects (e.g. transport) without polluting the client instance
+ objects.
+ """
+
+ _transport_registry = OrderedDict() # type: Dict[str, Type[TagBindingsTransport]]
+ _transport_registry["grpc"] = TagBindingsGrpcTransport
+ _transport_registry["grpc_asyncio"] = TagBindingsGrpcAsyncIOTransport
+ _transport_registry["rest"] = TagBindingsRestTransport
+
+ def get_transport_class(
+ cls,
+ label: Optional[str] = None,
+ ) -> Type[TagBindingsTransport]:
+ """Returns an appropriate transport class.
+
+ Args:
+ label: The name of the desired transport. If none is
+ provided, then the first transport in the registry is used.
+
+ Returns:
+ The transport class to use.
+ """
+ # If a specific transport is requested, return that one.
+ if label:
+ return cls._transport_registry[label]
+
+ # No transport is requested; return the default (that is, the first one
+ # in the dictionary).
+ return next(iter(cls._transport_registry.values()))
+
+
+class TagBindingsClient(metaclass=TagBindingsClientMeta):
+ """Allow users to create and manage TagBindings between
+ TagValues and different Google Cloud resources throughout the
+ GCP resource hierarchy.
+ """
+
+ @staticmethod
+ def _get_default_mtls_endpoint(api_endpoint):
+ """Converts api endpoint to mTLS endpoint.
+
+ Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to
+ "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively.
+ Args:
+ api_endpoint (Optional[str]): the api endpoint to convert.
+ Returns:
+ str: converted mTLS api endpoint.
+ """
+ if not api_endpoint:
+ return api_endpoint
+
+ mtls_endpoint_re = re.compile(
+ r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?"
+ )
+
+ m = mtls_endpoint_re.match(api_endpoint)
+ name, mtls, sandbox, googledomain = m.groups()
+ if mtls or not googledomain:
+ return api_endpoint
+
+ if sandbox:
+ return api_endpoint.replace(
+ "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
+ )
+
+ return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")
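The endpoint rewrite above is a pure string transformation. A standalone sketch reproducing the same regex and replacement rules (the `to_mtls_endpoint` name is illustrative, not part of this patch):

```python
import re


def to_mtls_endpoint(api_endpoint):
    # Mirrors _get_default_mtls_endpoint: already-mTLS or non-Google
    # endpoints pass through; sandbox and production hosts get ".mtls"
    # spliced in before "googleapis.com".
    if not api_endpoint:
        return api_endpoint

    mtls_endpoint_re = re.compile(
        r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?"
    )
    m = mtls_endpoint_re.match(api_endpoint)
    name, mtls, sandbox, googledomain = m.groups()
    if mtls or not googledomain:
        return api_endpoint
    if sandbox:
        return api_endpoint.replace(
            "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
        )
    return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")


print(to_mtls_endpoint("cloudresourcemanager.googleapis.com"))
# cloudresourcemanager.mtls.googleapis.com
```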
+
+ DEFAULT_ENDPOINT = "cloudresourcemanager.googleapis.com"
+ DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__( # type: ignore
+ DEFAULT_ENDPOINT
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ TagBindingsClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_info(info)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ TagBindingsClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_file(filename)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ from_service_account_json = from_service_account_file
+
+ @property
+ def transport(self) -> TagBindingsTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ TagBindingsTransport: The transport used by the client
+ instance.
+ """
+ return self._transport
+
+ @staticmethod
+ def tag_binding_path(
+ tag_binding: str,
+ ) -> str:
+ """Returns a fully-qualified tag_binding string."""
+ return "tagBindings/{tag_binding}".format(
+ tag_binding=tag_binding,
+ )
+
+ @staticmethod
+ def parse_tag_binding_path(path: str) -> Dict[str, str]:
+ """Parses a tag_binding path into its component segments."""
+ m = re.match(r"^tagBindings/(?P<tag_binding>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def tag_key_path(
+ tag_key: str,
+ ) -> str:
+ """Returns a fully-qualified tag_key string."""
+ return "tagKeys/{tag_key}".format(
+ tag_key=tag_key,
+ )
+
+ @staticmethod
+ def parse_tag_key_path(path: str) -> Dict[str, str]:
+ """Parses a tag_key path into its component segments."""
+ m = re.match(r"^tagKeys/(?P<tag_key>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def tag_value_path(
+ tag_value: str,
+ ) -> str:
+ """Returns a fully-qualified tag_value string."""
+ return "tagValues/{tag_value}".format(
+ tag_value=tag_value,
+ )
+
+ @staticmethod
+ def parse_tag_value_path(path: str) -> Dict[str, str]:
+ """Parses a tag_value path into its component segments."""
+ m = re.match(r"^tagValues/(?P<tag_value>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_billing_account_path(
+ billing_account: str,
+ ) -> str:
+ """Returns a fully-qualified billing_account string."""
+ return "billingAccounts/{billing_account}".format(
+ billing_account=billing_account,
+ )
+
+ @staticmethod
+ def parse_common_billing_account_path(path: str) -> Dict[str, str]:
+ """Parse a billing_account path into its component segments."""
+ m = re.match(r"^billingAccounts/(?P<billing_account>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_folder_path(
+ folder: str,
+ ) -> str:
+ """Returns a fully-qualified folder string."""
+ return "folders/{folder}".format(
+ folder=folder,
+ )
+
+ @staticmethod
+ def parse_common_folder_path(path: str) -> Dict[str, str]:
+ """Parse a folder path into its component segments."""
+ m = re.match(r"^folders/(?P<folder>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_organization_path(
+ organization: str,
+ ) -> str:
+ """Returns a fully-qualified organization string."""
+ return "organizations/{organization}".format(
+ organization=organization,
+ )
+
+ @staticmethod
+ def parse_common_organization_path(path: str) -> Dict[str, str]:
+        """Parse an organization path into its component segments."""
+ m = re.match(r"^organizations/(?P<organization>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_project_path(
+ project: str,
+ ) -> str:
+ """Returns a fully-qualified project string."""
+ return "projects/{project}".format(
+ project=project,
+ )
+
+ @staticmethod
+ def parse_common_project_path(path: str) -> Dict[str, str]:
+ """Parse a project path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_location_path(
+ project: str,
+ location: str,
+ ) -> str:
+ """Returns a fully-qualified location string."""
+ return "projects/{project}/locations/{location}".format(
+ project=project,
+ location=location,
+ )
+
+ @staticmethod
+ def parse_common_location_path(path: str) -> Dict[str, str]:
+ """Parse a location path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)$", path)
+ return m.groupdict() if m else {}
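Each path helper pair above follows the same shape: a `str.format` template to build the resource name and a regex with named groups to take it apart. A self-contained sketch of the pattern, using the `tagBindings/{tag_binding}` template from this patch:

```python
import re
from typing import Dict


def tag_binding_path(tag_binding: str) -> str:
    # Build the fully-qualified resource name, as tag_binding_path does.
    return "tagBindings/{tag_binding}".format(tag_binding=tag_binding)


def parse_tag_binding_path(path: str) -> Dict[str, str]:
    # Named capture groups recover the template segments; non-matching
    # paths yield an empty dict, as in parse_tag_binding_path.
    m = re.match(r"^tagBindings/(?P<tag_binding>.+?)$", path)
    return m.groupdict() if m else {}


p = tag_binding_path("my-binding")
print(p)                          # tagBindings/my-binding
print(parse_tag_binding_path(p))  # {'tag_binding': 'my-binding'}
```

Because building and parsing share one template, round-tripping a value through the pair always returns the original segments.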
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[client_options_lib.ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ use_client_cert = os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false")
+ use_mtls_endpoint = os.getenv("GOOGLE_API_USE_MTLS_ENDPOINT", "auto")
+ if use_client_cert not in ("true", "false"):
+ raise ValueError(
+ "Environment variable `GOOGLE_API_USE_CLIENT_CERTIFICATE` must be either `true` or `false`"
+ )
+ if use_mtls_endpoint not in ("auto", "never", "always"):
+ raise MutualTLSChannelError(
+ "Environment variable `GOOGLE_API_USE_MTLS_ENDPOINT` must be `never`, `auto` or `always`"
+ )
+
+ # Figure out the client cert source to use.
+ client_cert_source = None
+ if use_client_cert == "true":
+ if client_options.client_cert_source:
+ client_cert_source = client_options.client_cert_source
+ elif mtls.has_default_client_cert_source():
+ client_cert_source = mtls.default_client_cert_source()
+
+ # Figure out which api endpoint to use.
+ if client_options.api_endpoint is not None:
+ api_endpoint = client_options.api_endpoint
+ elif use_mtls_endpoint == "always" or (
+ use_mtls_endpoint == "auto" and client_cert_source
+ ):
+ api_endpoint = cls.DEFAULT_MTLS_ENDPOINT
+ else:
+ api_endpoint = cls.DEFAULT_ENDPOINT
+
+ return api_endpoint, client_cert_source
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Optional[Union[str, TagBindingsTransport]] = None,
+ client_options: Optional[Union[client_options_lib.ClientOptions, dict]] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the tag bindings client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, TagBindingsTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (Optional[Union[google.api_core.client_options.ClientOptions, dict]]): Custom options for the
+ client. It won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ if isinstance(client_options, dict):
+ client_options = client_options_lib.from_dict(client_options)
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ client_options = cast(client_options_lib.ClientOptions, client_options)
+
+ api_endpoint, client_cert_source_func = self.get_mtls_endpoint_and_cert_source(
+ client_options
+ )
+
+ api_key_value = getattr(client_options, "api_key", None)
+ if api_key_value and credentials:
+ raise ValueError(
+ "client_options.api_key and credentials are mutually exclusive"
+ )
+
+ # Save or instantiate the transport.
+ # Ordinarily, we provide the transport, but allowing a custom transport
+ # instance provides an extensibility point for unusual situations.
+ if isinstance(transport, TagBindingsTransport):
+ # transport is a TagBindingsTransport instance.
+ if credentials or client_options.credentials_file or api_key_value:
+ raise ValueError(
+ "When providing a transport instance, "
+ "provide its credentials directly."
+ )
+ if client_options.scopes:
+ raise ValueError(
+ "When providing a transport instance, provide its scopes "
+ "directly."
+ )
+ self._transport = transport
+ else:
+ import google.auth._default # type: ignore
+
+ if api_key_value and hasattr(
+ google.auth._default, "get_api_key_credentials"
+ ):
+ credentials = google.auth._default.get_api_key_credentials(
+ api_key_value
+ )
+
+ Transport = type(self).get_transport_class(transport)
+ self._transport = Transport(
+ credentials=credentials,
+ credentials_file=client_options.credentials_file,
+ host=api_endpoint,
+ scopes=client_options.scopes,
+ client_cert_source_for_mtls=client_cert_source_func,
+ quota_project_id=client_options.quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=True,
+ api_audience=client_options.api_audience,
+ )
+
+ def list_tag_bindings(
+ self,
+ request: Optional[Union[tag_bindings.ListTagBindingsRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListTagBindingsPager:
+ r"""Lists the TagBindings for the given Google Cloud resource, as
+ specified with ``parent``.
+
+ NOTE: The ``parent`` field is expected to be a full resource
+ name:
+ https://cloud.google.com/apis/design/resource_names#full_resource_name
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_list_tag_bindings():
+ # Create a client
+ client = resourcemanager_v3.TagBindingsClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.ListTagBindingsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_tag_bindings(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.ListTagBindingsRequest, dict]):
+ The request object. The request message to list all
+ TagBindings for a parent.
+ parent (str):
+ Required. The full resource name of a
+ resource for which you want to list
+ existing TagBindings. E.g.
+ "//cloudresourcemanager.googleapis.com/projects/123"
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.tag_bindings.pagers.ListTagBindingsPager:
+ The ListTagBindings response.
+ Iterating over this object will yield
+ results and resolve additional pages
+ automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_bindings.ListTagBindingsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_bindings.ListTagBindingsRequest):
+ request = tag_bindings.ListTagBindingsRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_tag_bindings]
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListTagBindingsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def create_tag_binding(
+ self,
+ request: Optional[Union[tag_bindings.CreateTagBindingRequest, dict]] = None,
+ *,
+ tag_binding: Optional[tag_bindings.TagBinding] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Creates a TagBinding between a TagValue and a Google
+ Cloud resource.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_create_tag_binding():
+ # Create a client
+ client = resourcemanager_v3.TagBindingsClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.CreateTagBindingRequest(
+ )
+
+ # Make the request
+ operation = client.create_tag_binding(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.CreateTagBindingRequest, dict]):
+ The request object. The request message to create a
+ TagBinding.
+ tag_binding (google.cloud.resourcemanager_v3.types.TagBinding):
+ Required. The TagBinding to be
+ created.
+
+ This corresponds to the ``tag_binding`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.TagBinding` A TagBinding represents a connection between a TagValue and a cloud
+                resource. Once a TagBinding is created, the TagValue
+ is applied to all the descendants of the Google Cloud
+ resource.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([tag_binding])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_bindings.CreateTagBindingRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_bindings.CreateTagBindingRequest):
+ request = tag_bindings.CreateTagBindingRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if tag_binding is not None:
+ request.tag_binding = tag_binding
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.create_tag_binding]
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ tag_bindings.TagBinding,
+ metadata_type=tag_bindings.CreateTagBindingMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def delete_tag_binding(
+ self,
+ request: Optional[Union[tag_bindings.DeleteTagBindingRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Deletes a TagBinding.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_delete_tag_binding():
+ # Create a client
+ client = resourcemanager_v3.TagBindingsClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.DeleteTagBindingRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.delete_tag_binding(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.DeleteTagBindingRequest, dict]):
+ The request object. The request message to delete a
+ TagBinding.
+ name (str):
+ Required. The name of the TagBinding. This is a String
+ of the form: ``tagBindings/{id}`` (e.g.
+ ``tagBindings/%2F%2Fcloudresourcemanager.googleapis.com%2Fprojects%2F123/tagValues/456``).
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.protobuf.empty_pb2.Empty` A generic empty message that you can re-use to avoid defining duplicated
+ empty messages in your APIs. A typical example is to
+ use it as the request or the response type of an API
+ method. For instance:
+
+ service Foo {
+ rpc Bar(google.protobuf.Empty) returns
+ (google.protobuf.Empty);
+
+ }
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_bindings.DeleteTagBindingRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_bindings.DeleteTagBindingRequest):
+ request = tag_bindings.DeleteTagBindingRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.delete_tag_binding]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ empty_pb2.Empty,
+ metadata_type=tag_bindings.DeleteTagBindingMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_effective_tags(
+ self,
+ request: Optional[Union[tag_bindings.ListEffectiveTagsRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListEffectiveTagsPager:
+ r"""Return a list of effective tags for the given Google Cloud
+ resource, as specified in ``parent``.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_list_effective_tags():
+ # Create a client
+ client = resourcemanager_v3.TagBindingsClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.ListEffectiveTagsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_effective_tags(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.ListEffectiveTagsRequest, dict]):
+ The request object. The request message to
+ ListEffectiveTags
+ parent (str):
+ Required. The full resource name of a
+ resource for which you want to list the
+ effective tags. E.g.
+ "//cloudresourcemanager.googleapis.com/projects/123"
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.tag_bindings.pagers.ListEffectiveTagsPager:
+ The response of ListEffectiveTags.
+ Iterating over this object will yield
+ results and resolve additional pages
+ automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_bindings.ListEffectiveTagsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_bindings.ListEffectiveTagsRequest):
+ request = tag_bindings.ListEffectiveTagsRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_effective_tags]
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListEffectiveTagsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def __enter__(self) -> "TagBindingsClient":
+ return self
+
+ def __exit__(self, type, value, traceback):
+ """Releases underlying transport's resources.
+
+ .. warning::
+ ONLY use as a context manager if the transport is NOT shared
+ with other clients! Exiting the with block will CLOSE the transport
+ and may cause errors in other clients!
+ """
+ self.transport.close()
+
+ def get_operation(
+ self,
+ request: Optional[operations_pb2.GetOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Gets the latest state of a long-running operation.
+
+ Args:
+ request (:class:`~.operations_pb2.GetOperationRequest`):
+ The request object. Request message for
+ `GetOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.Operation:
+ An ``Operation`` object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.GetOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._transport.get_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("TagBindingsClient",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/pagers.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/pagers.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/pagers.py
@@ -0,0 +1,283 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import (
+ Any,
+ AsyncIterator,
+ Awaitable,
+ Callable,
+ Iterator,
+ Optional,
+ Sequence,
+ Tuple,
+)
+
+from google.cloud.resourcemanager_v3.types import tag_bindings
+
+
+class ListTagBindingsPager:
+ """A pager for iterating through ``list_tag_bindings`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.ListTagBindingsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``tag_bindings`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListTagBindings`` requests and continue to iterate
+ through the ``tag_bindings`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.ListTagBindingsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., tag_bindings.ListTagBindingsResponse],
+ request: tag_bindings.ListTagBindingsRequest,
+ response: tag_bindings.ListTagBindingsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.ListTagBindingsRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.ListTagBindingsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = tag_bindings.ListTagBindingsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[tag_bindings.ListTagBindingsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[tag_bindings.TagBinding]:
+ for page in self.pages:
+ yield from page.tag_bindings
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListTagBindingsAsyncPager:
+ """A pager for iterating through ``list_tag_bindings`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.ListTagBindingsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``tag_bindings`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListTagBindings`` requests and continue to iterate
+ through the ``tag_bindings`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.ListTagBindingsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[tag_bindings.ListTagBindingsResponse]],
+ request: tag_bindings.ListTagBindingsRequest,
+ response: tag_bindings.ListTagBindingsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.ListTagBindingsRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.ListTagBindingsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = tag_bindings.ListTagBindingsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[tag_bindings.ListTagBindingsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[tag_bindings.TagBinding]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.tag_bindings:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListEffectiveTagsPager:
+ """A pager for iterating through ``list_effective_tags`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.ListEffectiveTagsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``effective_tags`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListEffectiveTags`` requests and continue to iterate
+ through the ``effective_tags`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.ListEffectiveTagsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., tag_bindings.ListEffectiveTagsResponse],
+ request: tag_bindings.ListEffectiveTagsRequest,
+ response: tag_bindings.ListEffectiveTagsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.ListEffectiveTagsRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.ListEffectiveTagsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = tag_bindings.ListEffectiveTagsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[tag_bindings.ListEffectiveTagsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[tag_bindings.EffectiveTag]:
+ for page in self.pages:
+ yield from page.effective_tags
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListEffectiveTagsAsyncPager:
+ """A pager for iterating through ``list_effective_tags`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.ListEffectiveTagsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``effective_tags`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListEffectiveTags`` requests and continue to iterate
+ through the ``effective_tags`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.ListEffectiveTagsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[tag_bindings.ListEffectiveTagsResponse]],
+ request: tag_bindings.ListEffectiveTagsRequest,
+ response: tag_bindings.ListEffectiveTagsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.ListEffectiveTagsRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.ListEffectiveTagsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = tag_bindings.ListEffectiveTagsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[tag_bindings.ListEffectiveTagsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[tag_bindings.EffectiveTag]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.effective_tags:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/transports/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/transports/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/transports/__init__.py
@@ -0,0 +1,36 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+from typing import Dict, Type
+
+from .base import TagBindingsTransport
+from .grpc import TagBindingsGrpcTransport
+from .grpc_asyncio import TagBindingsGrpcAsyncIOTransport
+from .rest import TagBindingsRestInterceptor, TagBindingsRestTransport
+
+# Compile a registry of transports.
+_transport_registry = OrderedDict() # type: Dict[str, Type[TagBindingsTransport]]
+_transport_registry["grpc"] = TagBindingsGrpcTransport
+_transport_registry["grpc_asyncio"] = TagBindingsGrpcAsyncIOTransport
+_transport_registry["rest"] = TagBindingsRestTransport
+
+__all__ = (
+ "TagBindingsTransport",
+ "TagBindingsGrpcTransport",
+ "TagBindingsGrpcAsyncIOTransport",
+ "TagBindingsRestTransport",
+ "TagBindingsRestInterceptor",
+)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/transports/base.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/transports/base.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/transports/base.py
@@ -0,0 +1,230 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import abc
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Union
+
+import google.api_core
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1, operations_v1
+from google.api_core import retry as retries
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+from google.cloud.resourcemanager_v3.types import tag_bindings
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+class TagBindingsTransport(abc.ABC):
+ """Abstract transport class for TagBindings."""
+
+ AUTH_SCOPES = (
+ "https://www.googleapis.com/auth/cloud-platform",
+ "https://www.googleapis.com/auth/cloud-platform.read-only",
+ )
+
+ DEFAULT_HOST: str = "cloudresourcemanager.googleapis.com"
+
+ def __init__(
+ self,
+ *,
+ host: str = DEFAULT_HOST,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ **kwargs,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+ scopes (Optional[Sequence[str]]): A list of scopes.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ """
+
+ scopes_kwargs = {"scopes": scopes, "default_scopes": self.AUTH_SCOPES}
+
+ # Save the scopes.
+ self._scopes = scopes
+
+ # If no credentials are provided, then determine the appropriate
+ # defaults.
+ if credentials and credentials_file:
+ raise core_exceptions.DuplicateCredentialArgs(
+ "'credentials_file' and 'credentials' are mutually exclusive"
+ )
+
+ if credentials_file is not None:
+ credentials, _ = google.auth.load_credentials_from_file(
+ credentials_file, **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ elif credentials is None:
+ credentials, _ = google.auth.default(
+ **scopes_kwargs, quota_project_id=quota_project_id
+ )
+        # Don't apply the audience if a credentials file was passed by the user.
+ if hasattr(credentials, "with_gdch_audience"):
+ credentials = credentials.with_gdch_audience(
+ api_audience if api_audience else host
+ )
+
+ # If the credentials are service account credentials, then always try to use self signed JWT.
+ if (
+ always_use_jwt_access
+ and isinstance(credentials, service_account.Credentials)
+ and hasattr(service_account.Credentials, "with_always_use_jwt_access")
+ ):
+ credentials = credentials.with_always_use_jwt_access(True)
+
+ # Save the credentials.
+ self._credentials = credentials
+
+ # Save the hostname. Default to port 443 (HTTPS) if none is specified.
+ if ":" not in host:
+ host += ":443"
+ self._host = host
+
+ def _prep_wrapped_messages(self, client_info):
+ # Precompute the wrapped methods.
+ self._wrapped_methods = {
+ self.list_tag_bindings: gapic_v1.method.wrap_method(
+ self.list_tag_bindings,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.create_tag_binding: gapic_v1.method.wrap_method(
+ self.create_tag_binding,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.delete_tag_binding: gapic_v1.method.wrap_method(
+ self.delete_tag_binding,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.list_effective_tags: gapic_v1.method.wrap_method(
+ self.list_effective_tags,
+ default_timeout=None,
+ client_info=client_info,
+ ),
+ }
+
+ def close(self):
+ """Closes resources associated with the transport.
+
+ .. warning::
+ Only call this method if the transport is NOT shared
+ with other clients - this may cause errors in other clients!
+ """
+ raise NotImplementedError()
+
+ @property
+ def operations_client(self):
+ """Return the client designed to process long-running operations."""
+ raise NotImplementedError()
+
+ @property
+ def list_tag_bindings(
+ self,
+ ) -> Callable[
+ [tag_bindings.ListTagBindingsRequest],
+ Union[
+ tag_bindings.ListTagBindingsResponse,
+ Awaitable[tag_bindings.ListTagBindingsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def create_tag_binding(
+ self,
+ ) -> Callable[
+ [tag_bindings.CreateTagBindingRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def delete_tag_binding(
+ self,
+ ) -> Callable[
+ [tag_bindings.DeleteTagBindingRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_effective_tags(
+ self,
+ ) -> Callable[
+ [tag_bindings.ListEffectiveTagsRequest],
+ Union[
+ tag_bindings.ListEffectiveTagsResponse,
+ Awaitable[tag_bindings.ListEffectiveTagsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[
+ [operations_pb2.GetOperationRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def kind(self) -> str:
+ raise NotImplementedError()
+
+
+__all__ = ("TagBindingsTransport",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/transports/grpc.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/transports/grpc.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/transports/grpc.py
@@ -0,0 +1,389 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers, operations_v1
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+import grpc # type: ignore
+
+from google.cloud.resourcemanager_v3.types import tag_bindings
+
+from .base import DEFAULT_CLIENT_INFO, TagBindingsTransport
+
+
+class TagBindingsGrpcTransport(TagBindingsTransport):
+ """gRPC backend transport for TagBindings.
+
+ Allow users to create and manage TagBindings between
+ TagValues and different Google Cloud resources throughout the
+ GCP resource hierarchy.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _stubs: Dict[str, Callable]
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[grpc.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ channel (Optional[grpc.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+ self._operations_client: Optional[operations_v1.OperationsClient] = None
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> grpc.Channel:
+ """Create and return a gRPC channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ grpc.Channel: A gRPC channel object.
+
+ Raises:
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+
+ return grpc_helpers.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ @property
+ def grpc_channel(self) -> grpc.Channel:
+ """Return the channel designed to connect to this service."""
+ return self._grpc_channel
+
+ @property
+ def operations_client(self) -> operations_v1.OperationsClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Quick check: Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ self._operations_client = operations_v1.OperationsClient(self.grpc_channel)
+
+ # Return the client from cache.
+ return self._operations_client
+
+ @property
+ def list_tag_bindings(
+ self,
+ ) -> Callable[
+ [tag_bindings.ListTagBindingsRequest], tag_bindings.ListTagBindingsResponse
+ ]:
+ r"""Return a callable for the list tag bindings method over gRPC.
+
+ Lists the TagBindings for the given Google Cloud resource, as
+ specified with ``parent``.
+
+ NOTE: The ``parent`` field is expected to be a full resource
+ name:
+ https://cloud.google.com/apis/design/resource_names#full_resource_name
+
+ Returns:
+ Callable[[~.ListTagBindingsRequest],
+ ~.ListTagBindingsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_tag_bindings" not in self._stubs:
+ self._stubs["list_tag_bindings"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagBindings/ListTagBindings",
+ request_serializer=tag_bindings.ListTagBindingsRequest.serialize,
+ response_deserializer=tag_bindings.ListTagBindingsResponse.deserialize,
+ )
+ return self._stubs["list_tag_bindings"]
+
+ @property
+ def create_tag_binding(
+ self,
+ ) -> Callable[[tag_bindings.CreateTagBindingRequest], operations_pb2.Operation]:
+ r"""Return a callable for the create tag binding method over gRPC.
+
+ Creates a TagBinding between a TagValue and a Google
+ Cloud resource.
+
+ Returns:
+ Callable[[~.CreateTagBindingRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_tag_binding" not in self._stubs:
+ self._stubs["create_tag_binding"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagBindings/CreateTagBinding",
+ request_serializer=tag_bindings.CreateTagBindingRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["create_tag_binding"]
+
+ @property
+ def delete_tag_binding(
+ self,
+ ) -> Callable[[tag_bindings.DeleteTagBindingRequest], operations_pb2.Operation]:
+ r"""Return a callable for the delete tag binding method over gRPC.
+
+ Deletes a TagBinding.
+
+ Returns:
+ Callable[[~.DeleteTagBindingRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_tag_binding" not in self._stubs:
+ self._stubs["delete_tag_binding"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagBindings/DeleteTagBinding",
+ request_serializer=tag_bindings.DeleteTagBindingRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["delete_tag_binding"]
+
+ @property
+ def list_effective_tags(
+ self,
+ ) -> Callable[
+ [tag_bindings.ListEffectiveTagsRequest], tag_bindings.ListEffectiveTagsResponse
+ ]:
+ r"""Return a callable for the list effective tags method over gRPC.
+
+ Return a list of effective tags for the given Google Cloud
+ resource, as specified in ``parent``.
+
+ Returns:
+ Callable[[~.ListEffectiveTagsRequest],
+ ~.ListEffectiveTagsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_effective_tags" not in self._stubs:
+ self._stubs["list_effective_tags"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagBindings/ListEffectiveTags",
+ request_serializer=tag_bindings.ListEffectiveTagsRequest.serialize,
+ response_deserializer=tag_bindings.ListEffectiveTagsResponse.deserialize,
+ )
+ return self._stubs["list_effective_tags"]
+
+ def close(self):
+ self.grpc_channel.close()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]:
+ r"""Return a callable for the get_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_operation" not in self._stubs:
+ self._stubs["get_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/GetOperation",
+ request_serializer=operations_pb2.GetOperationRequest.SerializeToString,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["get_operation"]
+
+ @property
+ def kind(self) -> str:
+ return "grpc"
+
+
+__all__ = ("TagBindingsGrpcTransport",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/transports/grpc_asyncio.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/transports/grpc_asyncio.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/transports/grpc_asyncio.py
@@ -0,0 +1,396 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers_async, operations_v1
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+import grpc # type: ignore
+from grpc.experimental import aio # type: ignore
+
+from google.cloud.resourcemanager_v3.types import tag_bindings
+
+from .base import DEFAULT_CLIENT_INFO, TagBindingsTransport
+from .grpc import TagBindingsGrpcTransport
+
+
+class TagBindingsGrpcAsyncIOTransport(TagBindingsTransport):
+ """gRPC AsyncIO backend transport for TagBindings.
+
+ Allow users to create and manage TagBindings between
+ TagValues and different Google Cloud resources throughout the
+ GCP resource hierarchy.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _grpc_channel: aio.Channel
+ _stubs: Dict[str, Callable] = {}
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> aio.Channel:
+ """Create and return a gRPC AsyncIO channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+                This argument is mutually exclusive with credentials.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ aio.Channel: A gRPC AsyncIO channel object.
+ """
+
+ return grpc_helpers_async.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[aio.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ channel (Optional[aio.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+ self._operations_client: Optional[operations_v1.OperationsAsyncClient] = None
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def grpc_channel(self) -> aio.Channel:
+ """Create the channel designed to connect to this service.
+
+ This property caches on the instance; repeated calls return
+ the same channel.
+ """
+ # Return the channel from cache.
+ return self._grpc_channel
+
+ @property
+ def operations_client(self) -> operations_v1.OperationsAsyncClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Quick check: Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ self._operations_client = operations_v1.OperationsAsyncClient(
+ self.grpc_channel
+ )
+
+ # Return the client from cache.
+ return self._operations_client
+
+ @property
+ def list_tag_bindings(
+ self,
+ ) -> Callable[
+ [tag_bindings.ListTagBindingsRequest],
+ Awaitable[tag_bindings.ListTagBindingsResponse],
+ ]:
+ r"""Return a callable for the list tag bindings method over gRPC.
+
+ Lists the TagBindings for the given Google Cloud resource, as
+ specified with ``parent``.
+
+ NOTE: The ``parent`` field is expected to be a full resource
+ name:
+ https://cloud.google.com/apis/design/resource_names#full_resource_name
+
+ Returns:
+ Callable[[~.ListTagBindingsRequest],
+ Awaitable[~.ListTagBindingsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_tag_bindings" not in self._stubs:
+ self._stubs["list_tag_bindings"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagBindings/ListTagBindings",
+ request_serializer=tag_bindings.ListTagBindingsRequest.serialize,
+ response_deserializer=tag_bindings.ListTagBindingsResponse.deserialize,
+ )
+ return self._stubs["list_tag_bindings"]
+
+ @property
+ def create_tag_binding(
+ self,
+ ) -> Callable[
+ [tag_bindings.CreateTagBindingRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the create tag binding method over gRPC.
+
+ Creates a TagBinding between a TagValue and a Google
+ Cloud resource.
+
+ Returns:
+ Callable[[~.CreateTagBindingRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_tag_binding" not in self._stubs:
+ self._stubs["create_tag_binding"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagBindings/CreateTagBinding",
+ request_serializer=tag_bindings.CreateTagBindingRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["create_tag_binding"]
+
+ @property
+ def delete_tag_binding(
+ self,
+ ) -> Callable[
+ [tag_bindings.DeleteTagBindingRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the delete tag binding method over gRPC.
+
+ Deletes a TagBinding.
+
+ Returns:
+ Callable[[~.DeleteTagBindingRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_tag_binding" not in self._stubs:
+ self._stubs["delete_tag_binding"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagBindings/DeleteTagBinding",
+ request_serializer=tag_bindings.DeleteTagBindingRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["delete_tag_binding"]
+
+ @property
+ def list_effective_tags(
+ self,
+ ) -> Callable[
+ [tag_bindings.ListEffectiveTagsRequest],
+ Awaitable[tag_bindings.ListEffectiveTagsResponse],
+ ]:
+ r"""Return a callable for the list effective tags method over gRPC.
+
+ Return a list of effective tags for the given Google Cloud
+ resource, as specified in ``parent``.
+
+ Returns:
+ Callable[[~.ListEffectiveTagsRequest],
+ Awaitable[~.ListEffectiveTagsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_effective_tags" not in self._stubs:
+ self._stubs["list_effective_tags"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagBindings/ListEffectiveTags",
+ request_serializer=tag_bindings.ListEffectiveTagsRequest.serialize,
+ response_deserializer=tag_bindings.ListEffectiveTagsResponse.deserialize,
+ )
+ return self._stubs["list_effective_tags"]
+
+ def close(self):
+ return self.grpc_channel.close()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]:
+ r"""Return a callable for the get_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_operation" not in self._stubs:
+ self._stubs["get_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/GetOperation",
+ request_serializer=operations_pb2.GetOperationRequest.SerializeToString,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["get_operation"]
+
+
+__all__ = ("TagBindingsGrpcAsyncIOTransport",)
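Both gRPC transports in this patch rely on the same lazy stub-caching idiom: each RPC property creates its `unary_unary` stub on first access and returns the cached stub afterwards, so at most one stub exists per method per channel. A minimal, self-contained sketch of that pattern (the `FakeChannel` and `MiniTransport` names are illustrative only, not part of the generated library):

```python
class FakeChannel:
    """Stand-in for a grpc channel that records stub creations."""

    def __init__(self):
        self.created = []

    def unary_unary(self, path, request_serializer=None, response_deserializer=None):
        # Record each stub creation so caching behavior is observable.
        self.created.append(path)
        return lambda request: f"called {path}"


class MiniTransport:
    def __init__(self, channel):
        self._channel = channel
        self._stubs = {}

    @property
    def list_tag_bindings(self):
        # Create the stub on first access, then return the cached one.
        if "list_tag_bindings" not in self._stubs:
            self._stubs["list_tag_bindings"] = self._channel.unary_unary(
                "/google.cloud.resourcemanager.v3.TagBindings/ListTagBindings"
            )
        return self._stubs["list_tag_bindings"]


channel = FakeChannel()
transport = MiniTransport(channel)
transport.list_tag_bindings  # first access creates the stub
transport.list_tag_bindings  # second access hits the cache
print(len(channel.created))  # -> 1
```

The same shape appears in every `@property` above; only the method path and the serializer/deserializer pair differ.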
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/transports/rest.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/transports/rest.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_bindings/transports/rest.py
@@ -0,0 +1,844 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import dataclasses
+import json # type: ignore
+import re
+from typing import Any, Callable, Dict, List, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import (
+ gapic_v1,
+ operations_v1,
+ path_template,
+ rest_helpers,
+ rest_streaming,
+)
+from google.api_core import exceptions as core_exceptions
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.auth.transport.requests import AuthorizedSession # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import json_format
+import grpc # type: ignore
+from requests import __version__ as requests_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+
+from google.longrunning import operations_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.types import tag_bindings
+
+from .base import DEFAULT_CLIENT_INFO as BASE_DEFAULT_CLIENT_INFO
+from .base import TagBindingsTransport
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=BASE_DEFAULT_CLIENT_INFO.gapic_version,
+ grpc_version=None,
+ rest_version=requests_version,
+)
+
+
+class TagBindingsRestInterceptor:
+ """Interceptor for TagBindings.
+
+ Interceptors are used to manipulate requests, request metadata, and responses
+ in arbitrary ways.
+ Example use cases include:
+ * Logging
+ * Verifying requests according to service or custom semantics
+ * Stripping extraneous information from responses
+
+ These use cases and more can be enabled by injecting an
+ instance of a custom subclass when constructing the TagBindingsRestTransport.
+
+ .. code-block:: python
+ class MyCustomTagBindingsInterceptor(TagBindingsRestInterceptor):
+ def pre_create_tag_binding(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_create_tag_binding(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_delete_tag_binding(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_delete_tag_binding(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_effective_tags(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_effective_tags(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_tag_bindings(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_tag_bindings(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ transport = TagBindingsRestTransport(interceptor=MyCustomTagBindingsInterceptor())
+ client = TagBindingsClient(transport=transport)
+
+
+ """
+
+ def pre_create_tag_binding(
+ self,
+ request: tag_bindings.CreateTagBindingRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[tag_bindings.CreateTagBindingRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for create_tag_binding
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagBindings server.
+ """
+ return request, metadata
+
+ def post_create_tag_binding(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for create_tag_binding
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagBindings server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_delete_tag_binding(
+ self,
+ request: tag_bindings.DeleteTagBindingRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[tag_bindings.DeleteTagBindingRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for delete_tag_binding
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagBindings server.
+ """
+ return request, metadata
+
+ def post_delete_tag_binding(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for delete_tag_binding
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagBindings server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_effective_tags(
+ self,
+ request: tag_bindings.ListEffectiveTagsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[tag_bindings.ListEffectiveTagsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_effective_tags
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagBindings server.
+ """
+ return request, metadata
+
+ def post_list_effective_tags(
+ self, response: tag_bindings.ListEffectiveTagsResponse
+ ) -> tag_bindings.ListEffectiveTagsResponse:
+ """Post-rpc interceptor for list_effective_tags
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagBindings server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_tag_bindings(
+ self,
+ request: tag_bindings.ListTagBindingsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[tag_bindings.ListTagBindingsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_tag_bindings
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagBindings server.
+ """
+ return request, metadata
+
+ def post_list_tag_bindings(
+ self, response: tag_bindings.ListTagBindingsResponse
+ ) -> tag_bindings.ListTagBindingsResponse:
+ """Post-rpc interceptor for list_tag_bindings
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagBindings server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_operation(
+ self,
+ request: operations_pb2.GetOperationRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[operations_pb2.GetOperationRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_operation
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagBindings server.
+ """
+ return request, metadata
+
+ def post_get_operation(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for get_operation
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagBindings server but before
+ it is returned to user code.
+ """
+ return response
+
+
+@dataclasses.dataclass
+class TagBindingsRestStub:
+ _session: AuthorizedSession
+ _host: str
+ _interceptor: TagBindingsRestInterceptor
+
+
+class TagBindingsRestTransport(TagBindingsTransport):
+ """REST backend transport for TagBindings.
+
+ Allow users to create and manage TagBindings between
+ TagValues and different Google Cloud resources throughout the
+ GCP resource hierarchy.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends JSON representations of protocol buffers over HTTP/1.1
+
+ """
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ url_scheme: str = "https",
+ interceptor: Optional[TagBindingsRestInterceptor] = None,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Callable[[], Tuple[bytes, bytes]]): Client
+ certificate to configure mutual TLS HTTP channel. It is ignored
+ if ``channel`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you are developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ url_scheme: the protocol scheme for the API endpoint. Normally
+ "https", but for testing or local servers,
+ "http" can be specified.
+ """
+ # Run the base constructor
+ # TODO(yon-mg): resolve other ctor params i.e. scopes, quota, etc.
+ # TODO: When custom host (api_endpoint) is set, `scopes` must *also* be set on the
+ # credentials object
+ maybe_url_match = re.match("^(?P<scheme>http(?:s)?://)?(?P<host>.*)$", host)
+ if maybe_url_match is None:
+ raise ValueError(
+ f"Unexpected hostname structure: {host}"
+ ) # pragma: NO COVER
+
+ url_match_items = maybe_url_match.groupdict()
+
+ host = f"{url_scheme}://{host}" if not url_match_items["scheme"] else host
+
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+ self._session = AuthorizedSession(
+ self._credentials, default_host=self.DEFAULT_HOST
+ )
+ self._operations_client: Optional[operations_v1.AbstractOperationsClient] = None
+ if client_cert_source_for_mtls:
+ self._session.configure_mtls_channel(client_cert_source_for_mtls)
+ self._interceptor = interceptor or TagBindingsRestInterceptor()
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def operations_client(self) -> operations_v1.AbstractOperationsClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ http_options: Dict[str, List[Dict[str, str]]] = {
+ "google.longrunning.Operations.GetOperation": [
+ {
+ "method": "get",
+ "uri": "/v3/{name=operations/**}",
+ },
+ ],
+ }
+
+ rest_transport = operations_v1.OperationsRestTransport(
+ host=self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ scopes=self._scopes,
+ http_options=http_options,
+ path_prefix="v3",
+ )
+
+ self._operations_client = operations_v1.AbstractOperationsClient(
+ transport=rest_transport
+ )
+
+ # Return the client from cache.
+ return self._operations_client
+
+ class _CreateTagBinding(TagBindingsRestStub):
+ def __hash__(self):
+ return hash("CreateTagBinding")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_bindings.CreateTagBindingRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the create tag binding method over HTTP.
+
+ Args:
+ request (~.tag_bindings.CreateTagBindingRequest):
+ The request object. The request message to create a
+ TagBinding.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/tagBindings",
+ "body": "tag_binding",
+ },
+ ]
+ request, metadata = self._interceptor.pre_create_tag_binding(
+ request, metadata
+ )
+ pb_request = tag_bindings.CreateTagBindingRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_create_tag_binding(resp)
+ return resp
+
+ class _DeleteTagBinding(TagBindingsRestStub):
+ def __hash__(self):
+ return hash("DeleteTagBinding")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_bindings.DeleteTagBindingRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the delete tag binding method over HTTP.
+
+ Args:
+ request (~.tag_bindings.DeleteTagBindingRequest):
+ The request object. The request message to delete a
+ TagBinding.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "delete",
+ "uri": "/v3/{name=tagBindings/**}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_delete_tag_binding(
+ request, metadata
+ )
+ pb_request = tag_bindings.DeleteTagBindingRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_delete_tag_binding(resp)
+ return resp
+
+ class _ListEffectiveTags(TagBindingsRestStub):
+ def __hash__(self):
+ return hash("ListEffectiveTags")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "parent": "",
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_bindings.ListEffectiveTagsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_bindings.ListEffectiveTagsResponse:
+ r"""Call the list effective tags method over HTTP.
+
+ Args:
+ request (~.tag_bindings.ListEffectiveTagsRequest):
+ The request object. The request message to
+ ListEffectiveTags
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.tag_bindings.ListEffectiveTagsResponse:
+ The response of ListEffectiveTags.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/effectiveTags",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_effective_tags(
+ request, metadata
+ )
+ pb_request = tag_bindings.ListEffectiveTagsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = tag_bindings.ListEffectiveTagsResponse()
+ pb_resp = tag_bindings.ListEffectiveTagsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_effective_tags(resp)
+ return resp
+
+ class _ListTagBindings(TagBindingsRestStub):
+ def __hash__(self):
+ return hash("ListTagBindings")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "parent": "",
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_bindings.ListTagBindingsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_bindings.ListTagBindingsResponse:
+ r"""Call the list tag bindings method over HTTP.
+
+ Args:
+ request (~.tag_bindings.ListTagBindingsRequest):
+ The request object. The request message to list all
+ TagBindings for a parent.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.tag_bindings.ListTagBindingsResponse:
+ The ListTagBindings response.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/tagBindings",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_tag_bindings(
+ request, metadata
+ )
+ pb_request = tag_bindings.ListTagBindingsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = tag_bindings.ListTagBindingsResponse()
+ pb_resp = tag_bindings.ListTagBindingsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_tag_bindings(resp)
+ return resp
+
+ @property
+ def create_tag_binding(
+ self,
+ ) -> Callable[[tag_bindings.CreateTagBindingRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._CreateTagBinding(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def delete_tag_binding(
+ self,
+ ) -> Callable[[tag_bindings.DeleteTagBindingRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._DeleteTagBinding(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_effective_tags(
+ self,
+ ) -> Callable[
+ [tag_bindings.ListEffectiveTagsRequest], tag_bindings.ListEffectiveTagsResponse
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListEffectiveTags(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_tag_bindings(
+ self,
+ ) -> Callable[
+ [tag_bindings.ListTagBindingsRequest], tag_bindings.ListTagBindingsResponse
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListTagBindings(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_operation(self):
+ return self._GetOperation(self._session, self._host, self._interceptor) # type: ignore
+
+ class _GetOperation(TagBindingsRestStub):
+ def __call__(
+ self,
+ request: operations_pb2.GetOperationRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+
+ r"""Call the get operation method over HTTP.
+
+ Args:
+ request (operations_pb2.GetOperationRequest):
+ The request object for GetOperation method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ operations_pb2.Operation: Response from GetOperation method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/{name=operations/**}",
+ },
+ ]
+
+ request, metadata = self._interceptor.pre_get_operation(request, metadata)
+ request_kwargs = json_format.MessageToDict(request)
+ transcoded_request = path_template.transcode(http_options, **request_kwargs)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(json.dumps(transcoded_request["query_params"]))
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ resp = operations_pb2.Operation()
+ resp = json_format.Parse(response.content.decode("utf-8"), resp)
+ resp = self._interceptor.post_get_operation(resp)
+ return resp
+
+ @property
+ def kind(self) -> str:
+ return "rest"
+
+ def close(self):
+ self._session.close()
+
+
+__all__ = ("TagBindingsRestTransport",)
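The REST stubs above default any required-but-unset query parameter (see `_get_unset_required_fields` in `_ListTagBindings`). As a standalone, stdlib-only sketch of that pattern, with the `"parent"` default taken from the generated stub:

```python
# Sketch of the generated stubs' required-field defaulting: any required
# field missing from the transcoded query params is re-added with its
# proto default value, so the server always receives it explicitly.
REQUIRED_FIELDS_DEFAULT_VALUES = {"parent": ""}

def get_unset_required_fields(message_dict):
    # Return defaults only for required keys the caller did not set.
    return {
        k: v
        for k, v in REQUIRED_FIELDS_DEFAULT_VALUES.items()
        if k not in message_dict
    }

query_params = {"pageSize": 50}
query_params.update(get_unset_required_fields(query_params))
# query_params now also carries parent="" alongside the caller's values.
```

This mirrors the generated code's behavior without pulling in protobuf transcoding; the real stubs apply it to the output of `path_template.transcode`.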
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/__init__.py
@@ -0,0 +1,22 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .async_client import TagHoldsAsyncClient
+from .client import TagHoldsClient
+
+__all__ = (
+ "TagHoldsClient",
+ "TagHoldsAsyncClient",
+)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/async_client.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/async_client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/async_client.py
@@ -0,0 +1,657 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import functools
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+)
+
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.api_core.client_options import ClientOptions
+from google.auth import credentials as ga_credentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.api_core import operation # type: ignore
+from google.api_core import operation_async # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import empty_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.services.tag_holds import pagers
+from google.cloud.resourcemanager_v3.types import tag_holds
+
+from .client import TagHoldsClient
+from .transports.base import DEFAULT_CLIENT_INFO, TagHoldsTransport
+from .transports.grpc_asyncio import TagHoldsGrpcAsyncIOTransport
+
+
+class TagHoldsAsyncClient:
+ """Allow users to create and manage TagHolds for TagValues.
+ TagHolds represent the use of a Tag Value that is not captured
+ by TagBindings but should still block TagValue deletion (such as
+ a reference in a policy condition). This service provides
+ isolated failure domains by cloud location so that TagHolds can
+ be managed in the same location as their usage.
+ """
+
+ _client: TagHoldsClient
+
+ DEFAULT_ENDPOINT = TagHoldsClient.DEFAULT_ENDPOINT
+ DEFAULT_MTLS_ENDPOINT = TagHoldsClient.DEFAULT_MTLS_ENDPOINT
+
+ tag_hold_path = staticmethod(TagHoldsClient.tag_hold_path)
+ parse_tag_hold_path = staticmethod(TagHoldsClient.parse_tag_hold_path)
+ common_billing_account_path = staticmethod(
+ TagHoldsClient.common_billing_account_path
+ )
+ parse_common_billing_account_path = staticmethod(
+ TagHoldsClient.parse_common_billing_account_path
+ )
+ common_folder_path = staticmethod(TagHoldsClient.common_folder_path)
+ parse_common_folder_path = staticmethod(TagHoldsClient.parse_common_folder_path)
+ common_organization_path = staticmethod(TagHoldsClient.common_organization_path)
+ parse_common_organization_path = staticmethod(
+ TagHoldsClient.parse_common_organization_path
+ )
+ common_project_path = staticmethod(TagHoldsClient.common_project_path)
+ parse_common_project_path = staticmethod(TagHoldsClient.parse_common_project_path)
+ common_location_path = staticmethod(TagHoldsClient.common_location_path)
+ parse_common_location_path = staticmethod(TagHoldsClient.parse_common_location_path)
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ TagHoldsAsyncClient: The constructed client.
+ """
+ return TagHoldsClient.from_service_account_info.__func__(TagHoldsAsyncClient, info, *args, **kwargs) # type: ignore
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ TagHoldsAsyncClient: The constructed client.
+ """
+ return TagHoldsClient.from_service_account_file.__func__(TagHoldsAsyncClient, filename, *args, **kwargs) # type: ignore
+
+ from_service_account_json = from_service_account_file
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ return TagHoldsClient.get_mtls_endpoint_and_cert_source(client_options) # type: ignore
+
+ @property
+ def transport(self) -> TagHoldsTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ TagHoldsTransport: The transport used by the client instance.
+ """
+ return self._client.transport
+
+ get_transport_class = functools.partial(
+ type(TagHoldsClient).get_transport_class, type(TagHoldsClient)
+ )
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Union[str, TagHoldsTransport] = "grpc_asyncio",
+ client_options: Optional[ClientOptions] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the tag holds client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, ~.TagHoldsTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (ClientOptions): Custom options for the client. It
+ won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+
+ Raises:
+ google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ self._client = TagHoldsClient(
+ credentials=credentials,
+ transport=transport,
+ client_options=client_options,
+ client_info=client_info,
+ )
+
+ async def create_tag_hold(
+ self,
+ request: Optional[Union[tag_holds.CreateTagHoldRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ tag_hold: Optional[tag_holds.TagHold] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Creates a TagHold. Returns ALREADY_EXISTS if a TagHold with the
+ same resource and origin exists under the same TagValue.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_create_tag_hold():
+ # Create a client
+ client = resourcemanager_v3.TagHoldsAsyncClient()
+
+ # Initialize request argument(s)
+ tag_hold = resourcemanager_v3.TagHold()
+ tag_hold.holder = "holder_value"
+
+ request = resourcemanager_v3.CreateTagHoldRequest(
+ parent="parent_value",
+ tag_hold=tag_hold,
+ )
+
+ # Make the request
+ operation = client.create_tag_hold(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.CreateTagHoldRequest, dict]]):
+ The request object. The request message to create a
+ TagHold.
+ parent (:class:`str`):
+ Required. The resource name of the TagHold's parent
+ TagValue. Must be of the form:
+ ``tagValues/{tag-value-id}``.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ tag_hold (:class:`google.cloud.resourcemanager_v3.types.TagHold`):
+ Required. The TagHold to be created.
+ This corresponds to the ``tag_hold`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.TagHold` A TagHold represents the use of a TagValue that is not captured by
+ TagBindings. If a TagValue has any TagHolds, deletion
+ will be blocked. This resource is intended to be
+ created in the same cloud location as the holder.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent, tag_hold])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_holds.CreateTagHoldRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+ if tag_hold is not None:
+ request.tag_hold = tag_hold
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.create_tag_hold,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ tag_holds.TagHold,
+ metadata_type=tag_holds.CreateTagHoldMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def delete_tag_hold(
+ self,
+ request: Optional[Union[tag_holds.DeleteTagHoldRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Deletes a TagHold.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_delete_tag_hold():
+ # Create a client
+ client = resourcemanager_v3.TagHoldsAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.DeleteTagHoldRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.delete_tag_hold(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.DeleteTagHoldRequest, dict]]):
+ The request object. The request message to delete a
+ TagHold.
+ name (:class:`str`):
+ Required. The resource name of the TagHold to delete.
+ Must be of the form:
+ ``tagValues/{tag-value-id}/tagHolds/{tag-hold-id}``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.protobuf.empty_pb2.Empty` A generic empty message that you can re-use to avoid defining duplicated
+ empty messages in your APIs. A typical example is to
+ use it as the request or the response type of an API
+ method. For instance:
+
+ service Foo {
+ rpc Bar(google.protobuf.Empty) returns
+ (google.protobuf.Empty);
+
+ }
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_holds.DeleteTagHoldRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.delete_tag_hold,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ empty_pb2.Empty,
+ metadata_type=tag_holds.DeleteTagHoldMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_tag_holds(
+ self,
+ request: Optional[Union[tag_holds.ListTagHoldsRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListTagHoldsAsyncPager:
+ r"""Lists TagHolds under a TagValue.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_list_tag_holds():
+ # Create a client
+ client = resourcemanager_v3.TagHoldsAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.ListTagHoldsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_tag_holds(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.ListTagHoldsRequest, dict]]):
+ The request object. The request message for listing the
+ TagHolds under a TagValue.
+ parent (:class:`str`):
+ Required. The resource name of the parent TagValue. Must
+ be of the form: ``tagValues/{tag-value-id}``.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.tag_holds.pagers.ListTagHoldsAsyncPager:
+ The ListTagHolds response.
+ Iterating over this object will yield
+ results and resolve additional pages
+ automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_holds.ListTagHoldsRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_tag_holds,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListTagHoldsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_operation(
+ self,
+ request: Optional[operations_pb2.GetOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Gets the latest state of a long-running operation.
+
+ Args:
+ request (:class:`~.operations_pb2.GetOperationRequest`):
+ The request object. Request message for
+ `GetOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.Operation:
+ An ``Operation`` object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.GetOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._client._transport.get_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def __aenter__(self):
+ return self
+
+ async def __aexit__(self, exc_type, exc, tb):
+ await self.transport.close()
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("TagHoldsAsyncClient",)
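Each flattened RPC wrapper above (`create_tag_hold`, `delete_tag_hold`, `list_tag_holds`) begins with the same mutual-exclusion check: callers may pass either a full `request` object or the individual field arguments, never both. A minimal stdlib-only sketch of that check, using a plain dict as a stand-in for the proto request type:

```python
# Sketch of the generated clients' request-coercion logic: reject mixing a
# full request object with flattened field arguments, then apply any
# flattened arguments onto the request.
def coerce_request(request=None, *, parent=None, tag_hold=None):
    has_flattened_params = any(v is not None for v in (parent, tag_hold))
    if request is not None and has_flattened_params:
        raise ValueError(
            "If the `request` argument is set, then none of "
            "the individual field arguments should be set."
        )
    # A dict stands in here for tag_holds.CreateTagHoldRequest.
    request = dict(request) if request is not None else {}
    if parent is not None:
        request["parent"] = parent
    if tag_hold is not None:
        request["tag_hold"] = tag_hold
    return request
```

In the generated code the coercion step also converts a dict `request` into the proto-plus message; this sketch keeps only the control flow.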
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/client.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/client.py
@@ -0,0 +1,894 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import os
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+ cast,
+)
+
+from google.api_core import client_options as client_options_lib
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.exceptions import MutualTLSChannelError # type: ignore
+from google.auth.transport import mtls # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.api_core import operation # type: ignore
+from google.api_core import operation_async # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import empty_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.services.tag_holds import pagers
+from google.cloud.resourcemanager_v3.types import tag_holds
+
+from .transports.base import DEFAULT_CLIENT_INFO, TagHoldsTransport
+from .transports.grpc import TagHoldsGrpcTransport
+from .transports.grpc_asyncio import TagHoldsGrpcAsyncIOTransport
+from .transports.rest import TagHoldsRestTransport
+
+
+class TagHoldsClientMeta(type):
+ """Metaclass for the TagHolds client.
+
+ This provides class-level methods for building and retrieving
+ support objects (e.g. transport) without polluting the client instance
+ objects.
+ """
+
+ _transport_registry = OrderedDict() # type: Dict[str, Type[TagHoldsTransport]]
+ _transport_registry["grpc"] = TagHoldsGrpcTransport
+ _transport_registry["grpc_asyncio"] = TagHoldsGrpcAsyncIOTransport
+ _transport_registry["rest"] = TagHoldsRestTransport
+
+ def get_transport_class(
+ cls,
+ label: Optional[str] = None,
+ ) -> Type[TagHoldsTransport]:
+ """Returns an appropriate transport class.
+
+ Args:
+ label: The name of the desired transport. If none is
+ provided, then the first transport in the registry is used.
+
+ Returns:
+ The transport class to use.
+ """
+ # If a specific transport is requested, return that one.
+ if label:
+ return cls._transport_registry[label]
+
+ # No transport is requested; return the default (that is, the first one
+ # in the dictionary).
+ return next(iter(cls._transport_registry.values()))
+
+
+class TagHoldsClient(metaclass=TagHoldsClientMeta):
+ """Allow users to create and manage TagHolds for TagValues.
+ TagHolds represent the use of a Tag Value that is not captured
+ by TagBindings but should still block TagValue deletion (such as
+ a reference in a policy condition). This service provides
+ isolated failure domains by cloud location so that TagHolds can
+ be managed in the same location as their usage.
+ """
+
+ @staticmethod
+ def _get_default_mtls_endpoint(api_endpoint):
+ """Converts api endpoint to mTLS endpoint.
+
+ Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to
+ "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively.
+ Args:
+ api_endpoint (Optional[str]): the api endpoint to convert.
+ Returns:
+ str: converted mTLS api endpoint.
+ """
+ if not api_endpoint:
+ return api_endpoint
+
+ mtls_endpoint_re = re.compile(
+ r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?"
+ )
+
+ m = mtls_endpoint_re.match(api_endpoint)
+ name, mtls, sandbox, googledomain = m.groups()
+ if mtls or not googledomain:
+ return api_endpoint
+
+ if sandbox:
+ return api_endpoint.replace(
+ "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
+ )
+
+ return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")
+
+ DEFAULT_ENDPOINT = "cloudresourcemanager.googleapis.com"
+ DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__( # type: ignore
+ DEFAULT_ENDPOINT
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ TagHoldsClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_info(info)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ TagHoldsClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_file(filename)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ from_service_account_json = from_service_account_file
+
+ @property
+ def transport(self) -> TagHoldsTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ TagHoldsTransport: The transport used by the client
+ instance.
+ """
+ return self._transport
+
+ @staticmethod
+ def tag_hold_path(
+ tag_value: str,
+ tag_hold: str,
+ ) -> str:
+ """Returns a fully-qualified tag_hold string."""
+ return "tagValues/{tag_value}/tagHolds/{tag_hold}".format(
+ tag_value=tag_value,
+ tag_hold=tag_hold,
+ )
+
+ @staticmethod
+ def parse_tag_hold_path(path: str) -> Dict[str, str]:
+ """Parses a tag_hold path into its component segments."""
+ m = re.match(r"^tagValues/(?P<tag_value>.+?)/tagHolds/(?P<tag_hold>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_billing_account_path(
+ billing_account: str,
+ ) -> str:
+ """Returns a fully-qualified billing_account string."""
+ return "billingAccounts/{billing_account}".format(
+ billing_account=billing_account,
+ )
+
+ @staticmethod
+ def parse_common_billing_account_path(path: str) -> Dict[str, str]:
+ """Parse a billing_account path into its component segments."""
+ m = re.match(r"^billingAccounts/(?P<billing_account>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_folder_path(
+ folder: str,
+ ) -> str:
+ """Returns a fully-qualified folder string."""
+ return "folders/{folder}".format(
+ folder=folder,
+ )
+
+ @staticmethod
+ def parse_common_folder_path(path: str) -> Dict[str, str]:
+ """Parse a folder path into its component segments."""
+ m = re.match(r"^folders/(?P<folder>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_organization_path(
+ organization: str,
+ ) -> str:
+ """Returns a fully-qualified organization string."""
+ return "organizations/{organization}".format(
+ organization=organization,
+ )
+
+ @staticmethod
+ def parse_common_organization_path(path: str) -> Dict[str, str]:
+        """Parse an organization path into its component segments."""
+ m = re.match(r"^organizations/(?P<organization>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_project_path(
+ project: str,
+ ) -> str:
+ """Returns a fully-qualified project string."""
+ return "projects/{project}".format(
+ project=project,
+ )
+
+ @staticmethod
+ def parse_common_project_path(path: str) -> Dict[str, str]:
+ """Parse a project path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_location_path(
+ project: str,
+ location: str,
+ ) -> str:
+ """Returns a fully-qualified location string."""
+ return "projects/{project}/locations/{location}".format(
+ project=project,
+ location=location,
+ )
+
+ @staticmethod
+ def parse_common_location_path(path: str) -> Dict[str, str]:
+ """Parse a location path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[client_options_lib.ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ use_client_cert = os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false")
+ use_mtls_endpoint = os.getenv("GOOGLE_API_USE_MTLS_ENDPOINT", "auto")
+ if use_client_cert not in ("true", "false"):
+ raise ValueError(
+ "Environment variable `GOOGLE_API_USE_CLIENT_CERTIFICATE` must be either `true` or `false`"
+ )
+ if use_mtls_endpoint not in ("auto", "never", "always"):
+ raise MutualTLSChannelError(
+ "Environment variable `GOOGLE_API_USE_MTLS_ENDPOINT` must be `never`, `auto` or `always`"
+ )
+
+ # Figure out the client cert source to use.
+ client_cert_source = None
+ if use_client_cert == "true":
+ if client_options.client_cert_source:
+ client_cert_source = client_options.client_cert_source
+ elif mtls.has_default_client_cert_source():
+ client_cert_source = mtls.default_client_cert_source()
+
+ # Figure out which api endpoint to use.
+ if client_options.api_endpoint is not None:
+ api_endpoint = client_options.api_endpoint
+ elif use_mtls_endpoint == "always" or (
+ use_mtls_endpoint == "auto" and client_cert_source
+ ):
+ api_endpoint = cls.DEFAULT_MTLS_ENDPOINT
+ else:
+ api_endpoint = cls.DEFAULT_ENDPOINT
+
+ return api_endpoint, client_cert_source
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Optional[Union[str, TagHoldsTransport]] = None,
+ client_options: Optional[Union[client_options_lib.ClientOptions, dict]] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the tag holds client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, TagHoldsTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (Optional[Union[google.api_core.client_options.ClientOptions, dict]]): Custom options for the
+ client. It won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ if isinstance(client_options, dict):
+ client_options = client_options_lib.from_dict(client_options)
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ client_options = cast(client_options_lib.ClientOptions, client_options)
+
+ api_endpoint, client_cert_source_func = self.get_mtls_endpoint_and_cert_source(
+ client_options
+ )
+
+ api_key_value = getattr(client_options, "api_key", None)
+ if api_key_value and credentials:
+ raise ValueError(
+ "client_options.api_key and credentials are mutually exclusive"
+ )
+
+ # Save or instantiate the transport.
+ # Ordinarily, we provide the transport, but allowing a custom transport
+ # instance provides an extensibility point for unusual situations.
+ if isinstance(transport, TagHoldsTransport):
+ # transport is a TagHoldsTransport instance.
+ if credentials or client_options.credentials_file or api_key_value:
+ raise ValueError(
+ "When providing a transport instance, "
+ "provide its credentials directly."
+ )
+ if client_options.scopes:
+ raise ValueError(
+ "When providing a transport instance, provide its scopes "
+ "directly."
+ )
+ self._transport = transport
+ else:
+ import google.auth._default # type: ignore
+
+ if api_key_value and hasattr(
+ google.auth._default, "get_api_key_credentials"
+ ):
+ credentials = google.auth._default.get_api_key_credentials(
+ api_key_value
+ )
+
+ Transport = type(self).get_transport_class(transport)
+ self._transport = Transport(
+ credentials=credentials,
+ credentials_file=client_options.credentials_file,
+ host=api_endpoint,
+ scopes=client_options.scopes,
+ client_cert_source_for_mtls=client_cert_source_func,
+ quota_project_id=client_options.quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=True,
+ api_audience=client_options.api_audience,
+ )
+
+ def create_tag_hold(
+ self,
+ request: Optional[Union[tag_holds.CreateTagHoldRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ tag_hold: Optional[tag_holds.TagHold] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Creates a TagHold. Returns ALREADY_EXISTS if a TagHold with the
+ same resource and origin exists under the same TagValue.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_create_tag_hold():
+ # Create a client
+ client = resourcemanager_v3.TagHoldsClient()
+
+ # Initialize request argument(s)
+ tag_hold = resourcemanager_v3.TagHold()
+ tag_hold.holder = "holder_value"
+
+ request = resourcemanager_v3.CreateTagHoldRequest(
+ parent="parent_value",
+ tag_hold=tag_hold,
+ )
+
+ # Make the request
+ operation = client.create_tag_hold(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.CreateTagHoldRequest, dict]):
+ The request object. The request message to create a
+ TagHold.
+ parent (str):
+ Required. The resource name of the TagHold's parent
+ TagValue. Must be of the form:
+ ``tagValues/{tag-value-id}``.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ tag_hold (google.cloud.resourcemanager_v3.types.TagHold):
+ Required. The TagHold to be created.
+ This corresponds to the ``tag_hold`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.TagHold` A TagHold represents the use of a TagValue that is not captured by
+ TagBindings. If a TagValue has any TagHolds, deletion
+ will be blocked. This resource is intended to be
+ created in the same cloud location as the holder.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent, tag_hold])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_holds.CreateTagHoldRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_holds.CreateTagHoldRequest):
+ request = tag_holds.CreateTagHoldRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+ if tag_hold is not None:
+ request.tag_hold = tag_hold
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.create_tag_hold]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ tag_holds.TagHold,
+ metadata_type=tag_holds.CreateTagHoldMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def delete_tag_hold(
+ self,
+ request: Optional[Union[tag_holds.DeleteTagHoldRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Deletes a TagHold.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_delete_tag_hold():
+ # Create a client
+ client = resourcemanager_v3.TagHoldsClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.DeleteTagHoldRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.delete_tag_hold(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.DeleteTagHoldRequest, dict]):
+ The request object. The request message to delete a
+ TagHold.
+ name (str):
+ Required. The resource name of the TagHold to delete.
+ Must be of the form:
+ ``tagValues/{tag-value-id}/tagHolds/{tag-hold-id}``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.protobuf.empty_pb2.Empty` A generic empty message that you can re-use to avoid defining duplicated
+ empty messages in your APIs. A typical example is to
+ use it as the request or the response type of an API
+ method. For instance:
+
+ service Foo {
+ rpc Bar(google.protobuf.Empty) returns
+ (google.protobuf.Empty);
+
+ }
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_holds.DeleteTagHoldRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_holds.DeleteTagHoldRequest):
+ request = tag_holds.DeleteTagHoldRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.delete_tag_hold]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ empty_pb2.Empty,
+ metadata_type=tag_holds.DeleteTagHoldMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_tag_holds(
+ self,
+ request: Optional[Union[tag_holds.ListTagHoldsRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListTagHoldsPager:
+ r"""Lists TagHolds under a TagValue.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_list_tag_holds():
+ # Create a client
+ client = resourcemanager_v3.TagHoldsClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.ListTagHoldsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_tag_holds(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.ListTagHoldsRequest, dict]):
+ The request object. The request message for listing the
+ TagHolds under a TagValue.
+ parent (str):
+ Required. The resource name of the parent TagValue. Must
+ be of the form: ``tagValues/{tag-value-id}``.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.tag_holds.pagers.ListTagHoldsPager:
+ The ListTagHolds response.
+ Iterating over this object will yield
+ results and resolve additional pages
+ automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_holds.ListTagHoldsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_holds.ListTagHoldsRequest):
+ request = tag_holds.ListTagHoldsRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_tag_holds]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListTagHoldsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def __enter__(self) -> "TagHoldsClient":
+ return self
+
+ def __exit__(self, type, value, traceback):
+ """Releases underlying transport's resources.
+
+ .. warning::
+ ONLY use as a context manager if the transport is NOT shared
+ with other clients! Exiting the with block will CLOSE the transport
+ and may cause errors in other clients!
+ """
+ self.transport.close()
+
+ def get_operation(
+ self,
+ request: Optional[operations_pb2.GetOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Gets the latest state of a long-running operation.
+
+ Args:
+ request (:class:`~.operations_pb2.GetOperationRequest`):
+ The request object. Request message for
+ `GetOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.Operation:
+ An ``Operation`` object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.GetOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._transport.get_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("TagHoldsClient",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/pagers.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/pagers.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/pagers.py
@@ -0,0 +1,155 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import (
+ Any,
+ AsyncIterator,
+ Awaitable,
+ Callable,
+ Iterator,
+ Optional,
+ Sequence,
+ Tuple,
+)
+
+from google.cloud.resourcemanager_v3.types import tag_holds
+
+
+class ListTagHoldsPager:
+ """A pager for iterating through ``list_tag_holds`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.ListTagHoldsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``tag_holds`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListTagHolds`` requests and continue to iterate
+ through the ``tag_holds`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.ListTagHoldsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., tag_holds.ListTagHoldsResponse],
+ request: tag_holds.ListTagHoldsRequest,
+ response: tag_holds.ListTagHoldsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+        """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.ListTagHoldsRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.ListTagHoldsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = tag_holds.ListTagHoldsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[tag_holds.ListTagHoldsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[tag_holds.TagHold]:
+ for page in self.pages:
+ yield from page.tag_holds
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListTagHoldsAsyncPager:
+ """A pager for iterating through ``list_tag_holds`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.ListTagHoldsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``tag_holds`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListTagHolds`` requests and continue to iterate
+ through the ``tag_holds`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.ListTagHoldsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[tag_holds.ListTagHoldsResponse]],
+ request: tag_holds.ListTagHoldsRequest,
+ response: tag_holds.ListTagHoldsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.ListTagHoldsRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.ListTagHoldsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = tag_holds.ListTagHoldsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[tag_holds.ListTagHoldsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[tag_holds.TagHold]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.tag_holds:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
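The generated pagers above all follow the same token-driven loop: yield the initial response, then re-issue the request with the previous response's `next_page_token` until the token comes back empty. A minimal self-contained sketch of that loop, with stand-in request/response types (nothing here uses the real `google.cloud.resourcemanager_v3` classes):

```python
import asyncio
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class FakePage:
    # Stand-ins for the ListTagHoldsResponse fields the pager reads.
    tag_holds: List[str]
    next_page_token: str = ""

class MiniAsyncPager:
    """Token-driven pagination, as in ListTagHoldsAsyncPager.pages."""
    def __init__(self, method: Callable, request: dict, response: FakePage):
        self._method = method
        self._request = dict(request)
        self._response = response

    async def pages(self):
        # Yield the initial response, then keep fetching while a token remains.
        yield self._response
        while self._response.next_page_token:
            self._request["page_token"] = self._response.next_page_token
            self._response = await self._method(self._request)
            yield self._response

    async def items(self):
        async for page in self.pages():
            for hold in page.tag_holds:
                yield hold

async def fake_list(request: dict) -> FakePage:
    # One-entry fake backend keyed by page token; the second page ends pagination.
    return {"t1": FakePage(["hold-3"])}[request["page_token"]]

async def main() -> List[str]:
    first = FakePage(["hold-1", "hold-2"], next_page_token="t1")
    pager = MiniAsyncPager(fake_list, {}, first)
    return [h async for h in pager.items()]

print(asyncio.run(main()))  # -> ['hold-1', 'hold-2', 'hold-3']
```

The real pager differs only in that the request/response types are protobuf messages and `self._method` is the wrapped RPC.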
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/transports/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/transports/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/transports/__init__.py
@@ -0,0 +1,36 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+from typing import Dict, Type
+
+from .base import TagHoldsTransport
+from .grpc import TagHoldsGrpcTransport
+from .grpc_asyncio import TagHoldsGrpcAsyncIOTransport
+from .rest import TagHoldsRestInterceptor, TagHoldsRestTransport
+
+# Compile a registry of transports.
+_transport_registry = OrderedDict() # type: Dict[str, Type[TagHoldsTransport]]
+_transport_registry["grpc"] = TagHoldsGrpcTransport
+_transport_registry["grpc_asyncio"] = TagHoldsGrpcAsyncIOTransport
+_transport_registry["rest"] = TagHoldsRestTransport
+
+__all__ = (
+ "TagHoldsTransport",
+ "TagHoldsGrpcTransport",
+ "TagHoldsGrpcAsyncIOTransport",
+ "TagHoldsRestTransport",
+ "TagHoldsRestInterceptor",
+)
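The registry built in this `__init__.py` is how the client picks a transport implementation at runtime, with insertion order making gRPC the default. A small sketch of the lookup pattern; `get_transport_class` and the three classes here are stand-ins, not the generated API:

```python
from collections import OrderedDict
from typing import Dict, Optional, Type

# Stand-ins for the three generated transport classes.
class GrpcTransport:
    kind = "grpc"

class GrpcAsyncIOTransport:
    kind = "grpc_asyncio"

class RestTransport:
    kind = "rest"

# Insertion order matters: the first entry is the default.
_transport_registry: Dict[str, Type] = OrderedDict()
_transport_registry["grpc"] = GrpcTransport
_transport_registry["grpc_asyncio"] = GrpcAsyncIOTransport
_transport_registry["rest"] = RestTransport

def get_transport_class(label: Optional[str] = None) -> Type:
    if label:
        return _transport_registry[label]
    # No label given: fall back to the first registered transport.
    return next(iter(_transport_registry.values()))

print(get_transport_class().kind)        # grpc
print(get_transport_class("rest").kind)  # rest
```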
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/transports/base.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/transports/base.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/transports/base.py
@@ -0,0 +1,203 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import abc
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Union
+
+import google.api_core
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1, operations_v1
+from google.api_core import retry as retries
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+from google.cloud.resourcemanager_v3.types import tag_holds
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+class TagHoldsTransport(abc.ABC):
+ """Abstract transport class for TagHolds."""
+
+ AUTH_SCOPES = (
+ "https://www.googleapis.com/auth/cloud-platform",
+ "https://www.googleapis.com/auth/cloud-platform.read-only",
+ )
+
+ DEFAULT_HOST: str = "cloudresourcemanager.googleapis.com"
+
+ def __init__(
+ self,
+ *,
+ host: str = DEFAULT_HOST,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ **kwargs,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+ scopes (Optional[Sequence[str]]): A list of scopes.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ """
+
+ scopes_kwargs = {"scopes": scopes, "default_scopes": self.AUTH_SCOPES}
+
+ # Save the scopes.
+ self._scopes = scopes
+
+ # If no credentials are provided, then determine the appropriate
+ # defaults.
+ if credentials and credentials_file:
+ raise core_exceptions.DuplicateCredentialArgs(
+ "'credentials_file' and 'credentials' are mutually exclusive"
+ )
+
+ if credentials_file is not None:
+ credentials, _ = google.auth.load_credentials_from_file(
+ credentials_file, **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ elif credentials is None:
+ credentials, _ = google.auth.default(
+ **scopes_kwargs, quota_project_id=quota_project_id
+ )
+        # Don't apply audience if a credentials file was passed by the user.
Placeholder
+ if hasattr(credentials, "with_gdch_audience"):
+ credentials = credentials.with_gdch_audience(
+ api_audience if api_audience else host
+ )
+
+ # If the credentials are service account credentials, then always try to use self signed JWT.
+ if (
+ always_use_jwt_access
+ and isinstance(credentials, service_account.Credentials)
+ and hasattr(service_account.Credentials, "with_always_use_jwt_access")
+ ):
+ credentials = credentials.with_always_use_jwt_access(True)
+
+ # Save the credentials.
+ self._credentials = credentials
+
+ # Save the hostname. Default to port 443 (HTTPS) if none is specified.
+ if ":" not in host:
+ host += ":443"
+ self._host = host
+
+ def _prep_wrapped_messages(self, client_info):
+ # Precompute the wrapped methods.
+ self._wrapped_methods = {
+ self.create_tag_hold: gapic_v1.method.wrap_method(
+ self.create_tag_hold,
+ default_timeout=None,
+ client_info=client_info,
+ ),
+ self.delete_tag_hold: gapic_v1.method.wrap_method(
+ self.delete_tag_hold,
+ default_timeout=None,
+ client_info=client_info,
+ ),
+ self.list_tag_holds: gapic_v1.method.wrap_method(
+ self.list_tag_holds,
+ default_timeout=None,
+ client_info=client_info,
+ ),
+ }
+
+ def close(self):
+ """Closes resources associated with the transport.
+
+ .. warning::
+ Only call this method if the transport is NOT shared
+ with other clients - this may cause errors in other clients!
+ """
+ raise NotImplementedError()
+
+ @property
+ def operations_client(self):
+ """Return the client designed to process long-running operations."""
+ raise NotImplementedError()
+
+ @property
+ def create_tag_hold(
+ self,
+ ) -> Callable[
+ [tag_holds.CreateTagHoldRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def delete_tag_hold(
+ self,
+ ) -> Callable[
+ [tag_holds.DeleteTagHoldRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_tag_holds(
+ self,
+ ) -> Callable[
+ [tag_holds.ListTagHoldsRequest],
+ Union[
+ tag_holds.ListTagHoldsResponse, Awaitable[tag_holds.ListTagHoldsResponse]
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[
+ [operations_pb2.GetOperationRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def kind(self) -> str:
+ raise NotImplementedError()
+
+
+__all__ = ("TagHoldsTransport",)
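The credential handling in `TagHoldsTransport.__init__` boils down to a precedence rule: passing both `credentials` and `credentials_file` is an error, an explicit file wins over the environment, and application default credentials are the last resort. A hedged sketch of just that rule; the exception class and return strings are stand-ins for the real `google.auth` calls:

```python
class DuplicateCredentialArgs(Exception):
    """Stand-in for google.api_core.exceptions.DuplicateCredentialArgs."""

def resolve_credentials(credentials=None, credentials_file=None):
    # Explicit credentials and a credentials file are mutually exclusive.
    if credentials and credentials_file:
        raise DuplicateCredentialArgs(
            "'credentials_file' and 'credentials' are mutually exclusive")
    if credentials_file is not None:
        # Real code: google.auth.load_credentials_from_file(credentials_file, ...)
        return f"loaded-from:{credentials_file}"
    if credentials is None:
        # Real code: google.auth.default(...)
        return "application-default"
    return credentials

print(resolve_credentials(credentials="explicit"))       # explicit
print(resolve_credentials(credentials_file="key.json"))  # loaded-from:key.json
print(resolve_credentials())                             # application-default
```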
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/transports/grpc.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/transports/grpc.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/transports/grpc.py
@@ -0,0 +1,356 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers, operations_v1
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+import grpc # type: ignore
+
+from google.cloud.resourcemanager_v3.types import tag_holds
+
+from .base import DEFAULT_CLIENT_INFO, TagHoldsTransport
+
+
+class TagHoldsGrpcTransport(TagHoldsTransport):
+ """gRPC backend transport for TagHolds.
+
+ Allow users to create and manage TagHolds for TagValues.
+ TagHolds represent the use of a Tag Value that is not captured
+ by TagBindings but should still block TagValue deletion (such as
+ a reference in a policy condition). This service provides
+ isolated failure domains by cloud location so that TagHolds can
+ be managed in the same location as their usage.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _stubs: Dict[str, Callable]
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[grpc.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): A list of scopes. This argument is
+                ignored if ``channel`` is provided.
+ channel (Optional[grpc.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+ self._operations_client: Optional[operations_v1.OperationsClient] = None
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> grpc.Channel:
+ """Create and return a gRPC channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ grpc.Channel: A gRPC channel object.
+
+ Raises:
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+
+ return grpc_helpers.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ @property
+ def grpc_channel(self) -> grpc.Channel:
+ """Return the channel designed to connect to this service."""
+ return self._grpc_channel
+
+ @property
+ def operations_client(self) -> operations_v1.OperationsClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Quick check: Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ self._operations_client = operations_v1.OperationsClient(self.grpc_channel)
+
+ # Return the client from cache.
+ return self._operations_client
+
+ @property
+ def create_tag_hold(
+ self,
+ ) -> Callable[[tag_holds.CreateTagHoldRequest], operations_pb2.Operation]:
+ r"""Return a callable for the create tag hold method over gRPC.
+
+ Creates a TagHold. Returns ALREADY_EXISTS if a TagHold with the
+ same resource and origin exists under the same TagValue.
+
+ Returns:
+ Callable[[~.CreateTagHoldRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_tag_hold" not in self._stubs:
+ self._stubs["create_tag_hold"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagHolds/CreateTagHold",
+ request_serializer=tag_holds.CreateTagHoldRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["create_tag_hold"]
+
+ @property
+ def delete_tag_hold(
+ self,
+ ) -> Callable[[tag_holds.DeleteTagHoldRequest], operations_pb2.Operation]:
+ r"""Return a callable for the delete tag hold method over gRPC.
+
+ Deletes a TagHold.
+
+ Returns:
+ Callable[[~.DeleteTagHoldRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_tag_hold" not in self._stubs:
+ self._stubs["delete_tag_hold"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagHolds/DeleteTagHold",
+ request_serializer=tag_holds.DeleteTagHoldRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["delete_tag_hold"]
+
+ @property
+ def list_tag_holds(
+ self,
+ ) -> Callable[[tag_holds.ListTagHoldsRequest], tag_holds.ListTagHoldsResponse]:
+ r"""Return a callable for the list tag holds method over gRPC.
+
+ Lists TagHolds under a TagValue.
+
+ Returns:
+ Callable[[~.ListTagHoldsRequest],
+ ~.ListTagHoldsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_tag_holds" not in self._stubs:
+ self._stubs["list_tag_holds"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagHolds/ListTagHolds",
+ request_serializer=tag_holds.ListTagHoldsRequest.serialize,
+ response_deserializer=tag_holds.ListTagHoldsResponse.deserialize,
+ )
+ return self._stubs["list_tag_holds"]
+
+ def close(self):
+ self.grpc_channel.close()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]:
+ r"""Return a callable for the get_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_operation" not in self._stubs:
+ self._stubs["get_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/GetOperation",
+ request_serializer=operations_pb2.GetOperationRequest.SerializeToString,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["get_operation"]
+
+ @property
+ def kind(self) -> str:
+ return "grpc"
+
+
+__all__ = ("TagHoldsGrpcTransport",)
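Each RPC property on the gRPC transport above follows the same lazy-caching pattern: build the channel stub on first access, then serve it from the `_stubs` dict. A minimal sketch with a fake channel that counts creations so the caching is observable (the real code passes protobuf (de)serializers to `grpc_channel.unary_unary`):

```python
class FakeChannel:
    """Stand-in for a grpc.Channel; counts how many stubs were created."""
    def __init__(self):
        self.created = 0

    def unary_unary(self, path):
        self.created += 1
        return lambda request: f"RPC {path}"

class MiniTransport:
    def __init__(self, channel: FakeChannel):
        self.grpc_channel = channel
        self._stubs = {}

    @property
    def list_tag_holds(self):
        # Create the stub once, on first access, then reuse the cached one.
        if "list_tag_holds" not in self._stubs:
            self._stubs["list_tag_holds"] = self.grpc_channel.unary_unary(
                "/google.cloud.resourcemanager.v3.TagHolds/ListTagHolds")
        return self._stubs["list_tag_holds"]

channel = FakeChannel()
transport = MiniTransport(channel)
transport.list_tag_holds({"parent": "tagValues/123"})
transport.list_tag_holds({"parent": "tagValues/123"})
print(channel.created)  # 1
```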
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/transports/grpc_asyncio.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/transports/grpc_asyncio.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/transports/grpc_asyncio.py
@@ -0,0 +1,363 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers_async, operations_v1
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+import grpc # type: ignore
+from grpc.experimental import aio # type: ignore
+
+from google.cloud.resourcemanager_v3.types import tag_holds
+
+from .base import DEFAULT_CLIENT_INFO, TagHoldsTransport
+from .grpc import TagHoldsGrpcTransport
+
+
+class TagHoldsGrpcAsyncIOTransport(TagHoldsTransport):
+ """gRPC AsyncIO backend transport for TagHolds.
+
+ Allow users to create and manage TagHolds for TagValues.
+ TagHolds represent the use of a Tag Value that is not captured
+ by TagBindings but should still block TagValue deletion (such as
+ a reference in a policy condition). This service provides
+ isolated failure domains by cloud location so that TagHolds can
+ be managed in the same location as their usage.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _grpc_channel: aio.Channel
+ _stubs: Dict[str, Callable] = {}
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> aio.Channel:
+ """Create and return a gRPC AsyncIO channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ aio.Channel: A gRPC AsyncIO channel object.
+ """
+
+ return grpc_helpers_async.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[aio.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ channel (Optional[aio.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+            google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+ self._operations_client: Optional[operations_v1.OperationsAsyncClient] = None
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def grpc_channel(self) -> aio.Channel:
+ """Create the channel designed to connect to this service.
+
+ This property caches on the instance; repeated calls return
+ the same channel.
+ """
+ # Return the channel from cache.
+ return self._grpc_channel
+
+ @property
+ def operations_client(self) -> operations_v1.OperationsAsyncClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Quick check: Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ self._operations_client = operations_v1.OperationsAsyncClient(
+ self.grpc_channel
+ )
+
+ # Return the client from cache.
+ return self._operations_client
+
+ @property
+ def create_tag_hold(
+ self,
+ ) -> Callable[
+ [tag_holds.CreateTagHoldRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the create tag hold method over gRPC.
+
+ Creates a TagHold. Returns ALREADY_EXISTS if a TagHold with the
+ same resource and origin exists under the same TagValue.
+
+ Returns:
+ Callable[[~.CreateTagHoldRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_tag_hold" not in self._stubs:
+ self._stubs["create_tag_hold"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagHolds/CreateTagHold",
+ request_serializer=tag_holds.CreateTagHoldRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["create_tag_hold"]
+
+ @property
+ def delete_tag_hold(
+ self,
+ ) -> Callable[
+ [tag_holds.DeleteTagHoldRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the delete tag hold method over gRPC.
+
+ Deletes a TagHold.
+
+ Returns:
+ Callable[[~.DeleteTagHoldRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_tag_hold" not in self._stubs:
+ self._stubs["delete_tag_hold"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagHolds/DeleteTagHold",
+ request_serializer=tag_holds.DeleteTagHoldRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["delete_tag_hold"]
+
+ @property
+ def list_tag_holds(
+ self,
+ ) -> Callable[
+ [tag_holds.ListTagHoldsRequest], Awaitable[tag_holds.ListTagHoldsResponse]
+ ]:
+ r"""Return a callable for the list tag holds method over gRPC.
+
+ Lists TagHolds under a TagValue.
+
+ Returns:
+ Callable[[~.ListTagHoldsRequest],
+ Awaitable[~.ListTagHoldsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_tag_holds" not in self._stubs:
+ self._stubs["list_tag_holds"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagHolds/ListTagHolds",
+ request_serializer=tag_holds.ListTagHoldsRequest.serialize,
+ response_deserializer=tag_holds.ListTagHoldsResponse.deserialize,
+ )
+ return self._stubs["list_tag_holds"]
+
+ def close(self):
+ return self.grpc_channel.close()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]:
+ r"""Return a callable for the get_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_operation" not in self._stubs:
+ self._stubs["get_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/GetOperation",
+ request_serializer=operations_pb2.GetOperationRequest.SerializeToString,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["get_operation"]
+
+
+__all__ = ("TagHoldsGrpcAsyncIOTransport",)
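The transport above lazily creates one gRPC stub per RPC method and memoizes it in `self._stubs`. A minimal, framework-free sketch of that caching pattern follows; `FakeChannel` and `Transport` here are stand-ins invented for illustration, not the real `grpc.aio.Channel` or the generated transport class.

```python
class FakeChannel:
    """Stand-in for a gRPC channel; records each stub it creates."""

    def __init__(self):
        self.created = []

    def unary_unary(self, path):
        self.created.append(path)
        return lambda request: f"called {path}"


class Transport:
    def __init__(self, channel):
        self._channel = channel
        self._stubs = {}

    @property
    def create_tag_hold(self):
        # Create the stub on first access only; later accesses hit the cache.
        if "create_tag_hold" not in self._stubs:
            self._stubs["create_tag_hold"] = self._channel.unary_unary(
                "/google.cloud.resourcemanager.v3.TagHolds/CreateTagHold"
            )
        return self._stubs["create_tag_hold"]


transport = Transport(FakeChannel())
stub_a = transport.create_tag_hold
stub_b = transport.create_tag_hold
assert stub_a is stub_b  # cached: exactly one stub per method
assert len(transport._channel.created) == 1
```

The same memoization shape is used for `operations_client`, which is why repeated property accesses never open duplicate channels or clients.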
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/transports/rest.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/transports/rest.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_holds/transports/rest.py
@@ -0,0 +1,705 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import dataclasses
+import json # type: ignore
+import re
+from typing import Any, Callable, Dict, List, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import (
+ gapic_v1,
+ operations_v1,
+ path_template,
+ rest_helpers,
+ rest_streaming,
+)
+from google.api_core import exceptions as core_exceptions
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.auth.transport.requests import AuthorizedSession # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import json_format
+import grpc # type: ignore
+from requests import __version__ as requests_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+
+from google.longrunning import operations_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.types import tag_holds
+
+from .base import DEFAULT_CLIENT_INFO as BASE_DEFAULT_CLIENT_INFO
+from .base import TagHoldsTransport
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=BASE_DEFAULT_CLIENT_INFO.gapic_version,
+ grpc_version=None,
+ rest_version=requests_version,
+)
+
+
+class TagHoldsRestInterceptor:
+ """Interceptor for TagHolds.
+
+ Interceptors are used to manipulate requests, request metadata, and responses
+ in arbitrary ways.
+ Example use cases include:
+ * Logging
+ * Verifying requests according to service or custom semantics
+ * Stripping extraneous information from responses
+
+ These use cases and more can be enabled by injecting an
+ instance of a custom subclass when constructing the TagHoldsRestTransport.
+
+ .. code-block:: python
+ class MyCustomTagHoldsInterceptor(TagHoldsRestInterceptor):
+ def pre_create_tag_hold(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_create_tag_hold(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_delete_tag_hold(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_delete_tag_hold(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_tag_holds(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_tag_holds(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ transport = TagHoldsRestTransport(interceptor=MyCustomTagHoldsInterceptor())
+ client = TagHoldsClient(transport=transport)
+
+
+ """
+
+ def pre_create_tag_hold(
+ self,
+ request: tag_holds.CreateTagHoldRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[tag_holds.CreateTagHoldRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for create_tag_hold
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagHolds server.
+ """
+ return request, metadata
+
+ def post_create_tag_hold(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for create_tag_hold
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagHolds server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_delete_tag_hold(
+ self,
+ request: tag_holds.DeleteTagHoldRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[tag_holds.DeleteTagHoldRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for delete_tag_hold
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagHolds server.
+ """
+ return request, metadata
+
+ def post_delete_tag_hold(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for delete_tag_hold
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagHolds server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_tag_holds(
+ self,
+ request: tag_holds.ListTagHoldsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[tag_holds.ListTagHoldsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_tag_holds
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagHolds server.
+ """
+ return request, metadata
+
+ def post_list_tag_holds(
+ self, response: tag_holds.ListTagHoldsResponse
+ ) -> tag_holds.ListTagHoldsResponse:
+ """Post-rpc interceptor for list_tag_holds
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagHolds server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_operation(
+ self,
+ request: operations_pb2.GetOperationRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[operations_pb2.GetOperationRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_operation
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagHolds server.
+ """
+ return request, metadata
+
+ def post_get_operation(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for get_operation
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagHolds server but before
+ it is returned to user code.
+ """
+ return response
+
+
+@dataclasses.dataclass
+class TagHoldsRestStub:
+ _session: AuthorizedSession
+ _host: str
+ _interceptor: TagHoldsRestInterceptor
+
+
+class TagHoldsRestTransport(TagHoldsTransport):
+ """REST backend transport for TagHolds.
+
+ Allow users to create and manage TagHolds for TagValues.
+ TagHolds represent the use of a Tag Value that is not captured
+ by TagBindings but should still block TagValue deletion (such as
+ a reference in a policy condition). This service provides
+ isolated failure domains by cloud location so that TagHolds can
+ be managed in the same location as their usage.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends JSON representations of protocol buffers over HTTP/1.1
+
+ """
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ url_scheme: str = "https",
+ interceptor: Optional[TagHoldsRestInterceptor] = None,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+ scopes (Optional(Sequence[str])): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Callable[[], Tuple[bytes, bytes]]): Client
+ certificate to configure mutual TLS HTTP channel. It is ignored
+ if ``channel`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you are developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ url_scheme: the protocol scheme for the API endpoint. Normally
+ "https", but for testing or local servers,
+ "http" can be specified.
+ """
+ # Run the base constructor
+ # TODO(yon-mg): resolve other ctor params i.e. scopes, quota, etc.
+ # TODO: When custom host (api_endpoint) is set, `scopes` must *also* be set on the
+ # credentials object
+ maybe_url_match = re.match("^(?P<scheme>http(?:s)?://)?(?P<host>.*)$", host)
+ if maybe_url_match is None:
+ raise ValueError(
+ f"Unexpected hostname structure: {host}"
+ ) # pragma: NO COVER
+
+ url_match_items = maybe_url_match.groupdict()
+
+ host = f"{url_scheme}://{host}" if not url_match_items["scheme"] else host
+
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+ self._session = AuthorizedSession(
+ self._credentials, default_host=self.DEFAULT_HOST
+ )
+ self._operations_client: Optional[operations_v1.AbstractOperationsClient] = None
+ if client_cert_source_for_mtls:
+ self._session.configure_mtls_channel(client_cert_source_for_mtls)
+ self._interceptor = interceptor or TagHoldsRestInterceptor()
+ self._prep_wrapped_messages(client_info)
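The constructor above normalizes `host` with a regex so that a bare hostname gets the configured `url_scheme` prepended while an explicit `http://` or `https://` prefix is preserved. A standalone sketch of that logic (the `normalize_host` helper name is an assumption for illustration):

```python
import re


def normalize_host(host: str, url_scheme: str = "https") -> str:
    """Prepend url_scheme only when the caller did not supply a scheme."""
    maybe_url_match = re.match("^(?P<scheme>http(?:s)?://)?(?P<host>.*)$", host)
    if maybe_url_match is None:
        raise ValueError(f"Unexpected hostname structure: {host}")

    url_match_items = maybe_url_match.groupdict()
    return f"{url_scheme}://{host}" if not url_match_items["scheme"] else host


# Bare hostname: scheme is added.
assert (
    normalize_host("cloudresourcemanager.googleapis.com")
    == "https://cloudresourcemanager.googleapis.com"
)
# Explicit scheme (e.g. a local test server): left untouched.
assert normalize_host("http://localhost:8080") == "http://localhost:8080"
```

Because the optional-scheme group makes the pattern match any input, the `ValueError` branch is effectively unreachable, which is why the generated code marks it `# pragma: NO COVER`.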
+
+ @property
+ def operations_client(self) -> operations_v1.AbstractOperationsClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ http_options: Dict[str, List[Dict[str, str]]] = {
+ "google.longrunning.Operations.GetOperation": [
+ {
+ "method": "get",
+ "uri": "/v3/{name=operations/**}",
+ },
+ ],
+ }
+
+ rest_transport = operations_v1.OperationsRestTransport(
+ host=self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ scopes=self._scopes,
+ http_options=http_options,
+ path_prefix="v3",
+ )
+
+ self._operations_client = operations_v1.AbstractOperationsClient(
+ transport=rest_transport
+ )
+
+ # Return the client from cache.
+ return self._operations_client
+
+ class _CreateTagHold(TagHoldsRestStub):
+ def __hash__(self):
+ return hash("CreateTagHold")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
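`_get_unset_required_fields` backfills defaults for required query parameters the caller left out of the request. For `TagHolds` the defaults dict is empty, so the method is a no-op here; the sketch below uses a hypothetical `pageSize` default purely to show the merge behavior.

```python
# Hypothetical defaults; the generated TagHolds stubs use an empty dict.
REQUIRED_FIELDS_DEFAULT_VALUES = {"pageSize": 0}


def get_unset_required_fields(defaults, message_dict):
    """Return defaults only for required fields absent from the request."""
    return {k: v for k, v in defaults.items() if k not in message_dict}


# Missing field: the default is injected into the query params.
assert get_unset_required_fields(REQUIRED_FIELDS_DEFAULT_VALUES, {}) == {"pageSize": 0}
# Caller-supplied field: the default is not allowed to overwrite it.
assert get_unset_required_fields(REQUIRED_FIELDS_DEFAULT_VALUES, {"pageSize": 5}) == {}
```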
+
+ def __call__(
+ self,
+ request: tag_holds.CreateTagHoldRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the create tag hold method over HTTP.
+
+ Args:
+ request (~.tag_holds.CreateTagHoldRequest):
+ The request object. The request message to create a
+ TagHold.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{parent=tagValues/*}/tagHolds",
+ "body": "tag_hold",
+ },
+ ]
+ request, metadata = self._interceptor.pre_create_tag_hold(request, metadata)
+ pb_request = tag_holds.CreateTagHoldRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_create_tag_hold(resp)
+ return resp
+
+ class _DeleteTagHold(TagHoldsRestStub):
+ def __hash__(self):
+ return hash("DeleteTagHold")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_holds.DeleteTagHoldRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the delete tag hold method over HTTP.
+
+ Args:
+ request (~.tag_holds.DeleteTagHoldRequest):
+ The request object. The request message to delete a
+ TagHold.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "delete",
+ "uri": "/v3/{name=tagValues/*/tagHolds/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_delete_tag_hold(request, metadata)
+ pb_request = tag_holds.DeleteTagHoldRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_delete_tag_hold(resp)
+ return resp
+
+ class _ListTagHolds(TagHoldsRestStub):
+ def __hash__(self):
+ return hash("ListTagHolds")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_holds.ListTagHoldsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_holds.ListTagHoldsResponse:
+ r"""Call the list tag holds method over HTTP.
+
+ Args:
+ request (~.tag_holds.ListTagHoldsRequest):
+ The request object. The request message for listing the
+ TagHolds under a TagValue.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.tag_holds.ListTagHoldsResponse:
+ The ListTagHolds response.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/{parent=tagValues/*}/tagHolds",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_tag_holds(request, metadata)
+ pb_request = tag_holds.ListTagHoldsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = tag_holds.ListTagHoldsResponse()
+ pb_resp = tag_holds.ListTagHoldsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_tag_holds(resp)
+ return resp
+
+ @property
+ def create_tag_hold(
+ self,
+ ) -> Callable[[tag_holds.CreateTagHoldRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._CreateTagHold(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def delete_tag_hold(
+ self,
+ ) -> Callable[[tag_holds.DeleteTagHoldRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._DeleteTagHold(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_tag_holds(
+ self,
+ ) -> Callable[[tag_holds.ListTagHoldsRequest], tag_holds.ListTagHoldsResponse]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListTagHolds(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_operation(self):
+ return self._GetOperation(self._session, self._host, self._interceptor) # type: ignore
+
+ class _GetOperation(TagHoldsRestStub):
+ def __call__(
+ self,
+ request: operations_pb2.GetOperationRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+
+ r"""Call the get operation method over HTTP.
+
+ Args:
+ request (operations_pb2.GetOperationRequest):
+ The request object for GetOperation method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ operations_pb2.Operation: Response from GetOperation method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/{name=operations/**}",
+ },
+ ]
+
+ request, metadata = self._interceptor.pre_get_operation(request, metadata)
+ request_kwargs = json_format.MessageToDict(request)
+ transcoded_request = path_template.transcode(http_options, **request_kwargs)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(json.dumps(transcoded_request["query_params"]))
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ resp = operations_pb2.Operation()
+ resp = json_format.Parse(response.content.decode("utf-8"), resp)
+ resp = self._interceptor.post_get_operation(resp)
+ return resp
+
+ @property
+ def kind(self) -> str:
+ return "rest"
+
+ def close(self):
+ self._session.close()
+
+
+__all__ = ("TagHoldsRestTransport",)
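Every REST stub above wraps its HTTP call in the same interceptor flow: the `pre_*` hook may rewrite the request and metadata before the wire call, and the `post_*` hook may rewrite the response before it reaches user code. A minimal pure-Python sketch of that flow, with invented `Interceptor`/`RestStub` stand-ins in place of the real HTTP machinery:

```python
class Interceptor:
    def pre_call(self, request, metadata):
        # Runs before the request is sent; may rewrite request/metadata.
        return request, metadata + [("x-log", "pre")]

    def post_call(self, response):
        # Runs after the response arrives but before user code sees it.
        return response.upper()


class RestStub:
    def __init__(self, interceptor):
        self._interceptor = interceptor

    def __call__(self, request, metadata=()):
        request, metadata = self._interceptor.pre_call(request, list(metadata))
        # The real transport would issue an HTTP request here.
        response = f"echo:{request}"
        return self._interceptor.post_call(response)


stub = RestStub(Interceptor())
assert stub("hello") == "ECHO:HELLO"
```

Subclassing `TagHoldsRestInterceptor` and passing the instance via `TagHoldsRestTransport(interceptor=...)`, as the class docstring shows, hooks user code into exactly these two points for each RPC.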
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/__init__.py
@@ -0,0 +1,22 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .async_client import TagKeysAsyncClient
+from .client import TagKeysClient
+
+__all__ = (
+ "TagKeysClient",
+ "TagKeysAsyncClient",
+)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/async_client.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/async_client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/async_client.py
@@ -0,0 +1,1460 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import functools
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+)
+
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.api_core.client_options import ClientOptions
+from google.auth import credentials as ga_credentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.api_core import operation # type: ignore
+from google.api_core import operation_async # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.services.tag_keys import pagers
+from google.cloud.resourcemanager_v3.types import tag_keys
+
+from .client import TagKeysClient
+from .transports.base import DEFAULT_CLIENT_INFO, TagKeysTransport
+from .transports.grpc_asyncio import TagKeysGrpcAsyncIOTransport
+
+
+class TagKeysAsyncClient:
+ """Allow users to create and manage tag keys."""
+
+ _client: TagKeysClient
+
+ DEFAULT_ENDPOINT = TagKeysClient.DEFAULT_ENDPOINT
+ DEFAULT_MTLS_ENDPOINT = TagKeysClient.DEFAULT_MTLS_ENDPOINT
+
+ tag_key_path = staticmethod(TagKeysClient.tag_key_path)
+ parse_tag_key_path = staticmethod(TagKeysClient.parse_tag_key_path)
+ common_billing_account_path = staticmethod(
+ TagKeysClient.common_billing_account_path
+ )
+ parse_common_billing_account_path = staticmethod(
+ TagKeysClient.parse_common_billing_account_path
+ )
+ common_folder_path = staticmethod(TagKeysClient.common_folder_path)
+ parse_common_folder_path = staticmethod(TagKeysClient.parse_common_folder_path)
+ common_organization_path = staticmethod(TagKeysClient.common_organization_path)
+ parse_common_organization_path = staticmethod(
+ TagKeysClient.parse_common_organization_path
+ )
+ common_project_path = staticmethod(TagKeysClient.common_project_path)
+ parse_common_project_path = staticmethod(TagKeysClient.parse_common_project_path)
+ common_location_path = staticmethod(TagKeysClient.common_location_path)
+ parse_common_location_path = staticmethod(TagKeysClient.parse_common_location_path)
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ TagKeysAsyncClient: The constructed client.
+ """
+ return TagKeysClient.from_service_account_info.__func__(TagKeysAsyncClient, info, *args, **kwargs) # type: ignore
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ TagKeysAsyncClient: The constructed client.
+ """
+ return TagKeysClient.from_service_account_file.__func__(TagKeysAsyncClient, filename, *args, **kwargs) # type: ignore
+
+ from_service_account_json = from_service_account_file
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+        (2) if `GOOGLE_API_USE_MTLS_ENDPOINT` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ return TagKeysClient.get_mtls_endpoint_and_cert_source(client_options) # type: ignore
+
+ @property
+ def transport(self) -> TagKeysTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ TagKeysTransport: The transport used by the client instance.
+ """
+ return self._client.transport
+
+ get_transport_class = functools.partial(
+ type(TagKeysClient).get_transport_class, type(TagKeysClient)
+ )
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Union[str, TagKeysTransport] = "grpc_asyncio",
+ client_options: Optional[ClientOptions] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the tag keys client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, ~.TagKeysTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (ClientOptions): Custom options for the client. It
+ won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+
+ Raises:
+            google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ self._client = TagKeysClient(
+ credentials=credentials,
+ transport=transport,
+ client_options=client_options,
+ client_info=client_info,
+ )
+
+ async def list_tag_keys(
+ self,
+ request: Optional[Union[tag_keys.ListTagKeysRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListTagKeysAsyncPager:
+ r"""Lists all TagKeys for a parent resource.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_list_tag_keys():
+ # Create a client
+ client = resourcemanager_v3.TagKeysAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.ListTagKeysRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+                page_result = await client.list_tag_keys(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.ListTagKeysRequest, dict]]):
+ The request object. The request message for listing all
+ TagKeys under a parent resource.
+ parent (:class:`str`):
+ Required. The resource name of the TagKey's parent. Must
+ be of the form ``organizations/{org_id}`` or
+ ``projects/{project_id}`` or
+ ``projects/{project_number}``
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.tag_keys.pagers.ListTagKeysAsyncPager:
+ The ListTagKeys response message.
+ Iterating over this object will yield
+ results and resolve additional pages
+ automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_keys.ListTagKeysRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_tag_keys,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListTagKeysAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_tag_key(
+ self,
+ request: Optional[Union[tag_keys.GetTagKeyRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_keys.TagKey:
+ r"""Retrieves a TagKey. This method will return
+ ``PERMISSION_DENIED`` if the key does not exist or the user does
+ not have permission to view it.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_get_tag_key():
+ # Create a client
+ client = resourcemanager_v3.TagKeysAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.GetTagKeyRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_tag_key(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.GetTagKeyRequest, dict]]):
+ The request object. The request message for getting a
+ TagKey.
+ name (:class:`str`):
+ Required. A resource name in the format
+ ``tagKeys/{id}``, such as ``tagKeys/123``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.types.TagKey:
+ A TagKey, used to group a set of
+ TagValues.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_keys.GetTagKeyRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_tag_key,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_namespaced_tag_key(
+ self,
+ request: Optional[Union[tag_keys.GetNamespacedTagKeyRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_keys.TagKey:
+ r"""Retrieves a TagKey by its namespaced name. This method will
+ return ``PERMISSION_DENIED`` if the key does not exist or the
+ user does not have permission to view it.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_get_namespaced_tag_key():
+ # Create a client
+ client = resourcemanager_v3.TagKeysAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.GetNamespacedTagKeyRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_namespaced_tag_key(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.GetNamespacedTagKeyRequest, dict]]):
+ The request object. The request message for getting a
+ TagKey by its namespaced name.
+ name (:class:`str`):
+ Required. A namespaced tag key name in the format
+ ``{parentId}/{tagKeyShort}``, such as ``42/foo`` for a
+ key with short name "foo" under the organization with ID
+ 42 or ``r2-d2/bar`` for a key with short name "bar"
+ under the project ``r2-d2``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.types.TagKey:
+ A TagKey, used to group a set of
+ TagValues.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_keys.GetNamespacedTagKeyRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_namespaced_tag_key,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def create_tag_key(
+ self,
+ request: Optional[Union[tag_keys.CreateTagKeyRequest, dict]] = None,
+ *,
+ tag_key: Optional[tag_keys.TagKey] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Creates a new TagKey. If another request with the
+ same parameters is sent while the original request is in
+ process, the second request will receive an error. A
+ maximum of 1000 TagKeys can exist under a parent at any
+ given time.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_create_tag_key():
+ # Create a client
+ client = resourcemanager_v3.TagKeysAsyncClient()
+
+ # Initialize request argument(s)
+ tag_key = resourcemanager_v3.TagKey()
+ tag_key.short_name = "short_name_value"
+
+ request = resourcemanager_v3.CreateTagKeyRequest(
+ tag_key=tag_key,
+ )
+
+ # Make the request
+ operation = client.create_tag_key(request=request)
+
+ print("Waiting for operation to complete...")
+
+                response = await (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.CreateTagKeyRequest, dict]]):
+ The request object. The request message for creating a
+ TagKey.
+ tag_key (:class:`google.cloud.resourcemanager_v3.types.TagKey`):
+ Required. The TagKey to be created. Only fields
+ ``short_name``, ``description``, and ``parent`` are
+ considered during the creation request.
+
+ This corresponds to the ``tag_key`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.resourcemanager_v3.types.TagKey` A
+ TagKey, used to group a set of TagValues.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([tag_key])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_keys.CreateTagKeyRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if tag_key is not None:
+ request.tag_key = tag_key
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.create_tag_key,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ tag_keys.TagKey,
+ metadata_type=tag_keys.CreateTagKeyMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def update_tag_key(
+ self,
+ request: Optional[Union[tag_keys.UpdateTagKeyRequest, dict]] = None,
+ *,
+ tag_key: Optional[tag_keys.TagKey] = None,
+ update_mask: Optional[field_mask_pb2.FieldMask] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Updates the attributes of the TagKey resource.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_update_tag_key():
+ # Create a client
+ client = resourcemanager_v3.TagKeysAsyncClient()
+
+ # Initialize request argument(s)
+ tag_key = resourcemanager_v3.TagKey()
+ tag_key.short_name = "short_name_value"
+
+ request = resourcemanager_v3.UpdateTagKeyRequest(
+ tag_key=tag_key,
+ )
+
+ # Make the request
+ operation = client.update_tag_key(request=request)
+
+ print("Waiting for operation to complete...")
+
+                response = await (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.UpdateTagKeyRequest, dict]]):
+ The request object. The request message for updating a
+ TagKey.
+ tag_key (:class:`google.cloud.resourcemanager_v3.types.TagKey`):
+ Required. The new definition of the TagKey. Only the
+ ``description`` and ``etag`` fields can be updated by
+ this request. If the ``etag`` field is not empty, it
+ must match the ``etag`` field of the existing tag key.
+ Otherwise, ``ABORTED`` will be returned.
+
+ This corresponds to the ``tag_key`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`):
+ Fields to be updated. The mask may only contain
+ ``description`` or ``etag``. If omitted entirely, both
+ ``description`` and ``etag`` are assumed to be
+ significant.
+
+ This corresponds to the ``update_mask`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.resourcemanager_v3.types.TagKey` A
+ TagKey, used to group a set of TagValues.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([tag_key, update_mask])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_keys.UpdateTagKeyRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if tag_key is not None:
+ request.tag_key = tag_key
+ if update_mask is not None:
+ request.update_mask = update_mask
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.update_tag_key,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("tag_key.name", request.tag_key.name),)
+ ),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ tag_keys.TagKey,
+ metadata_type=tag_keys.UpdateTagKeyMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def delete_tag_key(
+ self,
+ request: Optional[Union[tag_keys.DeleteTagKeyRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Deletes a TagKey. The TagKey cannot be deleted if it
+ has any child TagValues.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_delete_tag_key():
+ # Create a client
+ client = resourcemanager_v3.TagKeysAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.DeleteTagKeyRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.delete_tag_key(request=request)
+
+ print("Waiting for operation to complete...")
+
+                response = await (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.DeleteTagKeyRequest, dict]]):
+ The request object. The request message for deleting a
+ TagKey.
+ name (:class:`str`):
+ Required. The resource name of a TagKey to be deleted in
+ the format ``tagKeys/123``. The TagKey cannot be a
+ parent of any existing TagValues or it will not be
+ deleted successfully.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.resourcemanager_v3.types.TagKey` A
+ TagKey, used to group a set of TagValues.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_keys.DeleteTagKeyRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.delete_tag_key,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ tag_keys.TagKey,
+ metadata_type=tag_keys.DeleteTagKeyMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.GetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Gets the access control policy for a TagKey. The returned policy
+ may be empty if no such policy or resource exists. The
+ ``resource`` field should be the TagKey's resource name. For
+ example, "tagKeys/1234". The caller must have
+ ``cloudresourcemanager.googleapis.com/tagKeys.getIamPolicy``
+ permission on the specified TagKey.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ async def sample_get_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.TagKeysAsyncClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.GetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = await client.get_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.iam.v1.iam_policy_pb2.GetIamPolicyRequest, dict]]):
+ The request object. Request message for ``GetIamPolicy`` method.
+ resource (:class:`str`):
+ REQUIRED: The resource for which the
+ policy is being requested. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+ **JSON example:**
+
+                {
+                    "bindings": [
+                        {
+                            "role": "roles/resourcemanager.organizationAdmin",
+                            "members": [
+                                "user:mike@example.com",
+                                "group:admins@example.com",
+                                "domain:google.com",
+                                "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+                            ]
+                        },
+                        {
+                            "role": "roles/resourcemanager.organizationViewer",
+                            "members": [
+                                "user:eve@example.com"
+                            ],
+                            "condition": {
+                                "title": "expirable access",
+                                "description": "Does not grant access after Sep 2020",
+                                "expression": "request.time < timestamp('2020-10-01T00:00:00.000Z')"
+                            }
+                        }
+                    ],
+                    "etag": "BwWWja0YfJA=",
+                    "version": 3
+                }
+
+ **YAML example:**
+
+                bindings:
+                - members:
+                  - user:mike@example.com
+                  - group:admins@example.com
+                  - domain:google.com
+                  - serviceAccount:my-project-id@appspot.gserviceaccount.com
+                  role: roles/resourcemanager.organizationAdmin
+                - members:
+                  - user:eve@example.com
+                  role: roles/resourcemanager.organizationViewer
+                  condition:
+                    title: expirable access
+                    description: Does not grant access after Sep 2020
+                    expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+                etag: BwWWja0YfJA=
+                version: 3
+
+                For a description of IAM and its features, see the
+                [IAM documentation](https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = iam_policy_pb2.GetIamPolicyRequest(**request)
+ elif not request:
+ request = iam_policy_pb2.GetIamPolicyRequest(
+ resource=resource,
+ )
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_iam_policy,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def set_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.SetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Sets the access control policy on a TagKey, replacing any
+ existing policy. The ``resource`` field should be the TagKey's
+ resource name. For example, "tagKeys/1234". The caller must have
+ ``resourcemanager.tagKeys.setIamPolicy`` permission on the
+ identified tagValue.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ async def sample_set_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.TagKeysAsyncClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.SetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = await client.set_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.iam.v1.iam_policy_pb2.SetIamPolicyRequest, dict]]):
+ The request object. Request message for ``SetIamPolicy`` method.
+ resource (:class:`str`):
+ REQUIRED: The resource for which the
+ policy is being specified. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+                documentation](https://cloud.google.com/iam/help/conditions/resource-policies).
+
+ **JSON example:**
+
+                   {
+                     "bindings": [
+                       {
+                         "role": "roles/resourcemanager.organizationAdmin",
+                         "members": [
+                           "user:mike@example.com",
+                           "group:admins@example.com",
+                           "domain:google.com",
+                           "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+                         ]
+                       },
+                       {
+                         "role": "roles/resourcemanager.organizationViewer",
+                         "members": [
+                           "user:eve@example.com"
+                         ],
+                         "condition": {
+                           "title": "expirable access",
+                           "description": "Does not grant access after Sep 2020",
+                           "expression": "request.time < timestamp('2020-10-01T00:00:00.000Z')"
+                         }
+                       }
+                     ],
+                     "etag": "BwWWja0YfJA=",
+                     "version": 3
+                   }
+
+ **YAML example:**
+
+                   bindings:
+                   - members:
+                     - user:mike@example.com
+                     - group:admins@example.com
+                     - domain:google.com
+                     - serviceAccount:my-project-id@appspot.gserviceaccount.com
+                     role: roles/resourcemanager.organizationAdmin
+                   - members:
+                     - user:eve@example.com
+                     role: roles/resourcemanager.organizationViewer
+                     condition:
+                       title: expirable access
+                       description: Does not grant access after Sep 2020
+                       expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+                   etag: BwWWja0YfJA=
+                   version: 3
+
+                For a description of IAM and its features, see the
+                [IAM documentation](https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = iam_policy_pb2.SetIamPolicyRequest(**request)
+ elif not request:
+ request = iam_policy_pb2.SetIamPolicyRequest(
+ resource=resource,
+ )
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.set_iam_policy,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def test_iam_permissions(
+ self,
+ request: Optional[Union[iam_policy_pb2.TestIamPermissionsRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ permissions: Optional[MutableSequence[str]] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ r"""Returns permissions that a caller has on the specified TagKey.
+ The ``resource`` field should be the TagKey's resource name. For
+ example, "tagKeys/1234".
+
+ There are no permissions required for making this API call.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ async def sample_test_iam_permissions():
+ # Create a client
+ client = resourcemanager_v3.TagKeysAsyncClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.TestIamPermissionsRequest(
+ resource="resource_value",
+ permissions=['permissions_value1', 'permissions_value2'],
+ )
+
+ # Make the request
+ response = await client.test_iam_permissions(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.iam.v1.iam_policy_pb2.TestIamPermissionsRequest, dict]]):
+ The request object. Request message for ``TestIamPermissions`` method.
+ resource (:class:`str`):
+ REQUIRED: The resource for which the
+ policy detail is being requested. See
+ the operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ permissions (:class:`MutableSequence[str]`):
+ The set of permissions to check for the ``resource``.
+ Permissions with wildcards (such as '*' or 'storage.*')
+ are not allowed. For more information see `IAM
+ Overview <https://cloud.google.com/iam/docs/overview#permissions>`__.
+
+ This corresponds to the ``permissions`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.iam_policy_pb2.TestIamPermissionsResponse:
+ Response message for TestIamPermissions method.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource, permissions])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = iam_policy_pb2.TestIamPermissionsRequest(**request)
+ elif not request:
+ request = iam_policy_pb2.TestIamPermissionsRequest(
+ resource=resource,
+ permissions=permissions,
+ )
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.test_iam_permissions,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_operation(
+ self,
+ request: Optional[operations_pb2.GetOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Gets the latest state of a long-running operation.
+
+ Args:
+ request (:class:`~.operations_pb2.GetOperationRequest`):
+ The request object. Request message for
+ `GetOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.Operation:
+ An ``Operation`` object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.GetOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._client._transport.get_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def __aenter__(self):
+ return self
+
+ async def __aexit__(self, exc_type, exc, tb):
+ await self.transport.close()
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("TagKeysAsyncClient",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/client.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/client.py
@@ -0,0 +1,1660 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import os
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+ cast,
+)
+
+from google.api_core import client_options as client_options_lib
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.exceptions import MutualTLSChannelError # type: ignore
+from google.auth.transport import mtls # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.api_core import operation # type: ignore
+from google.api_core import operation_async # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.services.tag_keys import pagers
+from google.cloud.resourcemanager_v3.types import tag_keys
+
+from .transports.base import DEFAULT_CLIENT_INFO, TagKeysTransport
+from .transports.grpc import TagKeysGrpcTransport
+from .transports.grpc_asyncio import TagKeysGrpcAsyncIOTransport
+from .transports.rest import TagKeysRestTransport
+
+
+class TagKeysClientMeta(type):
+ """Metaclass for the TagKeys client.
+
+ This provides class-level methods for building and retrieving
+ support objects (e.g. transport) without polluting the client instance
+ objects.
+ """
+
+ _transport_registry = OrderedDict() # type: Dict[str, Type[TagKeysTransport]]
+ _transport_registry["grpc"] = TagKeysGrpcTransport
+ _transport_registry["grpc_asyncio"] = TagKeysGrpcAsyncIOTransport
+ _transport_registry["rest"] = TagKeysRestTransport
+
+ def get_transport_class(
+ cls,
+ label: Optional[str] = None,
+ ) -> Type[TagKeysTransport]:
+ """Returns an appropriate transport class.
+
+ Args:
+ label: The name of the desired transport. If none is
+ provided, then the first transport in the registry is used.
+
+ Returns:
+ The transport class to use.
+ """
+ # If a specific transport is requested, return that one.
+ if label:
+ return cls._transport_registry[label]
+
+ # No transport is requested; return the default (that is, the first one
+ # in the dictionary).
+ return next(iter(cls._transport_registry.values()))
+
+
+class TagKeysClient(metaclass=TagKeysClientMeta):
+ """Allow users to create and manage tag keys."""
+
+ @staticmethod
+ def _get_default_mtls_endpoint(api_endpoint):
+ """Converts api endpoint to mTLS endpoint.
+
+ Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to
+ "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively.
+ Args:
+ api_endpoint (Optional[str]): the api endpoint to convert.
+ Returns:
+ str: converted mTLS api endpoint.
+ """
+ if not api_endpoint:
+ return api_endpoint
+
+ mtls_endpoint_re = re.compile(
+ r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?"
+ )
+
+ m = mtls_endpoint_re.match(api_endpoint)
+ name, mtls, sandbox, googledomain = m.groups()
+ if mtls or not googledomain:
+ return api_endpoint
+
+ if sandbox:
+ return api_endpoint.replace(
+ "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
+ )
+
+ return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")
+
+ DEFAULT_ENDPOINT = "cloudresourcemanager.googleapis.com"
+ DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__( # type: ignore
+ DEFAULT_ENDPOINT
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ TagKeysClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_info(info)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ TagKeysClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_file(filename)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ from_service_account_json = from_service_account_file
+
+ @property
+ def transport(self) -> TagKeysTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ TagKeysTransport: The transport used by the client
+ instance.
+ """
+ return self._transport
+
+ @staticmethod
+ def tag_key_path(
+ tag_key: str,
+ ) -> str:
+ """Returns a fully-qualified tag_key string."""
+ return "tagKeys/{tag_key}".format(
+ tag_key=tag_key,
+ )
+
+ @staticmethod
+ def parse_tag_key_path(path: str) -> Dict[str, str]:
+ """Parses a tag_key path into its component segments."""
+ m = re.match(r"^tagKeys/(?P<tag_key>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_billing_account_path(
+ billing_account: str,
+ ) -> str:
+ """Returns a fully-qualified billing_account string."""
+ return "billingAccounts/{billing_account}".format(
+ billing_account=billing_account,
+ )
+
+ @staticmethod
+ def parse_common_billing_account_path(path: str) -> Dict[str, str]:
+ """Parse a billing_account path into its component segments."""
+ m = re.match(r"^billingAccounts/(?P<billing_account>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_folder_path(
+ folder: str,
+ ) -> str:
+ """Returns a fully-qualified folder string."""
+ return "folders/{folder}".format(
+ folder=folder,
+ )
+
+ @staticmethod
+ def parse_common_folder_path(path: str) -> Dict[str, str]:
+ """Parse a folder path into its component segments."""
+ m = re.match(r"^folders/(?P<folder>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_organization_path(
+ organization: str,
+ ) -> str:
+ """Returns a fully-qualified organization string."""
+ return "organizations/{organization}".format(
+ organization=organization,
+ )
+
+ @staticmethod
+ def parse_common_organization_path(path: str) -> Dict[str, str]:
+        """Parse an organization path into its component segments."""
+ m = re.match(r"^organizations/(?P<organization>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_project_path(
+ project: str,
+ ) -> str:
+ """Returns a fully-qualified project string."""
+ return "projects/{project}".format(
+ project=project,
+ )
+
+ @staticmethod
+ def parse_common_project_path(path: str) -> Dict[str, str]:
+ """Parse a project path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_location_path(
+ project: str,
+ location: str,
+ ) -> str:
+ """Returns a fully-qualified location string."""
+ return "projects/{project}/locations/{location}".format(
+ project=project,
+ location=location,
+ )
+
+ @staticmethod
+ def parse_common_location_path(path: str) -> Dict[str, str]:
+ """Parse a location path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[client_options_lib.ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ use_client_cert = os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false")
+ use_mtls_endpoint = os.getenv("GOOGLE_API_USE_MTLS_ENDPOINT", "auto")
+ if use_client_cert not in ("true", "false"):
+ raise ValueError(
+ "Environment variable `GOOGLE_API_USE_CLIENT_CERTIFICATE` must be either `true` or `false`"
+ )
+ if use_mtls_endpoint not in ("auto", "never", "always"):
+ raise MutualTLSChannelError(
+ "Environment variable `GOOGLE_API_USE_MTLS_ENDPOINT` must be `never`, `auto` or `always`"
+ )
+
+ # Figure out the client cert source to use.
+ client_cert_source = None
+ if use_client_cert == "true":
+ if client_options.client_cert_source:
+ client_cert_source = client_options.client_cert_source
+ elif mtls.has_default_client_cert_source():
+ client_cert_source = mtls.default_client_cert_source()
+
+ # Figure out which api endpoint to use.
+ if client_options.api_endpoint is not None:
+ api_endpoint = client_options.api_endpoint
+ elif use_mtls_endpoint == "always" or (
+ use_mtls_endpoint == "auto" and client_cert_source
+ ):
+ api_endpoint = cls.DEFAULT_MTLS_ENDPOINT
+ else:
+ api_endpoint = cls.DEFAULT_ENDPOINT
+
+ return api_endpoint, client_cert_source
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Optional[Union[str, TagKeysTransport]] = None,
+ client_options: Optional[Union[client_options_lib.ClientOptions, dict]] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the tag keys client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, TagKeysTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (Optional[Union[google.api_core.client_options.ClientOptions, dict]]): Custom options for the
+ client. It won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ if isinstance(client_options, dict):
+ client_options = client_options_lib.from_dict(client_options)
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ client_options = cast(client_options_lib.ClientOptions, client_options)
+
+ api_endpoint, client_cert_source_func = self.get_mtls_endpoint_and_cert_source(
+ client_options
+ )
+
+ api_key_value = getattr(client_options, "api_key", None)
+ if api_key_value and credentials:
+ raise ValueError(
+ "client_options.api_key and credentials are mutually exclusive"
+ )
+
+ # Save or instantiate the transport.
+ # Ordinarily, we provide the transport, but allowing a custom transport
+ # instance provides an extensibility point for unusual situations.
+ if isinstance(transport, TagKeysTransport):
+ # transport is a TagKeysTransport instance.
+ if credentials or client_options.credentials_file or api_key_value:
+ raise ValueError(
+ "When providing a transport instance, "
+ "provide its credentials directly."
+ )
+ if client_options.scopes:
+ raise ValueError(
+ "When providing a transport instance, provide its scopes "
+ "directly."
+ )
+ self._transport = transport
+ else:
+ import google.auth._default # type: ignore
+
+ if api_key_value and hasattr(
+ google.auth._default, "get_api_key_credentials"
+ ):
+ credentials = google.auth._default.get_api_key_credentials(
+ api_key_value
+ )
+
+ Transport = type(self).get_transport_class(transport)
+ self._transport = Transport(
+ credentials=credentials,
+ credentials_file=client_options.credentials_file,
+ host=api_endpoint,
+ scopes=client_options.scopes,
+ client_cert_source_for_mtls=client_cert_source_func,
+ quota_project_id=client_options.quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=True,
+ api_audience=client_options.api_audience,
+ )
+
+ def list_tag_keys(
+ self,
+ request: Optional[Union[tag_keys.ListTagKeysRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListTagKeysPager:
+ r"""Lists all TagKeys for a parent resource.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_list_tag_keys():
+ # Create a client
+ client = resourcemanager_v3.TagKeysClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.ListTagKeysRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_tag_keys(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.ListTagKeysRequest, dict]):
+ The request object. The request message for listing all
+ TagKeys under a parent resource.
+ parent (str):
+ Required. The resource name of the TagKey's parent. Must
+ be of the form ``organizations/{org_id}`` or
+ ``projects/{project_id}`` or
+ ``projects/{project_number}``
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.tag_keys.pagers.ListTagKeysPager:
+ The ListTagKeys response message.
+ Iterating over this object will yield
+ results and resolve additional pages
+ automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_keys.ListTagKeysRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_keys.ListTagKeysRequest):
+ request = tag_keys.ListTagKeysRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_tag_keys]
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListTagKeysPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_tag_key(
+ self,
+ request: Optional[Union[tag_keys.GetTagKeyRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_keys.TagKey:
+ r"""Retrieves a TagKey. This method will return
+ ``PERMISSION_DENIED`` if the key does not exist or the user does
+ not have permission to view it.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_get_tag_key():
+ # Create a client
+ client = resourcemanager_v3.TagKeysClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.GetTagKeyRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_tag_key(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.GetTagKeyRequest, dict]):
+ The request object. The request message for getting a
+ TagKey.
+ name (str):
+ Required. A resource name in the format
+ ``tagKeys/{id}``, such as ``tagKeys/123``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.types.TagKey:
+ A TagKey, used to group a set of
+ TagValues.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_keys.GetTagKeyRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_keys.GetTagKeyRequest):
+ request = tag_keys.GetTagKeyRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_tag_key]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_namespaced_tag_key(
+ self,
+ request: Optional[Union[tag_keys.GetNamespacedTagKeyRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_keys.TagKey:
+ r"""Retrieves a TagKey by its namespaced name. This method will
+ return ``PERMISSION_DENIED`` if the key does not exist or the
+ user does not have permission to view it.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_get_namespaced_tag_key():
+ # Create a client
+ client = resourcemanager_v3.TagKeysClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.GetNamespacedTagKeyRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_namespaced_tag_key(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.GetNamespacedTagKeyRequest, dict]):
+ The request object. The request message for getting a
+ TagKey by its namespaced name.
+ name (str):
+ Required. A namespaced tag key name in the format
+ ``{parentId}/{tagKeyShort}``, such as ``42/foo`` for a
+ key with short name "foo" under the organization with ID
+ 42 or ``r2-d2/bar`` for a key with short name "bar"
+ under the project ``r2-d2``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.types.TagKey:
+ A TagKey, used to group a set of
+ TagValues.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_keys.GetNamespacedTagKeyRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_keys.GetNamespacedTagKeyRequest):
+ request = tag_keys.GetNamespacedTagKeyRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_namespaced_tag_key]
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def create_tag_key(
+ self,
+ request: Optional[Union[tag_keys.CreateTagKeyRequest, dict]] = None,
+ *,
+ tag_key: Optional[tag_keys.TagKey] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Creates a new TagKey. If another request with the
+ same parameters is sent while the original request is in
+ process, the second request will receive an error. A
+ maximum of 1000 TagKeys can exist under a parent at any
+ given time.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_create_tag_key():
+ # Create a client
+ client = resourcemanager_v3.TagKeysClient()
+
+ # Initialize request argument(s)
+ tag_key = resourcemanager_v3.TagKey()
+ tag_key.short_name = "short_name_value"
+
+ request = resourcemanager_v3.CreateTagKeyRequest(
+ tag_key=tag_key,
+ )
+
+ # Make the request
+ operation = client.create_tag_key(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.CreateTagKeyRequest, dict]):
+ The request object. The request message for creating a
+ TagKey.
+ tag_key (google.cloud.resourcemanager_v3.types.TagKey):
+ Required. The TagKey to be created. Only fields
+ ``short_name``, ``description``, and ``parent`` are
+ considered during the creation request.
+
+ This corresponds to the ``tag_key`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.resourcemanager_v3.types.TagKey` A
+ TagKey, used to group a set of TagValues.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([tag_key])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_keys.CreateTagKeyRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_keys.CreateTagKeyRequest):
+ request = tag_keys.CreateTagKeyRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if tag_key is not None:
+ request.tag_key = tag_key
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.create_tag_key]
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ tag_keys.TagKey,
+ metadata_type=tag_keys.CreateTagKeyMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def update_tag_key(
+ self,
+ request: Optional[Union[tag_keys.UpdateTagKeyRequest, dict]] = None,
+ *,
+ tag_key: Optional[tag_keys.TagKey] = None,
+ update_mask: Optional[field_mask_pb2.FieldMask] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Updates the attributes of the TagKey resource.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_update_tag_key():
+ # Create a client
+ client = resourcemanager_v3.TagKeysClient()
+
+ # Initialize request argument(s)
+ tag_key = resourcemanager_v3.TagKey()
+ tag_key.short_name = "short_name_value"
+
+ request = resourcemanager_v3.UpdateTagKeyRequest(
+ tag_key=tag_key,
+ )
+
+ # Make the request
+ operation = client.update_tag_key(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.UpdateTagKeyRequest, dict]):
+ The request object. The request message for updating a
+ TagKey.
+ tag_key (google.cloud.resourcemanager_v3.types.TagKey):
+ Required. The new definition of the TagKey. Only the
+ ``description`` and ``etag`` fields can be updated by
+ this request. If the ``etag`` field is not empty, it
+ must match the ``etag`` field of the existing tag key.
+ Otherwise, ``ABORTED`` will be returned.
+
+ This corresponds to the ``tag_key`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Fields to be updated. The mask may only contain
+ ``description`` or ``etag``. If omitted entirely, both
+ ``description`` and ``etag`` are assumed to be
+ significant.
+
+ This corresponds to the ``update_mask`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.resourcemanager_v3.types.TagKey` A
+ TagKey, used to group a set of TagValues.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([tag_key, update_mask])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_keys.UpdateTagKeyRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_keys.UpdateTagKeyRequest):
+ request = tag_keys.UpdateTagKeyRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if tag_key is not None:
+ request.tag_key = tag_key
+ if update_mask is not None:
+ request.update_mask = update_mask
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.update_tag_key]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("tag_key.name", request.tag_key.name),)
+ ),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ tag_keys.TagKey,
+ metadata_type=tag_keys.UpdateTagKeyMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def delete_tag_key(
+ self,
+ request: Optional[Union[tag_keys.DeleteTagKeyRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Deletes a TagKey. The TagKey cannot be deleted if it
+ has any child TagValues.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_delete_tag_key():
+ # Create a client
+ client = resourcemanager_v3.TagKeysClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.DeleteTagKeyRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.delete_tag_key(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.DeleteTagKeyRequest, dict]):
+ The request object. The request message for deleting a
+ TagKey.
+ name (str):
+ Required. The resource name of a TagKey to be deleted in
+ the format ``tagKeys/123``. The TagKey cannot be a
+ parent of any existing TagValues or it will not be
+ deleted successfully.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be
+ :class:`google.cloud.resourcemanager_v3.types.TagKey` A
+ TagKey, used to group a set of TagValues.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_keys.DeleteTagKeyRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_keys.DeleteTagKeyRequest):
+ request = tag_keys.DeleteTagKeyRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.delete_tag_key]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ tag_keys.TagKey,
+ metadata_type=tag_keys.DeleteTagKeyMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.GetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Gets the access control policy for a TagKey. The returned policy
+ may be empty if no such policy or resource exists. The
+ ``resource`` field should be the TagKey's resource name. For
+ example, "tagKeys/1234". The caller must have
+ ``cloudresourcemanager.googleapis.com/tagKeys.getIamPolicy``
+ permission on the specified TagKey.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ def sample_get_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.TagKeysClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.GetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = client.get_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.iam.v1.iam_policy_pb2.GetIamPolicyRequest, dict]):
+ The request object. Request message for ``GetIamPolicy`` method.
+ resource (str):
+ REQUIRED: The resource for which the
+ policy is being requested. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+            **JSON example:**
+
+            ::
+
+                {
+                  "bindings": [
+                    {
+                      "role": "roles/resourcemanager.organizationAdmin",
+                      "members": [
+                        "user:mike@example.com",
+                        "group:admins@example.com",
+                        "domain:google.com",
+                        "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+                      ]
+                    },
+                    {
+                      "role": "roles/resourcemanager.organizationViewer",
+                      "members": [
+                        "user:eve@example.com"
+                      ],
+                      "condition": {
+                        "title": "expirable access",
+                        "description": "Does not grant access after Sep 2020",
+                        "expression": "request.time < timestamp('2020-10-01T00:00:00.000Z')"
+                      }
+                    }
+                  ],
+                  "etag": "BwWWja0YfJA=",
+                  "version": 3
+                }
+
+            **YAML example:**
+
+            ::
+
+                bindings:
+                - members:
+                  - user:mike@example.com
+                  - group:admins@example.com
+                  - domain:google.com
+                  - serviceAccount:my-project-id@appspot.gserviceaccount.com
+                  role: roles/resourcemanager.organizationAdmin
+                - members:
+                  - user:eve@example.com
+                  role: roles/resourcemanager.organizationViewer
+                  condition:
+                    title: expirable access
+                    description: Does not grant access after Sep 2020
+                    expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+                etag: BwWWja0YfJA=
+                version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ if isinstance(request, dict):
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ request = iam_policy_pb2.GetIamPolicyRequest(**request)
+ elif not request:
+ # Null request, just make one.
+ request = iam_policy_pb2.GetIamPolicyRequest()
+ if resource is not None:
+ request.resource = resource
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_iam_policy]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def set_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.SetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+        r"""Sets the access control policy on a TagKey, replacing any
+        existing policy. The ``resource`` field should be the TagKey's
+        resource name. For example, "tagKeys/1234". The caller must have
+        ``resourcemanager.tagKeys.setIamPolicy`` permission on the
+        identified TagKey.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ def sample_set_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.TagKeysClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.SetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = client.set_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.iam.v1.iam_policy_pb2.SetIamPolicyRequest, dict]):
+ The request object. Request message for ``SetIamPolicy`` method.
+ resource (str):
+ REQUIRED: The resource for which the
+ policy is being specified. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+            **JSON example:**
+
+            ::
+
+                {
+                  "bindings": [
+                    {
+                      "role": "roles/resourcemanager.organizationAdmin",
+                      "members": [
+                        "user:mike@example.com",
+                        "group:admins@example.com",
+                        "domain:google.com",
+                        "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+                      ]
+                    },
+                    {
+                      "role": "roles/resourcemanager.organizationViewer",
+                      "members": [
+                        "user:eve@example.com"
+                      ],
+                      "condition": {
+                        "title": "expirable access",
+                        "description": "Does not grant access after Sep 2020",
+                        "expression": "request.time < timestamp('2020-10-01T00:00:00.000Z')"
+                      }
+                    }
+                  ],
+                  "etag": "BwWWja0YfJA=",
+                  "version": 3
+                }
+
+            **YAML example:**
+
+            ::
+
+                bindings:
+                - members:
+                  - user:mike@example.com
+                  - group:admins@example.com
+                  - domain:google.com
+                  - serviceAccount:my-project-id@appspot.gserviceaccount.com
+                  role: roles/resourcemanager.organizationAdmin
+                - members:
+                  - user:eve@example.com
+                  role: roles/resourcemanager.organizationViewer
+                  condition:
+                    title: expirable access
+                    description: Does not grant access after Sep 2020
+                    expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+                etag: BwWWja0YfJA=
+                version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ if isinstance(request, dict):
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ request = iam_policy_pb2.SetIamPolicyRequest(**request)
+ elif not request:
+ # Null request, just make one.
+ request = iam_policy_pb2.SetIamPolicyRequest()
+ if resource is not None:
+ request.resource = resource
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.set_iam_policy]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def test_iam_permissions(
+ self,
+ request: Optional[Union[iam_policy_pb2.TestIamPermissionsRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ permissions: Optional[MutableSequence[str]] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ r"""Returns permissions that a caller has on the specified TagKey.
+ The ``resource`` field should be the TagKey's resource name. For
+ example, "tagKeys/1234".
+
+ There are no permissions required for making this API call.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ def sample_test_iam_permissions():
+ # Create a client
+ client = resourcemanager_v3.TagKeysClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.TestIamPermissionsRequest(
+ resource="resource_value",
+ permissions=['permissions_value1', 'permissions_value2'],
+ )
+
+ # Make the request
+ response = client.test_iam_permissions(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.iam.v1.iam_policy_pb2.TestIamPermissionsRequest, dict]):
+ The request object. Request message for ``TestIamPermissions`` method.
+ resource (str):
+ REQUIRED: The resource for which the
+ policy detail is being requested. See
+ the operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ permissions (MutableSequence[str]):
+ The set of permissions to check for the ``resource``.
+ Permissions with wildcards (such as '*' or 'storage.*')
+ are not allowed. For more information see `IAM
+ Overview <https://cloud.google.com/iam/docs/overview#permissions>`__.
+
+ This corresponds to the ``permissions`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.iam_policy_pb2.TestIamPermissionsResponse:
+ Response message for TestIamPermissions method.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource, permissions])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ if isinstance(request, dict):
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ request = iam_policy_pb2.TestIamPermissionsRequest(**request)
+ elif not request:
+ # Null request, just make one.
+ request = iam_policy_pb2.TestIamPermissionsRequest()
+ if resource is not None:
+ request.resource = resource
+ if permissions:
+ request.permissions.extend(permissions)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.test_iam_permissions]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def __enter__(self) -> "TagKeysClient":
+ return self
+
+ def __exit__(self, type, value, traceback):
+ """Releases underlying transport's resources.
+
+ .. warning::
+ ONLY use as a context manager if the transport is NOT shared
+ with other clients! Exiting the with block will CLOSE the transport
+ and may cause errors in other clients!
+ """
+ self.transport.close()
+
+ def get_operation(
+ self,
+ request: Optional[operations_pb2.GetOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Gets the latest state of a long-running operation.
+
+ Args:
+ request (:class:`~.operations_pb2.GetOperationRequest`):
+ The request object. Request message for
+ `GetOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.Operation:
+ An ``Operation`` object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.GetOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._transport.get_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("TagKeysClient",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/pagers.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/pagers.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/pagers.py
@@ -0,0 +1,155 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import (
+ Any,
+ AsyncIterator,
+ Awaitable,
+ Callable,
+ Iterator,
+ Optional,
+ Sequence,
+ Tuple,
+)
+
+from google.cloud.resourcemanager_v3.types import tag_keys
+
+
+class ListTagKeysPager:
+ """A pager for iterating through ``list_tag_keys`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.ListTagKeysResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``tag_keys`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListTagKeys`` requests and continue to iterate
+ through the ``tag_keys`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.ListTagKeysResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., tag_keys.ListTagKeysResponse],
+ request: tag_keys.ListTagKeysRequest,
+ response: tag_keys.ListTagKeysResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.ListTagKeysRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.ListTagKeysResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = tag_keys.ListTagKeysRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[tag_keys.ListTagKeysResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[tag_keys.TagKey]:
+ for page in self.pages:
+ yield from page.tag_keys
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListTagKeysAsyncPager:
+ """A pager for iterating through ``list_tag_keys`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.ListTagKeysResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``tag_keys`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListTagKeys`` requests and continue to iterate
+ through the ``tag_keys`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.ListTagKeysResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[tag_keys.ListTagKeysResponse]],
+ request: tag_keys.ListTagKeysRequest,
+ response: tag_keys.ListTagKeysResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.ListTagKeysRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.ListTagKeysResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = tag_keys.ListTagKeysRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[tag_keys.ListTagKeysResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[tag_keys.TagKey]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.tag_keys:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/transports/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/transports/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/transports/__init__.py
@@ -0,0 +1,36 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+from typing import Dict, Type
+
+from .base import TagKeysTransport
+from .grpc import TagKeysGrpcTransport
+from .grpc_asyncio import TagKeysGrpcAsyncIOTransport
+from .rest import TagKeysRestInterceptor, TagKeysRestTransport
+
+# Compile a registry of transports.
+_transport_registry = OrderedDict() # type: Dict[str, Type[TagKeysTransport]]
+_transport_registry["grpc"] = TagKeysGrpcTransport
+_transport_registry["grpc_asyncio"] = TagKeysGrpcAsyncIOTransport
+_transport_registry["rest"] = TagKeysRestTransport
+
+__all__ = (
+ "TagKeysTransport",
+ "TagKeysGrpcTransport",
+ "TagKeysGrpcAsyncIOTransport",
+ "TagKeysRestTransport",
+ "TagKeysRestInterceptor",
+)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/transports/base.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/transports/base.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/transports/base.py
@@ -0,0 +1,316 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import abc
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Union
+
+import google.api_core
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1, operations_v1
+from google.api_core import retry as retries
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+from google.cloud.resourcemanager_v3.types import tag_keys
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+class TagKeysTransport(abc.ABC):
+ """Abstract transport class for TagKeys."""
+
+ AUTH_SCOPES = (
+ "https://www.googleapis.com/auth/cloud-platform",
+ "https://www.googleapis.com/auth/cloud-platform.read-only",
+ )
+
+ DEFAULT_HOST: str = "cloudresourcemanager.googleapis.com"
+
+ def __init__(
+ self,
+ *,
+ host: str = DEFAULT_HOST,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ **kwargs,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+ scopes (Optional[Sequence[str]]): A list of scopes.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ """
+
+ scopes_kwargs = {"scopes": scopes, "default_scopes": self.AUTH_SCOPES}
+
+ # Save the scopes.
+ self._scopes = scopes
+
+ # If no credentials are provided, then determine the appropriate
+ # defaults.
+ if credentials and credentials_file:
+ raise core_exceptions.DuplicateCredentialArgs(
+ "'credentials_file' and 'credentials' are mutually exclusive"
+ )
+
+ if credentials_file is not None:
+ credentials, _ = google.auth.load_credentials_from_file(
+ credentials_file, **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ elif credentials is None:
+ credentials, _ = google.auth.default(
+ **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ # Don't apply the audience if the credentials file was passed in by the user.
+ if hasattr(credentials, "with_gdch_audience"):
+ credentials = credentials.with_gdch_audience(
+ api_audience if api_audience else host
+ )
+
+ # If the credentials are service account credentials, then always try to use self signed JWT.
+ if (
+ always_use_jwt_access
+ and isinstance(credentials, service_account.Credentials)
+ and hasattr(service_account.Credentials, "with_always_use_jwt_access")
+ ):
+ credentials = credentials.with_always_use_jwt_access(True)
+
+ # Save the credentials.
+ self._credentials = credentials
+
+ # Save the hostname. Default to port 443 (HTTPS) if none is specified.
+ if ":" not in host:
+ host += ":443"
+ self._host = host
+
+ def _prep_wrapped_messages(self, client_info):
+ # Precompute the wrapped methods.
+ self._wrapped_methods = {
+ self.list_tag_keys: gapic_v1.method.wrap_method(
+ self.list_tag_keys,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.get_tag_key: gapic_v1.method.wrap_method(
+ self.get_tag_key,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.get_namespaced_tag_key: gapic_v1.method.wrap_method(
+ self.get_namespaced_tag_key,
+ default_timeout=None,
+ client_info=client_info,
+ ),
+ self.create_tag_key: gapic_v1.method.wrap_method(
+ self.create_tag_key,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.update_tag_key: gapic_v1.method.wrap_method(
+ self.update_tag_key,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.delete_tag_key: gapic_v1.method.wrap_method(
+ self.delete_tag_key,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.get_iam_policy: gapic_v1.method.wrap_method(
+ self.get_iam_policy,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.set_iam_policy: gapic_v1.method.wrap_method(
+ self.set_iam_policy,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.test_iam_permissions: gapic_v1.method.wrap_method(
+ self.test_iam_permissions,
+ default_timeout=None,
+ client_info=client_info,
+ ),
+ }
+
+ def close(self):
+ """Closes resources associated with the transport.
+
+ .. warning::
+ Only call this method if the transport is NOT shared
+ with other clients - this may cause errors in other clients!
+ """
+ raise NotImplementedError()
+
+ @property
+ def operations_client(self):
+ """Return the client designed to process long-running operations."""
+ raise NotImplementedError()
+
+ @property
+ def list_tag_keys(
+ self,
+ ) -> Callable[
+ [tag_keys.ListTagKeysRequest],
+ Union[tag_keys.ListTagKeysResponse, Awaitable[tag_keys.ListTagKeysResponse]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_tag_key(
+ self,
+ ) -> Callable[
+ [tag_keys.GetTagKeyRequest], Union[tag_keys.TagKey, Awaitable[tag_keys.TagKey]]
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_namespaced_tag_key(
+ self,
+ ) -> Callable[
+ [tag_keys.GetNamespacedTagKeyRequest],
+ Union[tag_keys.TagKey, Awaitable[tag_keys.TagKey]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def create_tag_key(
+ self,
+ ) -> Callable[
+ [tag_keys.CreateTagKeyRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def update_tag_key(
+ self,
+ ) -> Callable[
+ [tag_keys.UpdateTagKeyRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def delete_tag_key(
+ self,
+ ) -> Callable[
+ [tag_keys.DeleteTagKeyRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.GetIamPolicyRequest],
+ Union[policy_pb2.Policy, Awaitable[policy_pb2.Policy]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.SetIamPolicyRequest],
+ Union[policy_pb2.Policy, Awaitable[policy_pb2.Policy]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ Union[
+ iam_policy_pb2.TestIamPermissionsResponse,
+ Awaitable[iam_policy_pb2.TestIamPermissionsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[
+ [operations_pb2.GetOperationRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def kind(self) -> str:
+ raise NotImplementedError()
+
+
+__all__ = ("TagKeysTransport",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/transports/grpc.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/transports/grpc.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/transports/grpc.py
@@ -0,0 +1,531 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers, operations_v1
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+import grpc # type: ignore
+
+from google.cloud.resourcemanager_v3.types import tag_keys
+
+from .base import DEFAULT_CLIENT_INFO, TagKeysTransport
+
+
+class TagKeysGrpcTransport(TagKeysTransport):
+ """gRPC backend transport for TagKeys.
+
+ Allow users to create and manage tag keys.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _stubs: Dict[str, Callable]
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[grpc.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+ scopes (Optional[Sequence[str]]): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ channel (Optional[grpc.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+ self._operations_client: Optional[operations_v1.OperationsClient] = None
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> grpc.Channel:
+ """Create and return a gRPC channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+ scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ grpc.Channel: A gRPC channel object.
+
+ Raises:
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+
+ return grpc_helpers.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ @property
+ def grpc_channel(self) -> grpc.Channel:
+ """Return the channel designed to connect to this service."""
+ return self._grpc_channel
+
+ @property
+ def operations_client(self) -> operations_v1.OperationsClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Quick check: Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ self._operations_client = operations_v1.OperationsClient(self.grpc_channel)
+
+ # Return the client from cache.
+ return self._operations_client
+
+ @property
+ def list_tag_keys(
+ self,
+ ) -> Callable[[tag_keys.ListTagKeysRequest], tag_keys.ListTagKeysResponse]:
+ r"""Return a callable for the list tag keys method over gRPC.
+
+ Lists all TagKeys for a parent resource.
+
+ Returns:
+ Callable[[~.ListTagKeysRequest],
+ ~.ListTagKeysResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_tag_keys" not in self._stubs:
+ self._stubs["list_tag_keys"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/ListTagKeys",
+ request_serializer=tag_keys.ListTagKeysRequest.serialize,
+ response_deserializer=tag_keys.ListTagKeysResponse.deserialize,
+ )
+ return self._stubs["list_tag_keys"]
+
+ @property
+ def get_tag_key(self) -> Callable[[tag_keys.GetTagKeyRequest], tag_keys.TagKey]:
+ r"""Return a callable for the get tag key method over gRPC.
+
+ Retrieves a TagKey. This method will return
+ ``PERMISSION_DENIED`` if the key does not exist or the user does
+ not have permission to view it.
+
+ Returns:
+ Callable[[~.GetTagKeyRequest],
+ ~.TagKey]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_tag_key" not in self._stubs:
+ self._stubs["get_tag_key"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/GetTagKey",
+ request_serializer=tag_keys.GetTagKeyRequest.serialize,
+ response_deserializer=tag_keys.TagKey.deserialize,
+ )
+ return self._stubs["get_tag_key"]
+
+ @property
+ def get_namespaced_tag_key(
+ self,
+ ) -> Callable[[tag_keys.GetNamespacedTagKeyRequest], tag_keys.TagKey]:
+ r"""Return a callable for the get namespaced tag key method over gRPC.
+
+ Retrieves a TagKey by its namespaced name. This method will
+ return ``PERMISSION_DENIED`` if the key does not exist or the
+ user does not have permission to view it.
+
+ Returns:
+ Callable[[~.GetNamespacedTagKeyRequest],
+ ~.TagKey]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_namespaced_tag_key" not in self._stubs:
+ self._stubs["get_namespaced_tag_key"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/GetNamespacedTagKey",
+ request_serializer=tag_keys.GetNamespacedTagKeyRequest.serialize,
+ response_deserializer=tag_keys.TagKey.deserialize,
+ )
+ return self._stubs["get_namespaced_tag_key"]
+
+ @property
+ def create_tag_key(
+ self,
+ ) -> Callable[[tag_keys.CreateTagKeyRequest], operations_pb2.Operation]:
+ r"""Return a callable for the create tag key method over gRPC.
+
+ Creates a new TagKey. If another request with the
+ same parameters is sent while the original request is in
+ process, the second request will receive an error. A
+ maximum of 1000 TagKeys can exist under a parent at any
+ given time.
+
+ Returns:
+ Callable[[~.CreateTagKeyRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_tag_key" not in self._stubs:
+ self._stubs["create_tag_key"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/CreateTagKey",
+ request_serializer=tag_keys.CreateTagKeyRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["create_tag_key"]
+
+ @property
+ def update_tag_key(
+ self,
+ ) -> Callable[[tag_keys.UpdateTagKeyRequest], operations_pb2.Operation]:
+ r"""Return a callable for the update tag key method over gRPC.
+
+ Updates the attributes of the TagKey resource.
+
+ Returns:
+ Callable[[~.UpdateTagKeyRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_tag_key" not in self._stubs:
+ self._stubs["update_tag_key"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/UpdateTagKey",
+ request_serializer=tag_keys.UpdateTagKeyRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["update_tag_key"]
+
+ @property
+ def delete_tag_key(
+ self,
+ ) -> Callable[[tag_keys.DeleteTagKeyRequest], operations_pb2.Operation]:
+ r"""Return a callable for the delete tag key method over gRPC.
+
+ Deletes a TagKey. The TagKey cannot be deleted if it
+ has any child TagValues.
+
+ Returns:
+ Callable[[~.DeleteTagKeyRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_tag_key" not in self._stubs:
+ self._stubs["delete_tag_key"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/DeleteTagKey",
+ request_serializer=tag_keys.DeleteTagKeyRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["delete_tag_key"]
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.GetIamPolicyRequest], policy_pb2.Policy]:
+ r"""Return a callable for the get iam policy method over gRPC.
+
+ Gets the access control policy for a TagKey. The returned policy
+ may be empty if no such policy or resource exists. The
+ ``resource`` field should be the TagKey's resource name. For
+ example, "tagKeys/1234". The caller must have
+ ``cloudresourcemanager.googleapis.com/tagKeys.getIamPolicy``
+ permission on the specified TagKey.
+
+ Returns:
+ Callable[[~.GetIamPolicyRequest],
+ ~.Policy]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_iam_policy" not in self._stubs:
+ self._stubs["get_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/GetIamPolicy",
+ request_serializer=iam_policy_pb2.GetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["get_iam_policy"]
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.SetIamPolicyRequest], policy_pb2.Policy]:
+ r"""Return a callable for the set iam policy method over gRPC.
+
+ Sets the access control policy on a TagKey, replacing any
+ existing policy. The ``resource`` field should be the TagKey's
+ resource name. For example, "tagKeys/1234". The caller must have
+ ``resourcemanager.tagKeys.setIamPolicy`` permission on the
+ identified tagValue.
+
+ Returns:
+ Callable[[~.SetIamPolicyRequest],
+ ~.Policy]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "set_iam_policy" not in self._stubs:
+ self._stubs["set_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/SetIamPolicy",
+ request_serializer=iam_policy_pb2.SetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["set_iam_policy"]
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ iam_policy_pb2.TestIamPermissionsResponse,
+ ]:
+ r"""Return a callable for the test iam permissions method over gRPC.
+
+ Returns permissions that a caller has on the specified TagKey.
+ The ``resource`` field should be the TagKey's resource name. For
+ example, "tagKeys/1234".
+
+ There are no permissions required for making this API call.
+
+ Returns:
+ Callable[[~.TestIamPermissionsRequest],
+ ~.TestIamPermissionsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "test_iam_permissions" not in self._stubs:
+ self._stubs["test_iam_permissions"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/TestIamPermissions",
+ request_serializer=iam_policy_pb2.TestIamPermissionsRequest.SerializeToString,
+ response_deserializer=iam_policy_pb2.TestIamPermissionsResponse.FromString,
+ )
+ return self._stubs["test_iam_permissions"]
+
+ def close(self):
+ self.grpc_channel.close()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]:
+ r"""Return a callable for the get_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_operation" not in self._stubs:
+ self._stubs["get_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/GetOperation",
+ request_serializer=operations_pb2.GetOperationRequest.SerializeToString,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["get_operation"]
+
+ @property
+ def kind(self) -> str:
+ return "grpc"
+
+
+__all__ = ("TagKeysGrpcTransport",)
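
Every RPC property in the transport above follows the same memoized-stub pattern: on first access it asks the channel for a `unary_unary` stub, caches it in `self._stubs`, and returns the cached callable on every later access. A minimal, self-contained sketch of that pattern — `FakeChannel` and `DemoTransport` are illustrative names, not part of the google-cloud-resource-manager API:

```python
class FakeChannel:
    """Stands in for a grpc.Channel; counts how many stubs it creates."""

    def __init__(self):
        self.created = 0

    def unary_unary(self, method, request_serializer=None, response_deserializer=None):
        self.created += 1
        # A real channel returns a callable that performs the RPC over the
        # wire; here we just return a function that echoes the method path.
        return lambda request: (method, request)


class DemoTransport:
    def __init__(self, channel):
        self.grpc_channel = channel
        self._stubs = {}

    @property
    def list_tag_keys(self):
        # Create the stub on first access and cache it, exactly as the
        # generated properties do with ``self._stubs``.
        if "list_tag_keys" not in self._stubs:
            self._stubs["list_tag_keys"] = self.grpc_channel.unary_unary(
                "/google.cloud.resourcemanager.v3.TagKeys/ListTagKeys"
            )
        return self._stubs["list_tag_keys"]


channel = FakeChannel()
transport = DemoTransport(channel)
transport.list_tag_keys  # first access creates the stub
transport.list_tag_keys  # second access reuses the cached stub
print(channel.created)  # -> 1
```

The payoff is that a client can hold one transport and repeatedly invoke `transport.list_tag_keys(request)` without paying stub-construction cost per call.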
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/transports/grpc_asyncio.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/transports/grpc_asyncio.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/transports/grpc_asyncio.py
@@ -0,0 +1,536 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers_async, operations_v1
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+import grpc # type: ignore
+from grpc.experimental import aio # type: ignore
+
+from google.cloud.resourcemanager_v3.types import tag_keys
+
+from .base import DEFAULT_CLIENT_INFO, TagKeysTransport
+from .grpc import TagKeysGrpcTransport
+
+
+class TagKeysGrpcAsyncIOTransport(TagKeysTransport):
+ """gRPC AsyncIO backend transport for TagKeys.
+
+ Allow users to create and manage tag keys.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _grpc_channel: aio.Channel
+ _stubs: Dict[str, Callable] = {}
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> aio.Channel:
+ """Create and return a gRPC AsyncIO channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+            This argument is mutually exclusive with credentials.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ aio.Channel: A gRPC AsyncIO channel object.
+ """
+
+ return grpc_helpers_async.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[aio.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ channel (Optional[aio.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+ self._operations_client: Optional[operations_v1.OperationsAsyncClient] = None
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def grpc_channel(self) -> aio.Channel:
+        """Return the channel designed to connect to this service.
+
+ This property caches on the instance; repeated calls return
+ the same channel.
+ """
+ # Return the channel from cache.
+ return self._grpc_channel
+
+ @property
+ def operations_client(self) -> operations_v1.OperationsAsyncClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Quick check: Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ self._operations_client = operations_v1.OperationsAsyncClient(
+ self.grpc_channel
+ )
+
+ # Return the client from cache.
+ return self._operations_client
+
+ @property
+ def list_tag_keys(
+ self,
+ ) -> Callable[
+ [tag_keys.ListTagKeysRequest], Awaitable[tag_keys.ListTagKeysResponse]
+ ]:
+ r"""Return a callable for the list tag keys method over gRPC.
+
+ Lists all TagKeys for a parent resource.
+
+ Returns:
+ Callable[[~.ListTagKeysRequest],
+ Awaitable[~.ListTagKeysResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_tag_keys" not in self._stubs:
+ self._stubs["list_tag_keys"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/ListTagKeys",
+ request_serializer=tag_keys.ListTagKeysRequest.serialize,
+ response_deserializer=tag_keys.ListTagKeysResponse.deserialize,
+ )
+ return self._stubs["list_tag_keys"]
+
+ @property
+ def get_tag_key(
+ self,
+ ) -> Callable[[tag_keys.GetTagKeyRequest], Awaitable[tag_keys.TagKey]]:
+ r"""Return a callable for the get tag key method over gRPC.
+
+ Retrieves a TagKey. This method will return
+ ``PERMISSION_DENIED`` if the key does not exist or the user does
+ not have permission to view it.
+
+ Returns:
+ Callable[[~.GetTagKeyRequest],
+ Awaitable[~.TagKey]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_tag_key" not in self._stubs:
+ self._stubs["get_tag_key"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/GetTagKey",
+ request_serializer=tag_keys.GetTagKeyRequest.serialize,
+ response_deserializer=tag_keys.TagKey.deserialize,
+ )
+ return self._stubs["get_tag_key"]
+
+ @property
+ def get_namespaced_tag_key(
+ self,
+ ) -> Callable[[tag_keys.GetNamespacedTagKeyRequest], Awaitable[tag_keys.TagKey]]:
+ r"""Return a callable for the get namespaced tag key method over gRPC.
+
+ Retrieves a TagKey by its namespaced name. This method will
+ return ``PERMISSION_DENIED`` if the key does not exist or the
+ user does not have permission to view it.
+
+ Returns:
+ Callable[[~.GetNamespacedTagKeyRequest],
+ Awaitable[~.TagKey]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_namespaced_tag_key" not in self._stubs:
+ self._stubs["get_namespaced_tag_key"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/GetNamespacedTagKey",
+ request_serializer=tag_keys.GetNamespacedTagKeyRequest.serialize,
+ response_deserializer=tag_keys.TagKey.deserialize,
+ )
+ return self._stubs["get_namespaced_tag_key"]
+
+ @property
+ def create_tag_key(
+ self,
+ ) -> Callable[[tag_keys.CreateTagKeyRequest], Awaitable[operations_pb2.Operation]]:
+ r"""Return a callable for the create tag key method over gRPC.
+
+ Creates a new TagKey. If another request with the
+ same parameters is sent while the original request is in
+ process, the second request will receive an error. A
+ maximum of 1000 TagKeys can exist under a parent at any
+ given time.
+
+ Returns:
+ Callable[[~.CreateTagKeyRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_tag_key" not in self._stubs:
+ self._stubs["create_tag_key"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/CreateTagKey",
+ request_serializer=tag_keys.CreateTagKeyRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["create_tag_key"]
+
+ @property
+ def update_tag_key(
+ self,
+ ) -> Callable[[tag_keys.UpdateTagKeyRequest], Awaitable[operations_pb2.Operation]]:
+ r"""Return a callable for the update tag key method over gRPC.
+
+ Updates the attributes of the TagKey resource.
+
+ Returns:
+ Callable[[~.UpdateTagKeyRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_tag_key" not in self._stubs:
+ self._stubs["update_tag_key"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/UpdateTagKey",
+ request_serializer=tag_keys.UpdateTagKeyRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["update_tag_key"]
+
+ @property
+ def delete_tag_key(
+ self,
+ ) -> Callable[[tag_keys.DeleteTagKeyRequest], Awaitable[operations_pb2.Operation]]:
+ r"""Return a callable for the delete tag key method over gRPC.
+
+ Deletes a TagKey. The TagKey cannot be deleted if it
+ has any child TagValues.
+
+ Returns:
+ Callable[[~.DeleteTagKeyRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_tag_key" not in self._stubs:
+ self._stubs["delete_tag_key"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/DeleteTagKey",
+ request_serializer=tag_keys.DeleteTagKeyRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["delete_tag_key"]
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.GetIamPolicyRequest], Awaitable[policy_pb2.Policy]]:
+ r"""Return a callable for the get iam policy method over gRPC.
+
+ Gets the access control policy for a TagKey. The returned policy
+ may be empty if no such policy or resource exists. The
+ ``resource`` field should be the TagKey's resource name. For
+ example, "tagKeys/1234". The caller must have
+ ``cloudresourcemanager.googleapis.com/tagKeys.getIamPolicy``
+ permission on the specified TagKey.
+
+ Returns:
+ Callable[[~.GetIamPolicyRequest],
+ Awaitable[~.Policy]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_iam_policy" not in self._stubs:
+ self._stubs["get_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/GetIamPolicy",
+ request_serializer=iam_policy_pb2.GetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["get_iam_policy"]
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.SetIamPolicyRequest], Awaitable[policy_pb2.Policy]]:
+ r"""Return a callable for the set iam policy method over gRPC.
+
+ Sets the access control policy on a TagKey, replacing any
+ existing policy. The ``resource`` field should be the TagKey's
+ resource name. For example, "tagKeys/1234". The caller must have
+ ``resourcemanager.tagKeys.setIamPolicy`` permission on the
+ identified tagValue.
+
+ Returns:
+ Callable[[~.SetIamPolicyRequest],
+ Awaitable[~.Policy]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "set_iam_policy" not in self._stubs:
+ self._stubs["set_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/SetIamPolicy",
+ request_serializer=iam_policy_pb2.SetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["set_iam_policy"]
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ Awaitable[iam_policy_pb2.TestIamPermissionsResponse],
+ ]:
+ r"""Return a callable for the test iam permissions method over gRPC.
+
+ Returns permissions that a caller has on the specified TagKey.
+ The ``resource`` field should be the TagKey's resource name. For
+ example, "tagKeys/1234".
+
+ There are no permissions required for making this API call.
+
+ Returns:
+ Callable[[~.TestIamPermissionsRequest],
+ Awaitable[~.TestIamPermissionsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "test_iam_permissions" not in self._stubs:
+ self._stubs["test_iam_permissions"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagKeys/TestIamPermissions",
+ request_serializer=iam_policy_pb2.TestIamPermissionsRequest.SerializeToString,
+ response_deserializer=iam_policy_pb2.TestIamPermissionsResponse.FromString,
+ )
+ return self._stubs["test_iam_permissions"]
+
+ def close(self):
+ return self.grpc_channel.close()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]:
+ r"""Return a callable for the get_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_operation" not in self._stubs:
+ self._stubs["get_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/GetOperation",
+ request_serializer=operations_pb2.GetOperationRequest.SerializeToString,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["get_operation"]
+
+
+__all__ = ("TagKeysGrpcAsyncIOTransport",)
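
Both the sync and async transports also share a lazy-singleton property for `operations_client`: the long-running-operations client is built only on first access, and every later access returns the same cached instance. A minimal sketch of that behavior — `OperationsClientStub` is an illustrative stand-in, not the real `operations_v1` client:

```python
class OperationsClientStub:
    """Counts instantiations so the caching behavior is observable."""

    instances = 0

    def __init__(self, channel):
        OperationsClientStub.instances += 1
        self.channel = channel


class DemoTransport:
    def __init__(self, channel):
        self.grpc_channel = channel
        self._operations_client = None

    @property
    def operations_client(self):
        # Quick check: only create a new client if we do not already have one,
        # mirroring the generated ``operations_client`` properties above.
        if self._operations_client is None:
            self._operations_client = OperationsClientStub(self.grpc_channel)
        return self._operations_client


transport = DemoTransport(channel="fake-channel")
first = transport.operations_client
second = transport.operations_client
print(first is second)  # -> True
print(OperationsClientStub.instances)  # -> 1
```

Because `CreateTagKey`, `UpdateTagKey`, and `DeleteTagKey` all return `google.longrunning.Operation`, this single cached client is what polls those operations to completion.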
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/transports/rest.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/transports/rest.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_keys/transports/rest.py
@@ -0,0 +1,1637 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import dataclasses
+import json # type: ignore
+import re
+from typing import Any, Callable, Dict, List, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import (
+ gapic_v1,
+ operations_v1,
+ path_template,
+ rest_helpers,
+ rest_streaming,
+)
+from google.api_core import exceptions as core_exceptions
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.auth.transport.requests import AuthorizedSession # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import json_format
+import grpc # type: ignore
+from requests import __version__ as requests_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.types import tag_keys
+
+from .base import DEFAULT_CLIENT_INFO as BASE_DEFAULT_CLIENT_INFO
+from .base import TagKeysTransport
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=BASE_DEFAULT_CLIENT_INFO.gapic_version,
+ grpc_version=None,
+ rest_version=requests_version,
+)
+
+
+class TagKeysRestInterceptor:
+ """Interceptor for TagKeys.
+
+ Interceptors are used to manipulate requests, request metadata, and responses
+ in arbitrary ways.
+ Example use cases include:
+ * Logging
+ * Verifying requests according to service or custom semantics
+ * Stripping extraneous information from responses
+
+ These use cases and more can be enabled by injecting an
+ instance of a custom subclass when constructing the TagKeysRestTransport.
+
+ .. code-block:: python
+ class MyCustomTagKeysInterceptor(TagKeysRestInterceptor):
+ def pre_create_tag_key(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_create_tag_key(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_delete_tag_key(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_delete_tag_key(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_iam_policy(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_iam_policy(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_namespaced_tag_key(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_namespaced_tag_key(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_tag_key(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_tag_key(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_tag_keys(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_tag_keys(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_set_iam_policy(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_set_iam_policy(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_test_iam_permissions(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_test_iam_permissions(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_update_tag_key(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_update_tag_key(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ transport = TagKeysRestTransport(interceptor=MyCustomTagKeysInterceptor())
+ client = TagKeysClient(transport=transport)
+
+
+ """
+
+ def pre_create_tag_key(
+ self, request: tag_keys.CreateTagKeyRequest, metadata: Sequence[Tuple[str, str]]
+ ) -> Tuple[tag_keys.CreateTagKeyRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for create_tag_key
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagKeys server.
+ """
+ return request, metadata
+
+ def post_create_tag_key(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for create_tag_key
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagKeys server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_delete_tag_key(
+ self, request: tag_keys.DeleteTagKeyRequest, metadata: Sequence[Tuple[str, str]]
+ ) -> Tuple[tag_keys.DeleteTagKeyRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for delete_tag_key
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagKeys server.
+ """
+ return request, metadata
+
+ def post_delete_tag_key(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for delete_tag_key
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagKeys server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_iam_policy(
+ self,
+ request: iam_policy_pb2.GetIamPolicyRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[iam_policy_pb2.GetIamPolicyRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_iam_policy
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagKeys server.
+ """
+ return request, metadata
+
+ def post_get_iam_policy(self, response: policy_pb2.Policy) -> policy_pb2.Policy:
+ """Post-rpc interceptor for get_iam_policy
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagKeys server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_namespaced_tag_key(
+ self,
+ request: tag_keys.GetNamespacedTagKeyRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[tag_keys.GetNamespacedTagKeyRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_namespaced_tag_key
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagKeys server.
+ """
+ return request, metadata
+
+ def post_get_namespaced_tag_key(self, response: tag_keys.TagKey) -> tag_keys.TagKey:
+ """Post-rpc interceptor for get_namespaced_tag_key
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagKeys server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_tag_key(
+ self, request: tag_keys.GetTagKeyRequest, metadata: Sequence[Tuple[str, str]]
+ ) -> Tuple[tag_keys.GetTagKeyRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_tag_key
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagKeys server.
+ """
+ return request, metadata
+
+ def post_get_tag_key(self, response: tag_keys.TagKey) -> tag_keys.TagKey:
+ """Post-rpc interceptor for get_tag_key
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagKeys server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_tag_keys(
+ self, request: tag_keys.ListTagKeysRequest, metadata: Sequence[Tuple[str, str]]
+ ) -> Tuple[tag_keys.ListTagKeysRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_tag_keys
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagKeys server.
+ """
+ return request, metadata
+
+ def post_list_tag_keys(
+ self, response: tag_keys.ListTagKeysResponse
+ ) -> tag_keys.ListTagKeysResponse:
+ """Post-rpc interceptor for list_tag_keys
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagKeys server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_set_iam_policy(
+ self,
+ request: iam_policy_pb2.SetIamPolicyRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[iam_policy_pb2.SetIamPolicyRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for set_iam_policy
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagKeys server.
+ """
+ return request, metadata
+
+ def post_set_iam_policy(self, response: policy_pb2.Policy) -> policy_pb2.Policy:
+ """Post-rpc interceptor for set_iam_policy
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagKeys server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_test_iam_permissions(
+ self,
+ request: iam_policy_pb2.TestIamPermissionsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[iam_policy_pb2.TestIamPermissionsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for test_iam_permissions
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagKeys server.
+ """
+ return request, metadata
+
+ def post_test_iam_permissions(
+ self, response: iam_policy_pb2.TestIamPermissionsResponse
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ """Post-rpc interceptor for test_iam_permissions
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagKeys server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_update_tag_key(
+ self, request: tag_keys.UpdateTagKeyRequest, metadata: Sequence[Tuple[str, str]]
+ ) -> Tuple[tag_keys.UpdateTagKeyRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for update_tag_key
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagKeys server.
+ """
+ return request, metadata
+
+ def post_update_tag_key(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for update_tag_key
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagKeys server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_operation(
+ self,
+ request: operations_pb2.GetOperationRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[operations_pb2.GetOperationRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_operation
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagKeys server.
+ """
+ return request, metadata
+
+ def post_get_operation(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for get_operation
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagKeys server but before
+ it is returned to user code.
+ """
+ return response
+
+
+@dataclasses.dataclass
+class TagKeysRestStub:
+ _session: AuthorizedSession
+ _host: str
+ _interceptor: TagKeysRestInterceptor
+
+
+class TagKeysRestTransport(TagKeysTransport):
+ """REST backend transport for TagKeys.
+
+ Allow users to create and manage tag keys.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends JSON representations of protocol buffers over HTTP/1.1
+
+ """
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ url_scheme: str = "https",
+ interceptor: Optional[TagKeysRestInterceptor] = None,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+ scopes (Optional(Sequence[str])): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Callable[[], Tuple[bytes, bytes]]): Client
+ certificate to configure mutual TLS HTTP channel. It is ignored
+ if ``channel`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you are developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ url_scheme: the protocol scheme for the API endpoint. Normally
+ "https", but for testing or local servers,
+ "http" can be specified.
+ """
+ # Run the base constructor
+ # TODO(yon-mg): resolve other ctor params i.e. scopes, quota, etc.
+ # TODO: When custom host (api_endpoint) is set, `scopes` must *also* be set on the
+ # credentials object
+ maybe_url_match = re.match("^(?P<scheme>http(?:s)?://)?(?P<host>.*)$", host)
+ if maybe_url_match is None:
+ raise ValueError(
+ f"Unexpected hostname structure: {host}"
+ ) # pragma: NO COVER
+
+ url_match_items = maybe_url_match.groupdict()
+
+ host = f"{url_scheme}://{host}" if not url_match_items["scheme"] else host
+
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+ self._session = AuthorizedSession(
+ self._credentials, default_host=self.DEFAULT_HOST
+ )
+ self._operations_client: Optional[operations_v1.AbstractOperationsClient] = None
+ if client_cert_source_for_mtls:
+ self._session.configure_mtls_channel(client_cert_source_for_mtls)
+ self._interceptor = interceptor or TagKeysRestInterceptor()
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def operations_client(self) -> operations_v1.AbstractOperationsClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ http_options: Dict[str, List[Dict[str, str]]] = {
+ "google.longrunning.Operations.GetOperation": [
+ {
+ "method": "get",
+ "uri": "/v3/{name=operations/**}",
+ },
+ ],
+ }
+
+ rest_transport = operations_v1.OperationsRestTransport(
+ host=self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ scopes=self._scopes,
+ http_options=http_options,
+ path_prefix="v3",
+ )
+
+ self._operations_client = operations_v1.AbstractOperationsClient(
+ transport=rest_transport
+ )
+
+ # Return the client from cache.
+ return self._operations_client
+
+ class _CreateTagKey(TagKeysRestStub):
+ def __hash__(self):
+ return hash("CreateTagKey")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_keys.CreateTagKeyRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the create tag key method over HTTP.
+
+ Args:
+ request (~.tag_keys.CreateTagKeyRequest):
+ The request object. The request message for creating a
+ TagKey.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/tagKeys",
+ "body": "tag_key",
+ },
+ ]
+ request, metadata = self._interceptor.pre_create_tag_key(request, metadata)
+ pb_request = tag_keys.CreateTagKeyRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_create_tag_key(resp)
+ return resp
+
+ class _DeleteTagKey(TagKeysRestStub):
+ def __hash__(self):
+ return hash("DeleteTagKey")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_keys.DeleteTagKeyRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the delete tag key method over HTTP.
+
+ Args:
+ request (~.tag_keys.DeleteTagKeyRequest):
+ The request object. The request message for deleting a
+ TagKey.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "delete",
+ "uri": "/v3/{name=tagKeys/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_delete_tag_key(request, metadata)
+ pb_request = tag_keys.DeleteTagKeyRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_delete_tag_key(resp)
+ return resp
+
+ class _GetIamPolicy(TagKeysRestStub):
+ def __hash__(self):
+ return hash("GetIamPolicy")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: iam_policy_pb2.GetIamPolicyRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Call the get iam policy method over HTTP.
+
+ Args:
+ request (~.iam_policy_pb2.GetIamPolicyRequest):
+ The request object. Request message for ``GetIamPolicy`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which
+ specifies access controls for Google Cloud resources.
+
+ A ``Policy`` is a collection of ``bindings``. A
+ ``binding`` binds one or more ``members``, or
+ principals, to a single ``role``. Principals can be user
+ accounts, service accounts, Google groups, and domains
+ (such as G Suite). A ``role`` is a named list of
+ permissions; each ``role`` can be an IAM predefined role
+ or a user-created custom role.
+
+ For some types of Google Cloud resources, a ``binding``
+ can also specify a ``condition``, which is a logical
+ expression that allows access to a resource only if the
+ expression evaluates to ``true``. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the `IAM
+ documentation <https://cloud.google.com/iam/help/conditions/resource-policies>`__.
+
+ **JSON example:**
+
+ ::
+
+ {
+ "bindings": [
+ {
+ "role": "roles/resourcemanager.organizationAdmin",
+ "members": [
+ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+ },
+ {
+ "role": "roles/resourcemanager.organizationViewer",
+ "members": [
+ "user:eve@example.com"
+ ],
+ "condition": {
+ "title": "expirable access",
+ "description": "Does not grant access after Sep 2020",
+ "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')",
+ }
+ }
+ ],
+ "etag": "BwWWja0YfJA=",
+ "version": 3
+ }
+
+ **YAML example:**
+
+ ::
+
+ bindings:
+ - members:
+ - user:mike@example.com
+ - group:admins@example.com
+ - domain:google.com
+ - serviceAccount:my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin
+ - members:
+ - user:eve@example.com
+ role: roles/resourcemanager.organizationViewer
+ condition:
+ title: expirable access
+ description: Does not grant access after Sep 2020
+ expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+ etag: BwWWja0YfJA=
+ version: 3
+
+ For a description of IAM and its features, see the `IAM
+ documentation <https://cloud.google.com/iam/docs/>`__.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{resource=tagKeys/*}:getIamPolicy",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_iam_policy(request, metadata)
+ pb_request = request
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = policy_pb2.Policy()
+ pb_resp = resp
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_iam_policy(resp)
+ return resp
+
+ class _GetNamespacedTagKey(TagKeysRestStub):
+ def __hash__(self):
+ return hash("GetNamespacedTagKey")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "name": "",
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_keys.GetNamespacedTagKeyRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_keys.TagKey:
+ r"""Call the get namespaced tag key method over HTTP.
+
+ Args:
+ request (~.tag_keys.GetNamespacedTagKeyRequest):
+ The request object. The request message for getting a
+ TagKey by its namespaced name.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.tag_keys.TagKey:
+ A TagKey, used to group a set of
+ TagValues.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/tagKeys/namespaced",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_namespaced_tag_key(
+ request, metadata
+ )
+ pb_request = tag_keys.GetNamespacedTagKeyRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = tag_keys.TagKey()
+ pb_resp = tag_keys.TagKey.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_namespaced_tag_key(resp)
+ return resp
+
+ class _GetTagKey(TagKeysRestStub):
+ def __hash__(self):
+ return hash("GetTagKey")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_keys.GetTagKeyRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_keys.TagKey:
+ r"""Call the get tag key method over HTTP.
+
+ Args:
+ request (~.tag_keys.GetTagKeyRequest):
+ The request object. The request message for getting a
+ TagKey.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.tag_keys.TagKey:
+ A TagKey, used to group a set of
+ TagValues.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/{name=tagKeys/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_tag_key(request, metadata)
+ pb_request = tag_keys.GetTagKeyRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = tag_keys.TagKey()
+ pb_resp = tag_keys.TagKey.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_tag_key(resp)
+ return resp
+
+ class _ListTagKeys(TagKeysRestStub):
+ def __hash__(self):
+ return hash("ListTagKeys")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "parent": "",
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_keys.ListTagKeysRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_keys.ListTagKeysResponse:
+ r"""Call the list tag keys method over HTTP.
+
+ Args:
+ request (~.tag_keys.ListTagKeysRequest):
+ The request object. The request message for listing all
+ TagKeys under a parent resource.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.tag_keys.ListTagKeysResponse:
+ The ListTagKeys response message.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/tagKeys",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_tag_keys(request, metadata)
+ pb_request = tag_keys.ListTagKeysRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = tag_keys.ListTagKeysResponse()
+ pb_resp = tag_keys.ListTagKeysResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_tag_keys(resp)
+ return resp
+
+ class _SetIamPolicy(TagKeysRestStub):
+ def __hash__(self):
+ return hash("SetIamPolicy")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: iam_policy_pb2.SetIamPolicyRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Call the set iam policy method over HTTP.
+
+ Args:
+ request (~.iam_policy_pb2.SetIamPolicyRequest):
+ The request object. Request message for ``SetIamPolicy`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which
+ specifies access controls for Google Cloud resources.
+
+ A ``Policy`` is a collection of ``bindings``. A
+ ``binding`` binds one or more ``members``, or
+ principals, to a single ``role``. Principals can be user
+ accounts, service accounts, Google groups, and domains
+ (such as G Suite). A ``role`` is a named list of
+ permissions; each ``role`` can be an IAM predefined role
+ or a user-created custom role.
+
+ For some types of Google Cloud resources, a ``binding``
+ can also specify a ``condition``, which is a logical
+ expression that allows access to a resource only if the
+ expression evaluates to ``true``. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the `IAM
+ documentation <https://cloud.google.com/iam/help/conditions/resource-policies>`__.
+
+ **JSON example:**
+
+ ::
+
+ {
+ "bindings": [
+ {
+ "role": "roles/resourcemanager.organizationAdmin",
+ "members": [
+ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+ },
+ {
+ "role": "roles/resourcemanager.organizationViewer",
+ "members": [
+ "user:eve@example.com"
+ ],
+ "condition": {
+ "title": "expirable access",
+ "description": "Does not grant access after Sep 2020",
+ "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')",
+ }
+ }
+ ],
+ "etag": "BwWWja0YfJA=",
+ "version": 3
+ }
+
+ **YAML example:**
+
+ ::
+
+ bindings:
+ - members:
+ - user:mike@example.com
+ - group:admins@example.com
+ - domain:google.com
+ - serviceAccount:my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin
+ - members:
+ - user:eve@example.com
+ role: roles/resourcemanager.organizationViewer
+ condition:
+ title: expirable access
+ description: Does not grant access after Sep 2020
+ expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+ etag: BwWWja0YfJA=
+ version: 3
+
+ For a description of IAM and its features, see the `IAM
+ documentation <https://cloud.google.com/iam/docs/>`__.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{resource=tagKeys/*}:setIamPolicy",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_set_iam_policy(request, metadata)
+ pb_request = request
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = policy_pb2.Policy()
+ pb_resp = resp
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_set_iam_policy(resp)
+ return resp
+
+ class _TestIamPermissions(TagKeysRestStub):
+ def __hash__(self):
+ return hash("TestIamPermissions")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: iam_policy_pb2.TestIamPermissionsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ r"""Call the test iam permissions method over HTTP.
+
+ Args:
+ request (~.iam_policy_pb2.TestIamPermissionsRequest):
+ The request object. Request message for ``TestIamPermissions`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.iam_policy_pb2.TestIamPermissionsResponse:
+ Response message for ``TestIamPermissions`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{resource=tagKeys/*}:testIamPermissions",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_test_iam_permissions(
+ request, metadata
+ )
+ pb_request = request
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = iam_policy_pb2.TestIamPermissionsResponse()
+ pb_resp = resp
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_test_iam_permissions(resp)
+ return resp
+
+ class _UpdateTagKey(TagKeysRestStub):
+ def __hash__(self):
+ return hash("UpdateTagKey")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_keys.UpdateTagKeyRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the update tag key method over HTTP.
+
+ Args:
+ request (~.tag_keys.UpdateTagKeyRequest):
+ The request object. The request message for updating a
+ TagKey.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "patch",
+ "uri": "/v3/{tag_key.name=tagKeys/*}",
+ "body": "tag_key",
+ },
+ ]
+ request, metadata = self._interceptor.pre_update_tag_key(request, metadata)
+ pb_request = tag_keys.UpdateTagKeyRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_update_tag_key(resp)
+ return resp
+
+ @property
+ def create_tag_key(
+ self,
+ ) -> Callable[[tag_keys.CreateTagKeyRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._CreateTagKey(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def delete_tag_key(
+ self,
+ ) -> Callable[[tag_keys.DeleteTagKeyRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._DeleteTagKey(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.GetIamPolicyRequest], policy_pb2.Policy]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetIamPolicy(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_namespaced_tag_key(
+ self,
+ ) -> Callable[[tag_keys.GetNamespacedTagKeyRequest], tag_keys.TagKey]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetNamespacedTagKey(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_tag_key(self) -> Callable[[tag_keys.GetTagKeyRequest], tag_keys.TagKey]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetTagKey(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_tag_keys(
+ self,
+ ) -> Callable[[tag_keys.ListTagKeysRequest], tag_keys.ListTagKeysResponse]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListTagKeys(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.SetIamPolicyRequest], policy_pb2.Policy]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._SetIamPolicy(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ iam_policy_pb2.TestIamPermissionsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._TestIamPermissions(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def update_tag_key(
+ self,
+ ) -> Callable[[tag_keys.UpdateTagKeyRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._UpdateTagKey(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_operation(self):
+ return self._GetOperation(self._session, self._host, self._interceptor) # type: ignore
+
+ class _GetOperation(TagKeysRestStub):
+ def __call__(
+ self,
+ request: operations_pb2.GetOperationRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the get operation method over HTTP.
+
+ Args:
+ request (operations_pb2.GetOperationRequest):
+ The request object for GetOperation method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ operations_pb2.Operation: Response from GetOperation method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/{name=operations/**}",
+ },
+ ]
+
+ request, metadata = self._interceptor.pre_get_operation(request, metadata)
+ request_kwargs = json_format.MessageToDict(request)
+ transcoded_request = path_template.transcode(http_options, **request_kwargs)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(json.dumps(transcoded_request["query_params"]))
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ resp = operations_pb2.Operation()
+ resp = json_format.Parse(response.content.decode("utf-8"), resp)
+ resp = self._interceptor.post_get_operation(resp)
+ return resp
+
+ @property
+ def kind(self) -> str:
+ return "rest"
+
+ def close(self):
+ self._session.close()
+
+
+__all__ = ("TagKeysRestTransport",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/__init__.py
@@ -0,0 +1,22 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .async_client import TagValuesAsyncClient
+from .client import TagValuesClient
+
+__all__ = (
+ "TagValuesClient",
+ "TagValuesAsyncClient",
+)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/async_client.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/async_client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/async_client.py
@@ -0,0 +1,1469 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import functools
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+)
+
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.api_core.client_options import ClientOptions
+from google.auth import credentials as ga_credentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.api_core import operation # type: ignore
+from google.api_core import operation_async # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.services.tag_values import pagers
+from google.cloud.resourcemanager_v3.types import tag_values
+
+from .client import TagValuesClient
+from .transports.base import DEFAULT_CLIENT_INFO, TagValuesTransport
+from .transports.grpc_asyncio import TagValuesGrpcAsyncIOTransport
+
+
+class TagValuesAsyncClient:
+ """Allow users to create and manage tag values."""
+
+ _client: TagValuesClient
+
+ DEFAULT_ENDPOINT = TagValuesClient.DEFAULT_ENDPOINT
+ DEFAULT_MTLS_ENDPOINT = TagValuesClient.DEFAULT_MTLS_ENDPOINT
+
+ tag_value_path = staticmethod(TagValuesClient.tag_value_path)
+ parse_tag_value_path = staticmethod(TagValuesClient.parse_tag_value_path)
+ common_billing_account_path = staticmethod(
+ TagValuesClient.common_billing_account_path
+ )
+ parse_common_billing_account_path = staticmethod(
+ TagValuesClient.parse_common_billing_account_path
+ )
+ common_folder_path = staticmethod(TagValuesClient.common_folder_path)
+ parse_common_folder_path = staticmethod(TagValuesClient.parse_common_folder_path)
+ common_organization_path = staticmethod(TagValuesClient.common_organization_path)
+ parse_common_organization_path = staticmethod(
+ TagValuesClient.parse_common_organization_path
+ )
+ common_project_path = staticmethod(TagValuesClient.common_project_path)
+ parse_common_project_path = staticmethod(TagValuesClient.parse_common_project_path)
+ common_location_path = staticmethod(TagValuesClient.common_location_path)
+ parse_common_location_path = staticmethod(
+ TagValuesClient.parse_common_location_path
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ TagValuesAsyncClient: The constructed client.
+ """
+ return TagValuesClient.from_service_account_info.__func__(TagValuesAsyncClient, info, *args, **kwargs) # type: ignore
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ TagValuesAsyncClient: The constructed client.
+ """
+ return TagValuesClient.from_service_account_file.__func__(TagValuesAsyncClient, filename, *args, **kwargs) # type: ignore
+
+ from_service_account_json = from_service_account_file
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+        (2) if `GOOGLE_API_USE_MTLS_ENDPOINT` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ return TagValuesClient.get_mtls_endpoint_and_cert_source(client_options) # type: ignore
+
+ @property
+ def transport(self) -> TagValuesTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ TagValuesTransport: The transport used by the client instance.
+ """
+ return self._client.transport
+
+ get_transport_class = functools.partial(
+ type(TagValuesClient).get_transport_class, type(TagValuesClient)
+ )
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Union[str, TagValuesTransport] = "grpc_asyncio",
+ client_options: Optional[ClientOptions] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the tag values client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, ~.TagValuesTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (ClientOptions): Custom options for the client. It
+ won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+
+ Raises:
+            google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ self._client = TagValuesClient(
+ credentials=credentials,
+ transport=transport,
+ client_options=client_options,
+ client_info=client_info,
+ )
+
+ async def list_tag_values(
+ self,
+ request: Optional[Union[tag_values.ListTagValuesRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListTagValuesAsyncPager:
+ r"""Lists all TagValues for a specific TagKey.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_list_tag_values():
+ # Create a client
+ client = resourcemanager_v3.TagValuesAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.ListTagValuesRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_tag_values(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.ListTagValuesRequest, dict]]):
+                The request object. The request message for listing
+                TagValues for the specified TagKey.
+            parent (:class:`str`):
+                Required. Resource name for the parent TagKey, whose
+                TagValues are to be listed, in the format
+                ``tagKeys/123``.
+                This corresponds to the ``parent`` field
+                on the ``request`` instance; if ``request`` is provided, this
+                should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.tag_values.pagers.ListTagValuesAsyncPager:
+ The ListTagValues response.
+ Iterating over this object will yield
+ results and resolve additional pages
+ automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_values.ListTagValuesRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_tag_values,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListTagValuesAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_tag_value(
+ self,
+ request: Optional[Union[tag_values.GetTagValueRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_values.TagValue:
+ r"""Retrieves a TagValue. This method will return
+ ``PERMISSION_DENIED`` if the value does not exist or the user
+ does not have permission to view it.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_get_tag_value():
+ # Create a client
+ client = resourcemanager_v3.TagValuesAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.GetTagValueRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_tag_value(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.GetTagValueRequest, dict]]):
+ The request object. The request message for getting a
+ TagValue.
+ name (:class:`str`):
+ Required. Resource name for TagValue to be fetched in
+ the format ``tagValues/456``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.types.TagValue:
+ A TagValue is a child of a particular
+ TagKey. This is used to group cloud
+ resources for the purpose of controlling
+ them using policies.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_values.GetTagValueRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_tag_value,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_namespaced_tag_value(
+ self,
+ request: Optional[Union[tag_values.GetNamespacedTagValueRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_values.TagValue:
+ r"""Retrieves a TagValue by its namespaced name. This method will
+ return ``PERMISSION_DENIED`` if the value does not exist or the
+ user does not have permission to view it.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_get_namespaced_tag_value():
+ # Create a client
+ client = resourcemanager_v3.TagValuesAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.GetNamespacedTagValueRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_namespaced_tag_value(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.GetNamespacedTagValueRequest, dict]]):
+ The request object. The request message for getting a
+ TagValue by its namespaced name.
+ name (:class:`str`):
+ Required. A namespaced tag value name in the following
+ format:
+
+ ``{parentId}/{tagKeyShort}/{tagValueShort}``
+
+ Examples:
+
+ - ``42/foo/abc`` for a value with short name "abc"
+ under the key with short name "foo" under the
+ organization with ID 42
+ - ``r2-d2/bar/xyz`` for a value with short name "xyz"
+ under the key with short name "bar" under the project
+ with ID "r2-d2"
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.types.TagValue:
+ A TagValue is a child of a particular
+ TagKey. This is used to group cloud
+ resources for the purpose of controlling
+ them using policies.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_values.GetNamespacedTagValueRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_namespaced_tag_value,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def create_tag_value(
+ self,
+ request: Optional[Union[tag_values.CreateTagValueRequest, dict]] = None,
+ *,
+ tag_value: Optional[tag_values.TagValue] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Creates a TagValue as a child of the specified
+ TagKey. If a another request with the same parameters is
+ sent while the original request is in process the second
+ request will receive an error. A maximum of 1000
+ TagValues can exist under a TagKey at any given time.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_create_tag_value():
+ # Create a client
+ client = resourcemanager_v3.TagValuesAsyncClient()
+
+ # Initialize request argument(s)
+ tag_value = resourcemanager_v3.TagValue()
+ tag_value.short_name = "short_name_value"
+
+ request = resourcemanager_v3.CreateTagValueRequest(
+ tag_value=tag_value,
+ )
+
+ # Make the request
+ operation = client.create_tag_value(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.CreateTagValueRequest, dict]]):
+ The request object. The request message for creating a
+ TagValue.
+ tag_value (:class:`google.cloud.resourcemanager_v3.types.TagValue`):
+ Required. The TagValue to be created. Only fields
+ ``short_name``, ``description``, and ``parent`` are
+ considered during the creation request.
+
+ This corresponds to the ``tag_value`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.TagValue` A TagValue is a child of a particular TagKey. This is used to group
+ cloud resources for the purpose of controlling them
+ using policies.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([tag_value])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_values.CreateTagValueRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if tag_value is not None:
+ request.tag_value = tag_value
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.create_tag_value,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ tag_values.TagValue,
+ metadata_type=tag_values.CreateTagValueMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def update_tag_value(
+ self,
+ request: Optional[Union[tag_values.UpdateTagValueRequest, dict]] = None,
+ *,
+ tag_value: Optional[tag_values.TagValue] = None,
+ update_mask: Optional[field_mask_pb2.FieldMask] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Updates the attributes of the TagValue resource.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_update_tag_value():
+ # Create a client
+ client = resourcemanager_v3.TagValuesAsyncClient()
+
+ # Initialize request argument(s)
+ tag_value = resourcemanager_v3.TagValue()
+ tag_value.short_name = "short_name_value"
+
+ request = resourcemanager_v3.UpdateTagValueRequest(
+ tag_value=tag_value,
+ )
+
+ # Make the request
+ operation = client.update_tag_value(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.UpdateTagValueRequest, dict]]):
+ The request object. The request message for updating a
+ TagValue.
+ tag_value (:class:`google.cloud.resourcemanager_v3.types.TagValue`):
+                Required. The new definition of the TagValue. Only
+                the ``description`` and ``etag`` fields can be
+                updated by this request. If the ``etag`` field is
+                nonempty, it must match the ``etag`` field of the
+                existing TagValue. Otherwise, ``ABORTED`` will be
+                returned.
+
+ This corresponds to the ``tag_value`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`):
+ Optional. Fields to be updated.
+ This corresponds to the ``update_mask`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.TagValue` A TagValue is a child of a particular TagKey. This is used to group
+ cloud resources for the purpose of controlling them
+ using policies.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([tag_value, update_mask])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_values.UpdateTagValueRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if tag_value is not None:
+ request.tag_value = tag_value
+ if update_mask is not None:
+ request.update_mask = update_mask
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.update_tag_value,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("tag_value.name", request.tag_value.name),)
+ ),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ tag_values.TagValue,
+ metadata_type=tag_values.UpdateTagValueMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def delete_tag_value(
+ self,
+ request: Optional[Union[tag_values.DeleteTagValueRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation_async.AsyncOperation:
+ r"""Deletes a TagValue. The TagValue cannot have any
+ bindings when it is deleted.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ async def sample_delete_tag_value():
+ # Create a client
+ client = resourcemanager_v3.TagValuesAsyncClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.DeleteTagValueRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.delete_tag_value(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = (await operation).result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.resourcemanager_v3.types.DeleteTagValueRequest, dict]]):
+ The request object. The request message for deleting a
+ TagValue.
+            name (:class:`str`):
+                Required. Resource name for TagValue to be deleted
+                in the format ``tagValues/456``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation_async.AsyncOperation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.TagValue` A TagValue is a child of a particular TagKey. This is used to group
+ cloud resources for the purpose of controlling them
+ using policies.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = tag_values.DeleteTagValueRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.delete_tag_value,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation_async.from_gapic(
+ response,
+ self._client._transport.operations_client,
+ tag_values.TagValue,
+ metadata_type=tag_values.DeleteTagValueMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.GetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Gets the access control policy for a TagValue. The returned
+ policy may be empty if no such policy or resource exists. The
+ ``resource`` field should be the TagValue's resource name. For
+ example: ``tagValues/1234``. The caller must have the
+ ``cloudresourcemanager.googleapis.com/tagValues.getIamPolicy``
+ permission on the identified TagValue to get the access control
+ policy.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ async def sample_get_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.TagValuesAsyncClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.GetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = await client.get_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.iam.v1.iam_policy_pb2.GetIamPolicyRequest, dict]]):
+ The request object. Request message for ``GetIamPolicy`` method.
+ resource (:class:`str`):
+ REQUIRED: The resource for which the
+ policy is being requested. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+ **JSON example:**
+
+ {
+ "bindings": [
+ {
+ "role":
+ "roles/resourcemanager.organizationAdmin",
+ "members": [ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+
+ }, { "role":
+ "roles/resourcemanager.organizationViewer",
+ "members": [ "user:eve@example.com" ],
+ "condition": { "title": "expirable access",
+ "description": "Does not grant access after
+ Sep 2020", "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')", } }
+
+ ], "etag": "BwWWja0YfJA=", "version": 3
+
+ }
+
+ **YAML example:**
+
+ bindings: - members: - user:\ mike@example.com -
+ group:\ admins@example.com - domain:google.com -
+ serviceAccount:\ my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin -
+ members: - user:\ eve@example.com role:
+ roles/resourcemanager.organizationViewer
+ condition: title: expirable access description:
+ Does not grant access after Sep 2020 expression:
+ request.time <
+ timestamp('2020-10-01T00:00:00.000Z') etag:
+ BwWWja0YfJA= version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = iam_policy_pb2.GetIamPolicyRequest(**request)
+ elif not request:
+ request = iam_policy_pb2.GetIamPolicyRequest(
+ resource=resource,
+ )
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_iam_policy,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def set_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.SetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Sets the access control policy on a TagValue, replacing any
+ existing policy. The ``resource`` field should be the TagValue's
+ resource name. For example: ``tagValues/1234``. The caller must
+ have ``resourcemanager.tagValues.setIamPolicy`` permission on
+ the identified tagValue.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ async def sample_set_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.TagValuesAsyncClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.SetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = await client.set_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.iam.v1.iam_policy_pb2.SetIamPolicyRequest, dict]]):
+ The request object. Request message for ``SetIamPolicy`` method.
+ resource (:class:`str`):
+ REQUIRED: The resource for which the
+ policy is being specified. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+ **JSON example:**
+
+ {
+ "bindings": [
+ {
+ "role":
+ "roles/resourcemanager.organizationAdmin",
+ "members": [ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+
+ }, { "role":
+ "roles/resourcemanager.organizationViewer",
+ "members": [ "user:eve@example.com" ],
+ "condition": { "title": "expirable access",
+ "description": "Does not grant access after
+ Sep 2020", "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')", } }
+
+ ], "etag": "BwWWja0YfJA=", "version": 3
+
+ }
+
+ **YAML example:**
+
+ bindings: - members: - user:\ mike@example.com -
+ group:\ admins@example.com - domain:google.com -
+ serviceAccount:\ my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin -
+ members: - user:\ eve@example.com role:
+ roles/resourcemanager.organizationViewer
+ condition: title: expirable access description:
+ Does not grant access after Sep 2020 expression:
+ request.time <
+ timestamp('2020-10-01T00:00:00.000Z') etag:
+ BwWWja0YfJA= version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = iam_policy_pb2.SetIamPolicyRequest(**request)
+ elif not request:
+ request = iam_policy_pb2.SetIamPolicyRequest(
+ resource=resource,
+ )
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.set_iam_policy,
+ default_timeout=60.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def test_iam_permissions(
+ self,
+ request: Optional[Union[iam_policy_pb2.TestIamPermissionsRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ permissions: Optional[MutableSequence[str]] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ r"""Returns permissions that a caller has on the specified TagValue.
+ The ``resource`` field should be the TagValue's resource name.
+ For example: ``tagValues/1234``.
+
+ There are no permissions required for making this API call.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ async def sample_test_iam_permissions():
+ # Create a client
+ client = resourcemanager_v3.TagValuesAsyncClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.TestIamPermissionsRequest(
+ resource="resource_value",
+ permissions=['permissions_value1', 'permissions_value2'],
+ )
+
+ # Make the request
+ response = await client.test_iam_permissions(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.iam.v1.iam_policy_pb2.TestIamPermissionsRequest, dict]]):
+ The request object. Request message for ``TestIamPermissions`` method.
+ resource (:class:`str`):
+ REQUIRED: The resource for which the
+ policy detail is being requested. See
+ the operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ permissions (:class:`MutableSequence[str]`):
+ The set of permissions to check for the ``resource``.
+ Permissions with wildcards (such as '*' or 'storage.*')
+ are not allowed. For more information see `IAM
+ Overview <https://cloud.google.com/iam/docs/overview#permissions>`__.
+
+ This corresponds to the ``permissions`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.iam_policy_pb2.TestIamPermissionsResponse:
+ Response message for TestIamPermissions method.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource, permissions])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = iam_policy_pb2.TestIamPermissionsRequest(**request)
+ elif not request:
+ request = iam_policy_pb2.TestIamPermissionsRequest(
+ resource=resource,
+ permissions=permissions,
+ )
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.test_iam_permissions,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_operation(
+ self,
+ request: Optional[operations_pb2.GetOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Gets the latest state of a long-running operation.
+
+ Args:
+ request (:class:`~.operations_pb2.GetOperationRequest`):
+ The request object. Request message for
+ `GetOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.Operation:
+ An ``Operation`` object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.GetOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._client._transport.get_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def __aenter__(self):
+ return self
+
+ async def __aexit__(self, exc_type, exc, tb):
+ await self.transport.close()
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("TagValuesAsyncClient",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/client.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/client.py
@@ -0,0 +1,1667 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import os
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+ cast,
+)
+
+from google.api_core import client_options as client_options_lib
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.exceptions import MutualTLSChannelError # type: ignore
+from google.auth.transport import mtls # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.api_core import operation # type: ignore
+from google.api_core import operation_async # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.services.tag_values import pagers
+from google.cloud.resourcemanager_v3.types import tag_values
+
+from .transports.base import DEFAULT_CLIENT_INFO, TagValuesTransport
+from .transports.grpc import TagValuesGrpcTransport
+from .transports.grpc_asyncio import TagValuesGrpcAsyncIOTransport
+from .transports.rest import TagValuesRestTransport
+
+
+class TagValuesClientMeta(type):
+ """Metaclass for the TagValues client.
+
+ This provides class-level methods for building and retrieving
+ support objects (e.g. transport) without polluting the client instance
+ objects.
+ """
+
+ _transport_registry = OrderedDict() # type: Dict[str, Type[TagValuesTransport]]
+ _transport_registry["grpc"] = TagValuesGrpcTransport
+ _transport_registry["grpc_asyncio"] = TagValuesGrpcAsyncIOTransport
+ _transport_registry["rest"] = TagValuesRestTransport
+
+ def get_transport_class(
+ cls,
+ label: Optional[str] = None,
+ ) -> Type[TagValuesTransport]:
+ """Returns an appropriate transport class.
+
+ Args:
+ label: The name of the desired transport. If none is
+ provided, then the first transport in the registry is used.
+
+ Returns:
+ The transport class to use.
+ """
+ # If a specific transport is requested, return that one.
+ if label:
+ return cls._transport_registry[label]
+
+ # No transport is requested; return the default (that is, the first one
+ # in the dictionary).
+ return next(iter(cls._transport_registry.values()))
+
+
+class TagValuesClient(metaclass=TagValuesClientMeta):
+ """Allow users to create and manage tag values."""
+
+ @staticmethod
+ def _get_default_mtls_endpoint(api_endpoint):
+ """Converts api endpoint to mTLS endpoint.
+
+ Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to
+ "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively.
+ Args:
+ api_endpoint (Optional[str]): the api endpoint to convert.
+ Returns:
+ str: converted mTLS api endpoint.
+ """
+ if not api_endpoint:
+ return api_endpoint
+
+ mtls_endpoint_re = re.compile(
+ r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?"
+ )
+
+ m = mtls_endpoint_re.match(api_endpoint)
+ name, mtls, sandbox, googledomain = m.groups()
+ if mtls or not googledomain:
+ return api_endpoint
+
+ if sandbox:
+ return api_endpoint.replace(
+ "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
+ )
+
+ return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")
+
+ DEFAULT_ENDPOINT = "cloudresourcemanager.googleapis.com"
+ DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__( # type: ignore
+ DEFAULT_ENDPOINT
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ TagValuesClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_info(info)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ TagValuesClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_file(filename)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ from_service_account_json = from_service_account_file
+
+ @property
+ def transport(self) -> TagValuesTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ TagValuesTransport: The transport used by the client
+ instance.
+ """
+ return self._transport
+
+ @staticmethod
+ def tag_value_path(
+ tag_value: str,
+ ) -> str:
+ """Returns a fully-qualified tag_value string."""
+ return "tagValues/{tag_value}".format(
+ tag_value=tag_value,
+ )
+
+ @staticmethod
+ def parse_tag_value_path(path: str) -> Dict[str, str]:
+ """Parses a tag_value path into its component segments."""
+ m = re.match(r"^tagValues/(?P<tag_value>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_billing_account_path(
+ billing_account: str,
+ ) -> str:
+ """Returns a fully-qualified billing_account string."""
+ return "billingAccounts/{billing_account}".format(
+ billing_account=billing_account,
+ )
+
+ @staticmethod
+ def parse_common_billing_account_path(path: str) -> Dict[str, str]:
+ """Parse a billing_account path into its component segments."""
+ m = re.match(r"^billingAccounts/(?P<billing_account>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_folder_path(
+ folder: str,
+ ) -> str:
+ """Returns a fully-qualified folder string."""
+ return "folders/{folder}".format(
+ folder=folder,
+ )
+
+ @staticmethod
+ def parse_common_folder_path(path: str) -> Dict[str, str]:
+ """Parse a folder path into its component segments."""
+ m = re.match(r"^folders/(?P<folder>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_organization_path(
+ organization: str,
+ ) -> str:
+ """Returns a fully-qualified organization string."""
+ return "organizations/{organization}".format(
+ organization=organization,
+ )
+
+ @staticmethod
+ def parse_common_organization_path(path: str) -> Dict[str, str]:
+        """Parse an organization path into its component segments."""
+ m = re.match(r"^organizations/(?P<organization>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_project_path(
+ project: str,
+ ) -> str:
+ """Returns a fully-qualified project string."""
+ return "projects/{project}".format(
+ project=project,
+ )
+
+ @staticmethod
+ def parse_common_project_path(path: str) -> Dict[str, str]:
+ """Parse a project path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_location_path(
+ project: str,
+ location: str,
+ ) -> str:
+ """Returns a fully-qualified location string."""
+ return "projects/{project}/locations/{location}".format(
+ project=project,
+ location=location,
+ )
+
+ @staticmethod
+ def parse_common_location_path(path: str) -> Dict[str, str]:
+ """Parse a location path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[client_options_lib.ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ use_client_cert = os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false")
+ use_mtls_endpoint = os.getenv("GOOGLE_API_USE_MTLS_ENDPOINT", "auto")
+ if use_client_cert not in ("true", "false"):
+ raise ValueError(
+ "Environment variable `GOOGLE_API_USE_CLIENT_CERTIFICATE` must be either `true` or `false`"
+ )
+ if use_mtls_endpoint not in ("auto", "never", "always"):
+ raise MutualTLSChannelError(
+ "Environment variable `GOOGLE_API_USE_MTLS_ENDPOINT` must be `never`, `auto` or `always`"
+ )
+
+ # Figure out the client cert source to use.
+ client_cert_source = None
+ if use_client_cert == "true":
+ if client_options.client_cert_source:
+ client_cert_source = client_options.client_cert_source
+ elif mtls.has_default_client_cert_source():
+ client_cert_source = mtls.default_client_cert_source()
+
+ # Figure out which api endpoint to use.
+ if client_options.api_endpoint is not None:
+ api_endpoint = client_options.api_endpoint
+ elif use_mtls_endpoint == "always" or (
+ use_mtls_endpoint == "auto" and client_cert_source
+ ):
+ api_endpoint = cls.DEFAULT_MTLS_ENDPOINT
+ else:
+ api_endpoint = cls.DEFAULT_ENDPOINT
+
+ return api_endpoint, client_cert_source
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Optional[Union[str, TagValuesTransport]] = None,
+ client_options: Optional[Union[client_options_lib.ClientOptions, dict]] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the tag values client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, TagValuesTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (Optional[Union[google.api_core.client_options.ClientOptions, dict]]): Custom options for the
+ client. It won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ if isinstance(client_options, dict):
+ client_options = client_options_lib.from_dict(client_options)
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ client_options = cast(client_options_lib.ClientOptions, client_options)
+
+ api_endpoint, client_cert_source_func = self.get_mtls_endpoint_and_cert_source(
+ client_options
+ )
+
+ api_key_value = getattr(client_options, "api_key", None)
+ if api_key_value and credentials:
+ raise ValueError(
+ "client_options.api_key and credentials are mutually exclusive"
+ )
+
+ # Save or instantiate the transport.
+ # Ordinarily, we provide the transport, but allowing a custom transport
+ # instance provides an extensibility point for unusual situations.
+ if isinstance(transport, TagValuesTransport):
+ # transport is a TagValuesTransport instance.
+ if credentials or client_options.credentials_file or api_key_value:
+ raise ValueError(
+ "When providing a transport instance, "
+ "provide its credentials directly."
+ )
+ if client_options.scopes:
+ raise ValueError(
+ "When providing a transport instance, provide its scopes "
+ "directly."
+ )
+ self._transport = transport
+ else:
+ import google.auth._default # type: ignore
+
+ if api_key_value and hasattr(
+ google.auth._default, "get_api_key_credentials"
+ ):
+ credentials = google.auth._default.get_api_key_credentials(
+ api_key_value
+ )
+
+ Transport = type(self).get_transport_class(transport)
+ self._transport = Transport(
+ credentials=credentials,
+ credentials_file=client_options.credentials_file,
+ host=api_endpoint,
+ scopes=client_options.scopes,
+ client_cert_source_for_mtls=client_cert_source_func,
+ quota_project_id=client_options.quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=True,
+ api_audience=client_options.api_audience,
+ )
+
+ def list_tag_values(
+ self,
+ request: Optional[Union[tag_values.ListTagValuesRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListTagValuesPager:
+ r"""Lists all TagValues for a specific TagKey.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_list_tag_values():
+ # Create a client
+ client = resourcemanager_v3.TagValuesClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.ListTagValuesRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_tag_values(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.ListTagValuesRequest, dict]):
+                The request object. The request message for listing TagValues for the
+                specified TagKey.
+            parent (str):
+                Required. Resource name for TagKey, parent of
+                the TagValues to be listed, in the format
+                ``tagKeys/123``.
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.services.tag_values.pagers.ListTagValuesPager:
+ The ListTagValues response.
+ Iterating over this object will yield
+ results and resolve additional pages
+ automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_values.ListTagValuesRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_values.ListTagValuesRequest):
+ request = tag_values.ListTagValuesRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_tag_values]
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListTagValuesPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_tag_value(
+ self,
+ request: Optional[Union[tag_values.GetTagValueRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_values.TagValue:
+ r"""Retrieves a TagValue. This method will return
+ ``PERMISSION_DENIED`` if the value does not exist or the user
+ does not have permission to view it.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_get_tag_value():
+ # Create a client
+ client = resourcemanager_v3.TagValuesClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.GetTagValueRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_tag_value(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.GetTagValueRequest, dict]):
+ The request object. The request message for getting a
+ TagValue.
+ name (str):
+ Required. Resource name for TagValue to be fetched in
+ the format ``tagValues/456``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.types.TagValue:
+ A TagValue is a child of a particular
+ TagKey. This is used to group cloud
+ resources for the purpose of controlling
+ them using policies.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_values.GetTagValueRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_values.GetTagValueRequest):
+ request = tag_values.GetTagValueRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_tag_value]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_namespaced_tag_value(
+ self,
+ request: Optional[Union[tag_values.GetNamespacedTagValueRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_values.TagValue:
+ r"""Retrieves a TagValue by its namespaced name. This method will
+ return ``PERMISSION_DENIED`` if the value does not exist or the
+ user does not have permission to view it.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_get_namespaced_tag_value():
+ # Create a client
+ client = resourcemanager_v3.TagValuesClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.GetNamespacedTagValueRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_namespaced_tag_value(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.GetNamespacedTagValueRequest, dict]):
+ The request object. The request message for getting a
+ TagValue by its namespaced name.
+ name (str):
+ Required. A namespaced tag value name in the following
+ format:
+
+ ``{parentId}/{tagKeyShort}/{tagValueShort}``
+
+ Examples:
+
+ - ``42/foo/abc`` for a value with short name "abc"
+ under the key with short name "foo" under the
+ organization with ID 42
+ - ``r2-d2/bar/xyz`` for a value with short name "xyz"
+ under the key with short name "bar" under the project
+ with ID "r2-d2"
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.resourcemanager_v3.types.TagValue:
+ A TagValue is a child of a particular
+ TagKey. This is used to group cloud
+ resources for the purpose of controlling
+ them using policies.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_values.GetNamespacedTagValueRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_values.GetNamespacedTagValueRequest):
+ request = tag_values.GetNamespacedTagValueRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_namespaced_tag_value]
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def create_tag_value(
+ self,
+ request: Optional[Union[tag_values.CreateTagValueRequest, dict]] = None,
+ *,
+ tag_value: Optional[tag_values.TagValue] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Creates a TagValue as a child of the specified
+        TagKey. If another request with the same parameters is
+        sent while the original request is in progress, the second
+        request will receive an error. A maximum of 1000
+ TagValues can exist under a TagKey at any given time.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_create_tag_value():
+ # Create a client
+ client = resourcemanager_v3.TagValuesClient()
+
+ # Initialize request argument(s)
+ tag_value = resourcemanager_v3.TagValue()
+ tag_value.short_name = "short_name_value"
+
+ request = resourcemanager_v3.CreateTagValueRequest(
+ tag_value=tag_value,
+ )
+
+ # Make the request
+ operation = client.create_tag_value(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.CreateTagValueRequest, dict]):
+ The request object. The request message for creating a
+ TagValue.
+ tag_value (google.cloud.resourcemanager_v3.types.TagValue):
+ Required. The TagValue to be created. Only fields
+ ``short_name``, ``description``, and ``parent`` are
+ considered during the creation request.
+
+ This corresponds to the ``tag_value`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.TagValue` A TagValue is a child of a particular TagKey. This is used to group
+ cloud resources for the purpose of controlling them
+ using policies.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([tag_value])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_values.CreateTagValueRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_values.CreateTagValueRequest):
+ request = tag_values.CreateTagValueRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if tag_value is not None:
+ request.tag_value = tag_value
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.create_tag_value]
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ tag_values.TagValue,
+ metadata_type=tag_values.CreateTagValueMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def update_tag_value(
+ self,
+ request: Optional[Union[tag_values.UpdateTagValueRequest, dict]] = None,
+ *,
+ tag_value: Optional[tag_values.TagValue] = None,
+ update_mask: Optional[field_mask_pb2.FieldMask] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Updates the attributes of the TagValue resource.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_update_tag_value():
+ # Create a client
+ client = resourcemanager_v3.TagValuesClient()
+
+ # Initialize request argument(s)
+ tag_value = resourcemanager_v3.TagValue()
+ tag_value.short_name = "short_name_value"
+
+ request = resourcemanager_v3.UpdateTagValueRequest(
+ tag_value=tag_value,
+ )
+
+ # Make the request
+ operation = client.update_tag_value(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.UpdateTagValueRequest, dict]):
+ The request object. The request message for updating a
+ TagValue.
+ tag_value (google.cloud.resourcemanager_v3.types.TagValue):
+                Required. The new definition of the TagValue. Only
+                the ``description`` and ``etag`` fields can be
+                updated by this request. If the ``etag`` field is
+                nonempty, it must match the ``etag`` field of the
+                existing TagValue. Otherwise, ``ABORTED`` will be
+                returned.
+
+ This corresponds to the ``tag_value`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Optional. Fields to be updated.
+ This corresponds to the ``update_mask`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.TagValue` A TagValue is a child of a particular TagKey. This is used to group
+ cloud resources for the purpose of controlling them
+ using policies.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([tag_value, update_mask])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_values.UpdateTagValueRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_values.UpdateTagValueRequest):
+ request = tag_values.UpdateTagValueRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if tag_value is not None:
+ request.tag_value = tag_value
+ if update_mask is not None:
+ request.update_mask = update_mask
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.update_tag_value]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("tag_value.name", request.tag_value.name),)
+ ),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ tag_values.TagValue,
+ metadata_type=tag_values.UpdateTagValueMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def delete_tag_value(
+ self,
+ request: Optional[Union[tag_values.DeleteTagValueRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operation.Operation:
+ r"""Deletes a TagValue. The TagValue cannot have any
+ bindings when it is deleted.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+
+ def sample_delete_tag_value():
+ # Create a client
+ client = resourcemanager_v3.TagValuesClient()
+
+ # Initialize request argument(s)
+ request = resourcemanager_v3.DeleteTagValueRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ operation = client.delete_tag_value(request=request)
+
+ print("Waiting for operation to complete...")
+
+ response = operation.result()
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.resourcemanager_v3.types.DeleteTagValueRequest, dict]):
+ The request object. The request message for deleting a
+ TagValue.
+            name (str):
+                Required. The resource name of the
+                TagValue to be deleted, in the format
+                ``tagValues/456``.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.api_core.operation.Operation:
+ An object representing a long-running operation.
+
+ The result type for the operation will be :class:`google.cloud.resourcemanager_v3.types.TagValue` A TagValue is a child of a particular TagKey. This is used to group
+ cloud resources for the purpose of controlling them
+ using policies.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a tag_values.DeleteTagValueRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, tag_values.DeleteTagValueRequest):
+ request = tag_values.DeleteTagValueRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.delete_tag_value]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Wrap the response in an operation future.
+ response = operation.from_gapic(
+ response,
+ self._transport.operations_client,
+ tag_values.TagValue,
+ metadata_type=tag_values.DeleteTagValueMetadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.GetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Gets the access control policy for a TagValue. The returned
+ policy may be empty if no such policy or resource exists. The
+ ``resource`` field should be the TagValue's resource name. For
+ example: ``tagValues/1234``. The caller must have the
+ ``cloudresourcemanager.googleapis.com/tagValues.getIamPolicy``
+ permission on the identified TagValue to get the access control
+ policy.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ def sample_get_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.TagValuesClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.GetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = client.get_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.iam.v1.iam_policy_pb2.GetIamPolicyRequest, dict]):
+ The request object. Request message for ``GetIamPolicy`` method.
+ resource (str):
+ REQUIRED: The resource for which the
+ policy is being requested. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+ **JSON example:**
+
+ {
+ "bindings": [
+ {
+ "role":
+ "roles/resourcemanager.organizationAdmin",
+ "members": [ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+
+ }, { "role":
+ "roles/resourcemanager.organizationViewer",
+ "members": [ "user:eve@example.com" ],
+ "condition": { "title": "expirable access",
+ "description": "Does not grant access after
+ Sep 2020", "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')", } }
+
+ ], "etag": "BwWWja0YfJA=", "version": 3
+
+ }
+
+ **YAML example:**
+
+ bindings: - members: - user:\ mike@example.com -
+ group:\ admins@example.com - domain:google.com -
+ serviceAccount:\ my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin -
+ members: - user:\ eve@example.com role:
+ roles/resourcemanager.organizationViewer
+ condition: title: expirable access description:
+ Does not grant access after Sep 2020 expression:
+ request.time <
+ timestamp('2020-10-01T00:00:00.000Z') etag:
+ BwWWja0YfJA= version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ if isinstance(request, dict):
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ request = iam_policy_pb2.GetIamPolicyRequest(**request)
+ elif not request:
+ # Null request, just make one.
+ request = iam_policy_pb2.GetIamPolicyRequest()
+ if resource is not None:
+ request.resource = resource
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_iam_policy]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def set_iam_policy(
+ self,
+ request: Optional[Union[iam_policy_pb2.SetIamPolicyRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Sets the access control policy on a TagValue, replacing any
+ existing policy. The ``resource`` field should be the TagValue's
+ resource name. For example: ``tagValues/1234``. The caller must
+ have ``resourcemanager.tagValues.setIamPolicy`` permission on
+ the identified tagValue.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ def sample_set_iam_policy():
+ # Create a client
+ client = resourcemanager_v3.TagValuesClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.SetIamPolicyRequest(
+ resource="resource_value",
+ )
+
+ # Make the request
+ response = client.set_iam_policy(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.iam.v1.iam_policy_pb2.SetIamPolicyRequest, dict]):
+ The request object. Request message for ``SetIamPolicy`` method.
+ resource (str):
+ REQUIRED: The resource for which the
+ policy is being specified. See the
+ operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which specifies access
+ controls for Google Cloud resources.
+
+ A Policy is a collection of bindings. A binding binds
+ one or more members, or principals, to a single role.
+ Principals can be user accounts, service accounts,
+ Google groups, and domains (such as G Suite). A role
+ is a named list of permissions; each role can be an
+ IAM predefined role or a user-created custom role.
+
+ For some types of Google Cloud resources, a binding
+ can also specify a condition, which is a logical
+ expression that allows access to a resource only if
+ the expression evaluates to true. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the [IAM
+ documentation](\ https://cloud.google.com/iam/help/conditions/resource-policies).
+
+ **JSON example:**
+
+ {
+ "bindings": [
+ {
+ "role":
+ "roles/resourcemanager.organizationAdmin",
+ "members": [ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+
+ }, { "role":
+ "roles/resourcemanager.organizationViewer",
+ "members": [ "user:eve@example.com" ],
+ "condition": { "title": "expirable access",
+ "description": "Does not grant access after
+ Sep 2020", "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')", } }
+
+ ], "etag": "BwWWja0YfJA=", "version": 3
+
+ }
+
+ **YAML example:**
+
+ bindings: - members: - user:\ mike@example.com -
+ group:\ admins@example.com - domain:google.com -
+ serviceAccount:\ my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin -
+ members: - user:\ eve@example.com role:
+ roles/resourcemanager.organizationViewer
+ condition: title: expirable access description:
+ Does not grant access after Sep 2020 expression:
+ request.time <
+ timestamp('2020-10-01T00:00:00.000Z') etag:
+ BwWWja0YfJA= version: 3
+
+ For a description of IAM and its features, see the
+ [IAM
+ documentation](\ https://cloud.google.com/iam/docs/).
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ if isinstance(request, dict):
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ request = iam_policy_pb2.SetIamPolicyRequest(**request)
+ elif not request:
+ # Null request, just make one.
+ request = iam_policy_pb2.SetIamPolicyRequest()
+ if resource is not None:
+ request.resource = resource
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.set_iam_policy]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def test_iam_permissions(
+ self,
+ request: Optional[Union[iam_policy_pb2.TestIamPermissionsRequest, dict]] = None,
+ *,
+ resource: Optional[str] = None,
+ permissions: Optional[MutableSequence[str]] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ r"""Returns permissions that a caller has on the specified TagValue.
+ The ``resource`` field should be the TagValue's resource name.
+ For example: ``tagValues/1234``.
+
+ There are no permissions required for making this API call.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import resourcemanager_v3
+ from google.iam.v1 import iam_policy_pb2 # type: ignore
+
+ def sample_test_iam_permissions():
+ # Create a client
+ client = resourcemanager_v3.TagValuesClient()
+
+ # Initialize request argument(s)
+ request = iam_policy_pb2.TestIamPermissionsRequest(
+ resource="resource_value",
+ permissions=['permissions_value1', 'permissions_value2'],
+ )
+
+ # Make the request
+ response = client.test_iam_permissions(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.iam.v1.iam_policy_pb2.TestIamPermissionsRequest, dict]):
+ The request object. Request message for ``TestIamPermissions`` method.
+ resource (str):
+ REQUIRED: The resource for which the
+ policy detail is being requested. See
+ the operation documentation for the
+ appropriate value for this field.
+
+ This corresponds to the ``resource`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ permissions (MutableSequence[str]):
+ The set of permissions to check for the ``resource``.
+ Permissions with wildcards (such as '*' or 'storage.*')
+ are not allowed. For more information see `IAM
+ Overview <https://cloud.google.com/iam/docs/overview#permissions>`__.
+
+ This corresponds to the ``permissions`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.iam.v1.iam_policy_pb2.TestIamPermissionsResponse:
+ Response message for TestIamPermissions method.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([resource, permissions])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ if isinstance(request, dict):
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ request = iam_policy_pb2.TestIamPermissionsRequest(**request)
+ elif not request:
+ # Null request, just make one.
+ request = iam_policy_pb2.TestIamPermissionsRequest()
+ if resource is not None:
+ request.resource = resource
+ if permissions:
+ request.permissions.extend(permissions)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.test_iam_permissions]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("resource", request.resource),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def __enter__(self) -> "TagValuesClient":
+ return self
+
+ def __exit__(self, type, value, traceback):
+ """Releases underlying transport's resources.
+
+ .. warning::
+ ONLY use as a context manager if the transport is NOT shared
+ with other clients! Exiting the with block will CLOSE the transport
+ and may cause errors in other clients!
+ """
+ self.transport.close()
+
+ def get_operation(
+ self,
+ request: Optional[operations_pb2.GetOperationRequest] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Gets the latest state of a long-running operation.
+
+ Args:
+ request (:class:`~.operations_pb2.GetOperationRequest`):
+ The request object. Request message for
+ `GetOperation` method.
+ retry (google.api_core.retry.Retry): Designation of what errors,
+ if any, should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ Returns:
+ ~.operations_pb2.Operation:
+ An ``Operation`` object.
+ """
+ # Create or coerce a protobuf request object.
+ # The request isn't a proto-plus wrapped type,
+ # so it must be constructed via keyword expansion.
+ if isinstance(request, dict):
+ request = operations_pb2.GetOperationRequest(**request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method.wrap_method(
+ self._transport.get_operation,
+ default_timeout=None,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("TagValuesClient",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/pagers.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/pagers.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/pagers.py
@@ -0,0 +1,155 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import (
+ Any,
+ AsyncIterator,
+ Awaitable,
+ Callable,
+ Iterator,
+ Optional,
+ Sequence,
+ Tuple,
+)
+
+from google.cloud.resourcemanager_v3.types import tag_values
+
+
+class ListTagValuesPager:
+ """A pager for iterating through ``list_tag_values`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.ListTagValuesResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``tag_values`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListTagValues`` requests and continue to iterate
+ through the ``tag_values`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.ListTagValuesResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., tag_values.ListTagValuesResponse],
+ request: tag_values.ListTagValuesRequest,
+ response: tag_values.ListTagValuesResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.ListTagValuesRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.ListTagValuesResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = tag_values.ListTagValuesRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[tag_values.ListTagValuesResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[tag_values.TagValue]:
+ for page in self.pages:
+ yield from page.tag_values
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListTagValuesAsyncPager:
+ """A pager for iterating through ``list_tag_values`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.resourcemanager_v3.types.ListTagValuesResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``tag_values`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListTagValues`` requests and continue to iterate
+ through the ``tag_values`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.resourcemanager_v3.types.ListTagValuesResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[tag_values.ListTagValuesResponse]],
+ request: tag_values.ListTagValuesRequest,
+ response: tag_values.ListTagValuesResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.resourcemanager_v3.types.ListTagValuesRequest):
+ The initial request object.
+ response (google.cloud.resourcemanager_v3.types.ListTagValuesResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = tag_values.ListTagValuesRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[tag_values.ListTagValuesResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[tag_values.TagValue]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.tag_values:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/transports/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/transports/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/transports/__init__.py
@@ -0,0 +1,36 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+from typing import Dict, Type
+
+from .base import TagValuesTransport
+from .grpc import TagValuesGrpcTransport
+from .grpc_asyncio import TagValuesGrpcAsyncIOTransport
+from .rest import TagValuesRestInterceptor, TagValuesRestTransport
+
+# Compile a registry of transports.
+_transport_registry = OrderedDict() # type: Dict[str, Type[TagValuesTransport]]
+_transport_registry["grpc"] = TagValuesGrpcTransport
+_transport_registry["grpc_asyncio"] = TagValuesGrpcAsyncIOTransport
+_transport_registry["rest"] = TagValuesRestTransport
+
+__all__ = (
+ "TagValuesTransport",
+ "TagValuesGrpcTransport",
+ "TagValuesGrpcAsyncIOTransport",
+ "TagValuesRestTransport",
+ "TagValuesRestInterceptor",
+)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/transports/base.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/transports/base.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/transports/base.py
@@ -0,0 +1,320 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import abc
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Union
+
+import google.api_core
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1, operations_v1
+from google.api_core import retry as retries
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.resourcemanager_v3 import gapic_version as package_version
+from google.cloud.resourcemanager_v3.types import tag_values
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+class TagValuesTransport(abc.ABC):
+ """Abstract transport class for TagValues."""
+
+ AUTH_SCOPES = (
+ "https://www.googleapis.com/auth/cloud-platform",
+ "https://www.googleapis.com/auth/cloud-platform.read-only",
+ )
+
+ DEFAULT_HOST: str = "cloudresourcemanager.googleapis.com"
+
+ def __init__(
+ self,
+ *,
+ host: str = DEFAULT_HOST,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ **kwargs,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+ scopes (Optional[Sequence[str]]): A list of scopes.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ """
+
+ scopes_kwargs = {"scopes": scopes, "default_scopes": self.AUTH_SCOPES}
+
+ # Save the scopes.
+ self._scopes = scopes
+
+ # If no credentials are provided, then determine the appropriate
+ # defaults.
+ if credentials and credentials_file:
+ raise core_exceptions.DuplicateCredentialArgs(
+ "'credentials_file' and 'credentials' are mutually exclusive"
+ )
+
+ if credentials_file is not None:
+ credentials, _ = google.auth.load_credentials_from_file(
+ credentials_file, **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ elif credentials is None:
+ credentials, _ = google.auth.default(
+ **scopes_kwargs, quota_project_id=quota_project_id
+ )
+        # Don't apply the audience if a credentials file was passed by the user.
+ if hasattr(credentials, "with_gdch_audience"):
+ credentials = credentials.with_gdch_audience(
+ api_audience if api_audience else host
+ )
+
+ # If the credentials are service account credentials, then always try to use self signed JWT.
+ if (
+ always_use_jwt_access
+ and isinstance(credentials, service_account.Credentials)
+ and hasattr(service_account.Credentials, "with_always_use_jwt_access")
+ ):
+ credentials = credentials.with_always_use_jwt_access(True)
+
+ # Save the credentials.
+ self._credentials = credentials
+
+ # Save the hostname. Default to port 443 (HTTPS) if none is specified.
+ if ":" not in host:
+ host += ":443"
+ self._host = host
+
+ def _prep_wrapped_messages(self, client_info):
+ # Precompute the wrapped methods.
+ self._wrapped_methods = {
+ self.list_tag_values: gapic_v1.method.wrap_method(
+ self.list_tag_values,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.get_tag_value: gapic_v1.method.wrap_method(
+ self.get_tag_value,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.get_namespaced_tag_value: gapic_v1.method.wrap_method(
+ self.get_namespaced_tag_value,
+ default_timeout=None,
+ client_info=client_info,
+ ),
+ self.create_tag_value: gapic_v1.method.wrap_method(
+ self.create_tag_value,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.update_tag_value: gapic_v1.method.wrap_method(
+ self.update_tag_value,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.delete_tag_value: gapic_v1.method.wrap_method(
+ self.delete_tag_value,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.get_iam_policy: gapic_v1.method.wrap_method(
+ self.get_iam_policy,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=60.0,
+ ),
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.set_iam_policy: gapic_v1.method.wrap_method(
+ self.set_iam_policy,
+ default_timeout=60.0,
+ client_info=client_info,
+ ),
+ self.test_iam_permissions: gapic_v1.method.wrap_method(
+ self.test_iam_permissions,
+ default_timeout=None,
+ client_info=client_info,
+ ),
+ }
+
+ def close(self):
+ """Closes resources associated with the transport.
+
+ .. warning::
+ Only call this method if the transport is NOT shared
+ with other clients - this may cause errors in other clients!
+ """
+ raise NotImplementedError()
+
+ @property
+ def operations_client(self):
+ """Return the client designed to process long-running operations."""
+ raise NotImplementedError()
+
+ @property
+ def list_tag_values(
+ self,
+ ) -> Callable[
+ [tag_values.ListTagValuesRequest],
+ Union[
+ tag_values.ListTagValuesResponse,
+ Awaitable[tag_values.ListTagValuesResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_tag_value(
+ self,
+ ) -> Callable[
+ [tag_values.GetTagValueRequest],
+ Union[tag_values.TagValue, Awaitable[tag_values.TagValue]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_namespaced_tag_value(
+ self,
+ ) -> Callable[
+ [tag_values.GetNamespacedTagValueRequest],
+ Union[tag_values.TagValue, Awaitable[tag_values.TagValue]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def create_tag_value(
+ self,
+ ) -> Callable[
+ [tag_values.CreateTagValueRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def update_tag_value(
+ self,
+ ) -> Callable[
+ [tag_values.UpdateTagValueRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def delete_tag_value(
+ self,
+ ) -> Callable[
+ [tag_values.DeleteTagValueRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.GetIamPolicyRequest],
+ Union[policy_pb2.Policy, Awaitable[policy_pb2.Policy]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.SetIamPolicyRequest],
+ Union[policy_pb2.Policy, Awaitable[policy_pb2.Policy]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ Union[
+ iam_policy_pb2.TestIamPermissionsResponse,
+ Awaitable[iam_policy_pb2.TestIamPermissionsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[
+ [operations_pb2.GetOperationRequest],
+ Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def kind(self) -> str:
+ raise NotImplementedError()
+
+
+__all__ = ("TagValuesTransport",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/transports/grpc.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/transports/grpc.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/transports/grpc.py
@@ -0,0 +1,534 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers, operations_v1
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+import grpc # type: ignore
+
+from google.cloud.resourcemanager_v3.types import tag_values
+
+from .base import DEFAULT_CLIENT_INFO, TagValuesTransport
+
+
+class TagValuesGrpcTransport(TagValuesTransport):
+ """gRPC backend transport for TagValues.
+
+ Allow users to create and manage tag values.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _stubs: Dict[str, Callable]
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[grpc.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ channel (Optional[grpc.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+ self._operations_client: Optional[operations_v1.OperationsClient] = None
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> grpc.Channel:
+ """Create and return a gRPC channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ grpc.Channel: A gRPC channel object.
+
+ Raises:
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+
+ return grpc_helpers.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ @property
+ def grpc_channel(self) -> grpc.Channel:
+ """Return the channel designed to connect to this service."""
+ return self._grpc_channel
+
+ @property
+ def operations_client(self) -> operations_v1.OperationsClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Quick check: Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ self._operations_client = operations_v1.OperationsClient(self.grpc_channel)
+
+ # Return the client from cache.
+ return self._operations_client
+
+ @property
+ def list_tag_values(
+ self,
+ ) -> Callable[[tag_values.ListTagValuesRequest], tag_values.ListTagValuesResponse]:
+ r"""Return a callable for the list tag values method over gRPC.
+
+ Lists all TagValues for a specific TagKey.
+
+ Returns:
+ Callable[[~.ListTagValuesRequest],
+ ~.ListTagValuesResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_tag_values" not in self._stubs:
+ self._stubs["list_tag_values"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/ListTagValues",
+ request_serializer=tag_values.ListTagValuesRequest.serialize,
+ response_deserializer=tag_values.ListTagValuesResponse.deserialize,
+ )
+ return self._stubs["list_tag_values"]
+
+ @property
+ def get_tag_value(
+ self,
+ ) -> Callable[[tag_values.GetTagValueRequest], tag_values.TagValue]:
+ r"""Return a callable for the get tag value method over gRPC.
+
+ Retrieves a TagValue. This method will return
+ ``PERMISSION_DENIED`` if the value does not exist or the user
+ does not have permission to view it.
+
+ Returns:
+ Callable[[~.GetTagValueRequest],
+ ~.TagValue]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_tag_value" not in self._stubs:
+ self._stubs["get_tag_value"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/GetTagValue",
+ request_serializer=tag_values.GetTagValueRequest.serialize,
+ response_deserializer=tag_values.TagValue.deserialize,
+ )
+ return self._stubs["get_tag_value"]
+
+ @property
+ def get_namespaced_tag_value(
+ self,
+ ) -> Callable[[tag_values.GetNamespacedTagValueRequest], tag_values.TagValue]:
+ r"""Return a callable for the get namespaced tag value method over gRPC.
+
+ Retrieves a TagValue by its namespaced name. This method will
+ return ``PERMISSION_DENIED`` if the value does not exist or the
+ user does not have permission to view it.
+
+ Returns:
+ Callable[[~.GetNamespacedTagValueRequest],
+ ~.TagValue]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_namespaced_tag_value" not in self._stubs:
+ self._stubs["get_namespaced_tag_value"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/GetNamespacedTagValue",
+ request_serializer=tag_values.GetNamespacedTagValueRequest.serialize,
+ response_deserializer=tag_values.TagValue.deserialize,
+ )
+ return self._stubs["get_namespaced_tag_value"]
+
+ @property
+ def create_tag_value(
+ self,
+ ) -> Callable[[tag_values.CreateTagValueRequest], operations_pb2.Operation]:
+ r"""Return a callable for the create tag value method over gRPC.
+
+ Creates a TagValue as a child of the specified
+        TagKey. If another request with the same parameters is
+        sent while the original request is in process, the second
+        request will receive an error. A maximum of 1000
+ TagValues can exist under a TagKey at any given time.
+
+ Returns:
+ Callable[[~.CreateTagValueRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_tag_value" not in self._stubs:
+ self._stubs["create_tag_value"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/CreateTagValue",
+ request_serializer=tag_values.CreateTagValueRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["create_tag_value"]
+
+ @property
+ def update_tag_value(
+ self,
+ ) -> Callable[[tag_values.UpdateTagValueRequest], operations_pb2.Operation]:
+ r"""Return a callable for the update tag value method over gRPC.
+
+ Updates the attributes of the TagValue resource.
+
+ Returns:
+ Callable[[~.UpdateTagValueRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_tag_value" not in self._stubs:
+ self._stubs["update_tag_value"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/UpdateTagValue",
+ request_serializer=tag_values.UpdateTagValueRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["update_tag_value"]
+
+ @property
+ def delete_tag_value(
+ self,
+ ) -> Callable[[tag_values.DeleteTagValueRequest], operations_pb2.Operation]:
+ r"""Return a callable for the delete tag value method over gRPC.
+
+ Deletes a TagValue. The TagValue cannot have any
+ bindings when it is deleted.
+
+ Returns:
+ Callable[[~.DeleteTagValueRequest],
+ ~.Operation]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_tag_value" not in self._stubs:
+ self._stubs["delete_tag_value"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/DeleteTagValue",
+ request_serializer=tag_values.DeleteTagValueRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["delete_tag_value"]
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.GetIamPolicyRequest], policy_pb2.Policy]:
+ r"""Return a callable for the get iam policy method over gRPC.
+
+ Gets the access control policy for a TagValue. The returned
+ policy may be empty if no such policy or resource exists. The
+ ``resource`` field should be the TagValue's resource name. For
+ example: ``tagValues/1234``. The caller must have the
+ ``cloudresourcemanager.googleapis.com/tagValues.getIamPolicy``
+ permission on the identified TagValue to get the access control
+ policy.
+
+ Returns:
+ Callable[[~.GetIamPolicyRequest],
+ ~.Policy]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_iam_policy" not in self._stubs:
+ self._stubs["get_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/GetIamPolicy",
+ request_serializer=iam_policy_pb2.GetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["get_iam_policy"]
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.SetIamPolicyRequest], policy_pb2.Policy]:
+ r"""Return a callable for the set iam policy method over gRPC.
+
+ Sets the access control policy on a TagValue, replacing any
+ existing policy. The ``resource`` field should be the TagValue's
+ resource name. For example: ``tagValues/1234``. The caller must
+ have ``resourcemanager.tagValues.setIamPolicy`` permission on
+ the identified tagValue.
+
+ Returns:
+ Callable[[~.SetIamPolicyRequest],
+ ~.Policy]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "set_iam_policy" not in self._stubs:
+ self._stubs["set_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/SetIamPolicy",
+ request_serializer=iam_policy_pb2.SetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["set_iam_policy"]
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ iam_policy_pb2.TestIamPermissionsResponse,
+ ]:
+ r"""Return a callable for the test iam permissions method over gRPC.
+
+ Returns permissions that a caller has on the specified TagValue.
+ The ``resource`` field should be the TagValue's resource name.
+ For example: ``tagValues/1234``.
+
+ There are no permissions required for making this API call.
+
+ Returns:
+ Callable[[~.TestIamPermissionsRequest],
+ ~.TestIamPermissionsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "test_iam_permissions" not in self._stubs:
+ self._stubs["test_iam_permissions"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/TestIamPermissions",
+ request_serializer=iam_policy_pb2.TestIamPermissionsRequest.SerializeToString,
+ response_deserializer=iam_policy_pb2.TestIamPermissionsResponse.FromString,
+ )
+ return self._stubs["test_iam_permissions"]
+
+ def close(self):
+ self.grpc_channel.close()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]:
+ r"""Return a callable for the get_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_operation" not in self._stubs:
+ self._stubs["get_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/GetOperation",
+ request_serializer=operations_pb2.GetOperationRequest.SerializeToString,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["get_operation"]
+
+ @property
+ def kind(self) -> str:
+ return "grpc"
+
+
+__all__ = ("TagValuesGrpcTransport",)
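The RPC properties in the transport above all follow the same lazy stub-caching pattern: the first access creates the gRPC stub via `grpc_channel.unary_unary` and stores it in `self._stubs`; subsequent accesses return the cached callable. A minimal runnable sketch of that pattern, with a hypothetical `stub_factory` standing in for the channel's `unary_unary` method:

```python
from typing import Callable, Dict


class LazyStubTransport:
    """Sketch of the per-RPC lazy stub cache used by the generated transports."""

    def __init__(self, stub_factory: Callable[[str], Callable]):
        # Stands in for grpc_channel.unary_unary in the real transport.
        self._stub_factory = stub_factory
        self._stubs: Dict[str, Callable] = {}

    def _get_stub(self, name: str, path: str) -> Callable:
        # Create the stub on first access only; reuse it afterwards.
        if name not in self._stubs:
            self._stubs[name] = self._stub_factory(path)
        return self._stubs[name]

    @property
    def delete_tag_value(self) -> Callable:
        return self._get_stub(
            "delete_tag_value",
            "/google.cloud.resourcemanager.v3.TagValues/DeleteTagValue",
        )


# Usage: the factory runs exactly once per RPC name.
calls = []
transport = LazyStubTransport(
    lambda path: calls.append(path) or (lambda request: path)
)
stub_a = transport.delete_tag_value
stub_b = transport.delete_tag_value
assert stub_a is stub_b
assert len(calls) == 1
```

This is why the generated properties are cheap to access repeatedly: the cost of stub creation is paid once per method name for the lifetime of the transport.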
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/transports/grpc_asyncio.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/transports/grpc_asyncio.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/transports/grpc_asyncio.py
@@ -0,0 +1,545 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers_async, operations_v1
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+import grpc # type: ignore
+from grpc.experimental import aio # type: ignore
+
+from google.cloud.resourcemanager_v3.types import tag_values
+
+from .base import DEFAULT_CLIENT_INFO, TagValuesTransport
+from .grpc import TagValuesGrpcTransport
+
+
+class TagValuesGrpcAsyncIOTransport(TagValuesTransport):
+ """gRPC AsyncIO backend transport for TagValues.
+
+ Allow users to create and manage tag values.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _grpc_channel: aio.Channel
+ _stubs: Dict[str, Callable] = {}
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> aio.Channel:
+ """Create and return a gRPC AsyncIO channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ aio.Channel: A gRPC AsyncIO channel object.
+ """
+
+ return grpc_helpers_async.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[aio.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ channel (Optional[aio.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+ self._operations_client: Optional[operations_v1.OperationsAsyncClient] = None
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def grpc_channel(self) -> aio.Channel:
+ """Create the channel designed to connect to this service.
+
+ This property caches on the instance; repeated calls return
+ the same channel.
+ """
+ # Return the channel from cache.
+ return self._grpc_channel
+
+ @property
+ def operations_client(self) -> operations_v1.OperationsAsyncClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Quick check: Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ self._operations_client = operations_v1.OperationsAsyncClient(
+ self.grpc_channel
+ )
+
+ # Return the client from cache.
+ return self._operations_client
+
+ @property
+ def list_tag_values(
+ self,
+ ) -> Callable[
+ [tag_values.ListTagValuesRequest], Awaitable[tag_values.ListTagValuesResponse]
+ ]:
+ r"""Return a callable for the list tag values method over gRPC.
+
+ Lists all TagValues for a specific TagKey.
+
+ Returns:
+ Callable[[~.ListTagValuesRequest],
+ Awaitable[~.ListTagValuesResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_tag_values" not in self._stubs:
+ self._stubs["list_tag_values"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/ListTagValues",
+ request_serializer=tag_values.ListTagValuesRequest.serialize,
+ response_deserializer=tag_values.ListTagValuesResponse.deserialize,
+ )
+ return self._stubs["list_tag_values"]
+
+ @property
+ def get_tag_value(
+ self,
+ ) -> Callable[[tag_values.GetTagValueRequest], Awaitable[tag_values.TagValue]]:
+ r"""Return a callable for the get tag value method over gRPC.
+
+ Retrieves a TagValue. This method will return
+ ``PERMISSION_DENIED`` if the value does not exist or the user
+ does not have permission to view it.
+
+ Returns:
+ Callable[[~.GetTagValueRequest],
+ Awaitable[~.TagValue]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_tag_value" not in self._stubs:
+ self._stubs["get_tag_value"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/GetTagValue",
+ request_serializer=tag_values.GetTagValueRequest.serialize,
+ response_deserializer=tag_values.TagValue.deserialize,
+ )
+ return self._stubs["get_tag_value"]
+
+ @property
+ def get_namespaced_tag_value(
+ self,
+ ) -> Callable[
+ [tag_values.GetNamespacedTagValueRequest], Awaitable[tag_values.TagValue]
+ ]:
+ r"""Return a callable for the get namespaced tag value method over gRPC.
+
+ Retrieves a TagValue by its namespaced name. This method will
+ return ``PERMISSION_DENIED`` if the value does not exist or the
+ user does not have permission to view it.
+
+ Returns:
+ Callable[[~.GetNamespacedTagValueRequest],
+ Awaitable[~.TagValue]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_namespaced_tag_value" not in self._stubs:
+ self._stubs["get_namespaced_tag_value"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/GetNamespacedTagValue",
+ request_serializer=tag_values.GetNamespacedTagValueRequest.serialize,
+ response_deserializer=tag_values.TagValue.deserialize,
+ )
+ return self._stubs["get_namespaced_tag_value"]
+
+ @property
+ def create_tag_value(
+ self,
+ ) -> Callable[
+ [tag_values.CreateTagValueRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the create tag value method over gRPC.
+
+        Creates a TagValue as a child of the specified
+        TagKey. If another request with the same parameters is
+        sent while the original request is in process, the
+        second request will receive an error. A maximum of
+        1000 TagValues can exist under a TagKey at any given time.
+
+ Returns:
+ Callable[[~.CreateTagValueRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_tag_value" not in self._stubs:
+ self._stubs["create_tag_value"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/CreateTagValue",
+ request_serializer=tag_values.CreateTagValueRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["create_tag_value"]
+
+ @property
+ def update_tag_value(
+ self,
+ ) -> Callable[
+ [tag_values.UpdateTagValueRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the update tag value method over gRPC.
+
+ Updates the attributes of the TagValue resource.
+
+ Returns:
+ Callable[[~.UpdateTagValueRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_tag_value" not in self._stubs:
+ self._stubs["update_tag_value"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/UpdateTagValue",
+ request_serializer=tag_values.UpdateTagValueRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["update_tag_value"]
+
+ @property
+ def delete_tag_value(
+ self,
+ ) -> Callable[
+ [tag_values.DeleteTagValueRequest], Awaitable[operations_pb2.Operation]
+ ]:
+ r"""Return a callable for the delete tag value method over gRPC.
+
+ Deletes a TagValue. The TagValue cannot have any
+ bindings when it is deleted.
+
+ Returns:
+ Callable[[~.DeleteTagValueRequest],
+ Awaitable[~.Operation]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_tag_value" not in self._stubs:
+ self._stubs["delete_tag_value"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/DeleteTagValue",
+ request_serializer=tag_values.DeleteTagValueRequest.serialize,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["delete_tag_value"]
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.GetIamPolicyRequest], Awaitable[policy_pb2.Policy]]:
+ r"""Return a callable for the get iam policy method over gRPC.
+
+ Gets the access control policy for a TagValue. The returned
+ policy may be empty if no such policy or resource exists. The
+ ``resource`` field should be the TagValue's resource name. For
+ example: ``tagValues/1234``. The caller must have the
+ ``cloudresourcemanager.googleapis.com/tagValues.getIamPolicy``
+ permission on the identified TagValue to get the access control
+ policy.
+
+ Returns:
+ Callable[[~.GetIamPolicyRequest],
+ Awaitable[~.Policy]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_iam_policy" not in self._stubs:
+ self._stubs["get_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/GetIamPolicy",
+ request_serializer=iam_policy_pb2.GetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["get_iam_policy"]
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.SetIamPolicyRequest], Awaitable[policy_pb2.Policy]]:
+ r"""Return a callable for the set iam policy method over gRPC.
+
+ Sets the access control policy on a TagValue, replacing any
+ existing policy. The ``resource`` field should be the TagValue's
+ resource name. For example: ``tagValues/1234``. The caller must
+ have ``resourcemanager.tagValues.setIamPolicy`` permission on
+ the identified tagValue.
+
+ Returns:
+ Callable[[~.SetIamPolicyRequest],
+ Awaitable[~.Policy]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "set_iam_policy" not in self._stubs:
+ self._stubs["set_iam_policy"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/SetIamPolicy",
+ request_serializer=iam_policy_pb2.SetIamPolicyRequest.SerializeToString,
+ response_deserializer=policy_pb2.Policy.FromString,
+ )
+ return self._stubs["set_iam_policy"]
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ Awaitable[iam_policy_pb2.TestIamPermissionsResponse],
+ ]:
+ r"""Return a callable for the test iam permissions method over gRPC.
+
+ Returns permissions that a caller has on the specified TagValue.
+ The ``resource`` field should be the TagValue's resource name.
+ For example: ``tagValues/1234``.
+
+ There are no permissions required for making this API call.
+
+ Returns:
+ Callable[[~.TestIamPermissionsRequest],
+ Awaitable[~.TestIamPermissionsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "test_iam_permissions" not in self._stubs:
+ self._stubs["test_iam_permissions"] = self.grpc_channel.unary_unary(
+ "/google.cloud.resourcemanager.v3.TagValues/TestIamPermissions",
+ request_serializer=iam_policy_pb2.TestIamPermissionsRequest.SerializeToString,
+ response_deserializer=iam_policy_pb2.TestIamPermissionsResponse.FromString,
+ )
+ return self._stubs["test_iam_permissions"]
+
+ def close(self):
+ return self.grpc_channel.close()
+
+ @property
+ def get_operation(
+ self,
+ ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]:
+ r"""Return a callable for the get_operation method over gRPC."""
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_operation" not in self._stubs:
+ self._stubs["get_operation"] = self.grpc_channel.unary_unary(
+ "/google.longrunning.Operations/GetOperation",
+ request_serializer=operations_pb2.GetOperationRequest.SerializeToString,
+ response_deserializer=operations_pb2.Operation.FromString,
+ )
+ return self._stubs["get_operation"]
+
+
+__all__ = ("TagValuesGrpcAsyncIOTransport",)
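The AsyncIO transport mirrors the sync one, but every stub returns an awaitable, and `close()` itself returns a coroutine (note the `return self.grpc_channel.close()` above, versus the sync transport's plain call), so callers must await it. A runnable stand-in using only the standard library; `AsyncStubTransportSketch` is a hypothetical name, not part of the generated API:

```python
import asyncio
from typing import Awaitable, Callable


class AsyncStubTransportSketch:
    """Illustrative stand-in for the AsyncIO transport: stubs return awaitables."""

    def __init__(self) -> None:
        self.closed = False

    @property
    def get_tag_value(self) -> Callable[[str], Awaitable[str]]:
        async def _call(request: str) -> str:
            # A real stub would serialize `request` and await the gRPC channel.
            return f"TagValue for {request}"

        return _call

    async def close(self) -> None:
        # Like the generated async transport's close(), this must be awaited.
        self.closed = True


async def main() -> str:
    transport = AsyncStubTransportSketch()
    response = await transport.get_tag_value("tagValues/1234")
    await transport.close()
    assert transport.closed
    return response


result = asyncio.run(main())
```

The property-returns-callable shape is identical to the sync transport; only the return types change from plain responses to `Awaitable[...]`.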
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/transports/rest.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/transports/rest.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/services/tag_values/transports/rest.py
@@ -0,0 +1,1663 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import dataclasses
+import json # type: ignore
+import re
+from typing import Any, Callable, Dict, List, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import (
+ gapic_v1,
+ operations_v1,
+ path_template,
+ rest_helpers,
+ rest_streaming,
+)
+from google.api_core import exceptions as core_exceptions
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.auth.transport.requests import AuthorizedSession # type: ignore
+from google.longrunning import operations_pb2
+from google.protobuf import json_format
+import grpc # type: ignore
+from requests import __version__ as requests_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+
+from google.iam.v1 import iam_policy_pb2 # type: ignore
+from google.iam.v1 import policy_pb2 # type: ignore
+from google.longrunning import operations_pb2 # type: ignore
+
+from google.cloud.resourcemanager_v3.types import tag_values
+
+from .base import DEFAULT_CLIENT_INFO as BASE_DEFAULT_CLIENT_INFO
+from .base import TagValuesTransport
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=BASE_DEFAULT_CLIENT_INFO.gapic_version,
+ grpc_version=None,
+ rest_version=requests_version,
+)
+
+
+class TagValuesRestInterceptor:
+ """Interceptor for TagValues.
+
+ Interceptors are used to manipulate requests, request metadata, and responses
+ in arbitrary ways.
+ Example use cases include:
+ * Logging
+ * Verifying requests according to service or custom semantics
+ * Stripping extraneous information from responses
+
+ These use cases and more can be enabled by injecting an
+ instance of a custom subclass when constructing the TagValuesRestTransport.
+
+ .. code-block:: python
+ class MyCustomTagValuesInterceptor(TagValuesRestInterceptor):
+ def pre_create_tag_value(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_create_tag_value(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_delete_tag_value(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_delete_tag_value(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_iam_policy(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_iam_policy(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_namespaced_tag_value(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_namespaced_tag_value(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_tag_value(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_tag_value(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_tag_values(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_tag_values(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_set_iam_policy(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_set_iam_policy(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_test_iam_permissions(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_test_iam_permissions(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_update_tag_value(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_update_tag_value(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ transport = TagValuesRestTransport(interceptor=MyCustomTagValuesInterceptor())
+ client = TagValuesClient(transport=transport)
+
+
+ """
+
+ def pre_create_tag_value(
+ self,
+ request: tag_values.CreateTagValueRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[tag_values.CreateTagValueRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for create_tag_value
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagValues server.
+ """
+ return request, metadata
+
+ def post_create_tag_value(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for create_tag_value
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagValues server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_delete_tag_value(
+ self,
+ request: tag_values.DeleteTagValueRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[tag_values.DeleteTagValueRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for delete_tag_value
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagValues server.
+ """
+ return request, metadata
+
+ def post_delete_tag_value(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for delete_tag_value
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagValues server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_iam_policy(
+ self,
+ request: iam_policy_pb2.GetIamPolicyRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[iam_policy_pb2.GetIamPolicyRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_iam_policy
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagValues server.
+ """
+ return request, metadata
+
+ def post_get_iam_policy(self, response: policy_pb2.Policy) -> policy_pb2.Policy:
+ """Post-rpc interceptor for get_iam_policy
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagValues server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_namespaced_tag_value(
+ self,
+ request: tag_values.GetNamespacedTagValueRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[tag_values.GetNamespacedTagValueRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_namespaced_tag_value
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagValues server.
+ """
+ return request, metadata
+
+ def post_get_namespaced_tag_value(
+ self, response: tag_values.TagValue
+ ) -> tag_values.TagValue:
+ """Post-rpc interceptor for get_namespaced_tag_value
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagValues server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_tag_value(
+ self,
+ request: tag_values.GetTagValueRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[tag_values.GetTagValueRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_tag_value
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagValues server.
+ """
+ return request, metadata
+
+ def post_get_tag_value(self, response: tag_values.TagValue) -> tag_values.TagValue:
+ """Post-rpc interceptor for get_tag_value
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagValues server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_tag_values(
+ self,
+ request: tag_values.ListTagValuesRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[tag_values.ListTagValuesRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_tag_values
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagValues server.
+ """
+ return request, metadata
+
+ def post_list_tag_values(
+ self, response: tag_values.ListTagValuesResponse
+ ) -> tag_values.ListTagValuesResponse:
+ """Post-rpc interceptor for list_tag_values
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagValues server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_set_iam_policy(
+ self,
+ request: iam_policy_pb2.SetIamPolicyRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[iam_policy_pb2.SetIamPolicyRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for set_iam_policy
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagValues server.
+ """
+ return request, metadata
+
+ def post_set_iam_policy(self, response: policy_pb2.Policy) -> policy_pb2.Policy:
+ """Post-rpc interceptor for set_iam_policy
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagValues server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_test_iam_permissions(
+ self,
+ request: iam_policy_pb2.TestIamPermissionsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[iam_policy_pb2.TestIamPermissionsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for test_iam_permissions
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagValues server.
+ """
+ return request, metadata
+
+ def post_test_iam_permissions(
+ self, response: iam_policy_pb2.TestIamPermissionsResponse
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ """Post-rpc interceptor for test_iam_permissions
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagValues server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_update_tag_value(
+ self,
+ request: tag_values.UpdateTagValueRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[tag_values.UpdateTagValueRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for update_tag_value
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagValues server.
+ """
+ return request, metadata
+
+ def post_update_tag_value(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for update_tag_value
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagValues server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_operation(
+ self,
+ request: operations_pb2.GetOperationRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[operations_pb2.GetOperationRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_operation
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the TagValues server.
+ """
+ return request, metadata
+
+ def post_get_operation(
+ self, response: operations_pb2.Operation
+ ) -> operations_pb2.Operation:
+ """Post-rpc interceptor for get_operation
+
+ Override in a subclass to manipulate the response
+ after it is returned by the TagValues server but before
+ it is returned to user code.
+ """
+ return response
+
+
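The pre/post hooks above follow a simple interceptor pattern: a `pre_*` hook may rewrite the request and metadata before the RPC, and a `post_*` hook may rewrite the response before it reaches user code. The following is a minimal standalone sketch of that contract using stand-in types; `FakeRequest`, `Interceptor`, and `call_rpc` are illustrative names, not part of the real library.

```python
from typing import List, Sequence, Tuple


class FakeRequest:
    # Stand-in for a request message (assumption: not a real library type).
    def __init__(self, parent: str):
        self.parent = parent


class Interceptor:
    """Default hooks: pass request, metadata, and response through unchanged."""

    def pre_call(self, request: FakeRequest, metadata: Sequence[Tuple[str, str]]):
        return request, metadata

    def post_call(self, response: List[str]) -> List[str]:
        return response


class LoggingInterceptor(Interceptor):
    """Example override: attach a metadata pair and tag the response."""

    def pre_call(self, request, metadata):
        # Append routing metadata before the request is sent.
        return request, list(metadata) + [("x-trace", "on")]

    def post_call(self, response):
        # Annotate the response before it is returned to user code.
        return response + ["intercepted"]


def call_rpc(interceptor: Interceptor, request: FakeRequest) -> List[str]:
    # Mirrors the transport's flow: pre hook, send, post hook.
    request, metadata = interceptor.pre_call(request, ())
    # Pretend the server echoes the parent plus the metadata keys.
    response = [request.parent] + [k for k, _ in metadata]
    return interceptor.post_call(response)
```

With `LoggingInterceptor`, `call_rpc(LoggingInterceptor(), FakeRequest("tagKeys/123"))` yields `["tagKeys/123", "x-trace", "intercepted"]`; the default `Interceptor` leaves everything untouched.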
+@dataclasses.dataclass
+class TagValuesRestStub:
+ _session: AuthorizedSession
+ _host: str
+ _interceptor: TagValuesRestInterceptor
+
+
+class TagValuesRestTransport(TagValuesTransport):
+ """REST backend transport for TagValues.
+
+ Allow users to create and manage tag values.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends JSON representations of protocol buffers over HTTP/1.1
+
+ """
+
+ def __init__(
+ self,
+ *,
+ host: str = "cloudresourcemanager.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ url_scheme: str = "https",
+ interceptor: Optional[TagValuesRestInterceptor] = None,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+                scopes (Optional[Sequence[str]]): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Callable[[], Tuple[bytes, bytes]]): Client
+ certificate to configure mutual TLS HTTP channel. It is ignored
+ if ``channel`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you are developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+            url_scheme: the protocol scheme for the API endpoint. Normally
+                "https", but for testing or local servers,
+                "http" can be specified.
+            interceptor (Optional[TagValuesRestInterceptor]): Hooks invoked
+                before each request is sent and after each response is
+                received; defaults to a pass-through interceptor.
+            api_audience (Optional[str]): The intended audience for the
+                credentials, e.g. when using self-signed JWTs.
+ """
+ # Run the base constructor
+ # TODO(yon-mg): resolve other ctor params i.e. scopes, quota, etc.
+ # TODO: When custom host (api_endpoint) is set, `scopes` must *also* be set on the
+ # credentials object
+ maybe_url_match = re.match("^(?P<scheme>http(?:s)?://)?(?P<host>.*)$", host)
+ if maybe_url_match is None:
+ raise ValueError(
+ f"Unexpected hostname structure: {host}"
+ ) # pragma: NO COVER
+
+ url_match_items = maybe_url_match.groupdict()
+
+ host = f"{url_scheme}://{host}" if not url_match_items["scheme"] else host
+
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+ self._session = AuthorizedSession(
+ self._credentials, default_host=self.DEFAULT_HOST
+ )
+ self._operations_client: Optional[operations_v1.AbstractOperationsClient] = None
+ if client_cert_source_for_mtls:
+ self._session.configure_mtls_channel(client_cert_source_for_mtls)
+ self._interceptor = interceptor or TagValuesRestInterceptor()
+ self._prep_wrapped_messages(client_info)
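The constructor's hostname handling can be sketched in isolation: the regex splits an optional `http://`/`https://` prefix from the host, and `url_scheme` is prepended only when no scheme was supplied. `normalize_host` below is a hypothetical helper name for that logic.

```python
import re


def normalize_host(host: str, url_scheme: str = "https") -> str:
    """Prefix ``url_scheme`` only when ``host`` carries no scheme already.

    Standalone sketch of the hostname normalization in the constructor.
    """
    maybe_url_match = re.match("^(?P<scheme>http(?:s)?://)?(?P<host>.*)$", host)
    if maybe_url_match is None:
        # Unreachable in practice: the scheme group is optional and
        # ``.*`` matches any remainder, so the pattern always matches.
        raise ValueError(f"Unexpected hostname structure: {host}")
    items = maybe_url_match.groupdict()
    return host if items["scheme"] else f"{url_scheme}://{host}"
```

So a bare host gains the configured scheme, while an explicit `http://` host (useful for local test servers) is left intact.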
+
+ @property
+ def operations_client(self) -> operations_v1.AbstractOperationsClient:
+ """Create the client designed to process long-running operations.
+
+ This property caches on the instance; repeated calls return the same
+ client.
+ """
+ # Only create a new client if we do not already have one.
+ if self._operations_client is None:
+ http_options: Dict[str, List[Dict[str, str]]] = {
+ "google.longrunning.Operations.GetOperation": [
+ {
+ "method": "get",
+ "uri": "/v3/{name=operations/**}",
+ },
+ ],
+ }
+
+ rest_transport = operations_v1.OperationsRestTransport(
+ host=self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ scopes=self._scopes,
+ http_options=http_options,
+ path_prefix="v3",
+ )
+
+ self._operations_client = operations_v1.AbstractOperationsClient(
+ transport=rest_transport
+ )
+
+ # Return the client from cache.
+ return self._operations_client
+
+ class _CreateTagValue(TagValuesRestStub):
+ def __hash__(self):
+ return hash("CreateTagValue")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
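Each stub pairs a table of required-field defaults with `_get_unset_required_fields`, and later merges the result into the query params, presumably so required fields are always present explicitly in the query string. A standalone sketch of that step (the field names `parent` and `page_size` here are illustrative, not taken from this RPC):

```python
from typing import Any, Dict

# Illustrative defaults table; real stubs generate one per RPC.
REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {"parent": "", "page_size": 0}


def get_unset_required_fields(message_dict: Dict[str, Any]) -> Dict[str, Any]:
    # Keep only the required fields the caller did not set.
    return {
        k: v
        for k, v in REQUIRED_FIELDS_DEFAULT_VALUES.items()
        if k not in message_dict
    }


query_params = {"parent": "tagKeys/123"}
# Missing required fields are re-added with their proto default values.
query_params.update(get_unset_required_fields(query_params))
```

After the update, `query_params` contains the caller's `parent` plus the defaulted `page_size`.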
+
+ def __call__(
+ self,
+ request: tag_values.CreateTagValueRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the create tag value method over HTTP.
+
+ Args:
+ request (~.tag_values.CreateTagValueRequest):
+ The request object. The request message for creating a
+ TagValue.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/tagValues",
+ "body": "tag_value",
+ },
+ ]
+ request, metadata = self._interceptor.pre_create_tag_value(
+ request, metadata
+ )
+ pb_request = tag_values.CreateTagValueRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_create_tag_value(resp)
+ return resp
+
+ class _DeleteTagValue(TagValuesRestStub):
+ def __hash__(self):
+ return hash("DeleteTagValue")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_values.DeleteTagValueRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the delete tag value method over HTTP.
+
+ Args:
+ request (~.tag_values.DeleteTagValueRequest):
+ The request object. The request message for deleting a
+ TagValue.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "delete",
+ "uri": "/v3/{name=tagValues/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_delete_tag_value(
+ request, metadata
+ )
+ pb_request = tag_values.DeleteTagValueRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_delete_tag_value(resp)
+ return resp
+
+ class _GetIamPolicy(TagValuesRestStub):
+ def __hash__(self):
+ return hash("GetIamPolicy")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: iam_policy_pb2.GetIamPolicyRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Call the get iam policy method over HTTP.
+
+ Args:
+ request (~.iam_policy_pb2.GetIamPolicyRequest):
+ The request object. Request message for ``GetIamPolicy`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which
+ specifies access controls for Google Cloud resources.
+
+ A ``Policy`` is a collection of ``bindings``. A
+ ``binding`` binds one or more ``members``, or
+ principals, to a single ``role``. Principals can be user
+ accounts, service accounts, Google groups, and domains
+ (such as G Suite). A ``role`` is a named list of
+ permissions; each ``role`` can be an IAM predefined role
+ or a user-created custom role.
+
+ For some types of Google Cloud resources, a ``binding``
+ can also specify a ``condition``, which is a logical
+ expression that allows access to a resource only if the
+ expression evaluates to ``true``. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the `IAM
+ documentation <https://cloud.google.com/iam/help/conditions/resource-policies>`__.
+
+ **JSON example:**
+
+ ::
+
+ {
+ "bindings": [
+ {
+ "role": "roles/resourcemanager.organizationAdmin",
+ "members": [
+ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+ },
+ {
+ "role": "roles/resourcemanager.organizationViewer",
+ "members": [
+ "user:eve@example.com"
+ ],
+ "condition": {
+ "title": "expirable access",
+ "description": "Does not grant access after Sep 2020",
+ "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')",
+ }
+ }
+ ],
+ "etag": "BwWWja0YfJA=",
+ "version": 3
+ }
+
+ **YAML example:**
+
+ ::
+
+ bindings:
+ - members:
+ - user:mike@example.com
+ - group:admins@example.com
+ - domain:google.com
+ - serviceAccount:my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin
+ - members:
+ - user:eve@example.com
+ role: roles/resourcemanager.organizationViewer
+ condition:
+ title: expirable access
+ description: Does not grant access after Sep 2020
+ expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+ etag: BwWWja0YfJA=
+ version: 3
+
+ For a description of IAM and its features, see the `IAM
+ documentation <https://cloud.google.com/iam/docs/>`__.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{resource=tagValues/*}:getIamPolicy",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_iam_policy(request, metadata)
+ pb_request = request
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = policy_pb2.Policy()
+ pb_resp = resp
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_iam_policy(resp)
+ return resp
+
+ class _GetNamespacedTagValue(TagValuesRestStub):
+ def __hash__(self):
+ return hash("GetNamespacedTagValue")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "name": "",
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_values.GetNamespacedTagValueRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_values.TagValue:
+ r"""Call the get namespaced tag value method over HTTP.
+
+ Args:
+ request (~.tag_values.GetNamespacedTagValueRequest):
+ The request object. The request message for getting a
+ TagValue by its namespaced name.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.tag_values.TagValue:
+ A TagValue is a child of a particular
+ TagKey. This is used to group cloud
+ resources for the purpose of controlling
+ them using policies.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/tagValues/namespaced",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_namespaced_tag_value(
+ request, metadata
+ )
+ pb_request = tag_values.GetNamespacedTagValueRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = tag_values.TagValue()
+ pb_resp = tag_values.TagValue.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_namespaced_tag_value(resp)
+ return resp
+
+ class _GetTagValue(TagValuesRestStub):
+ def __hash__(self):
+ return hash("GetTagValue")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_values.GetTagValueRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_values.TagValue:
+ r"""Call the get tag value method over HTTP.
+
+ Args:
+ request (~.tag_values.GetTagValueRequest):
+ The request object. The request message for getting a
+ TagValue.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.tag_values.TagValue:
+ A TagValue is a child of a particular
+ TagKey. This is used to group cloud
+ resources for the purpose of controlling
+ them using policies.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/{name=tagValues/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_tag_value(request, metadata)
+ pb_request = tag_values.GetTagValueRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = tag_values.TagValue()
+ pb_resp = tag_values.TagValue.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_tag_value(resp)
+ return resp
+
+ class _ListTagValues(TagValuesRestStub):
+ def __hash__(self):
+ return hash("ListTagValues")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "parent": "",
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_values.ListTagValuesRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> tag_values.ListTagValuesResponse:
+ r"""Call the list tag values method over HTTP.
+
+ Args:
+ request (~.tag_values.ListTagValuesRequest):
+                    The request object. The request message for listing TagValues for the
+                    specified TagKey. Its ``parent`` field holds the
+                    resource name of the TagKey whose values are to be
+                    listed, in the format ``tagKeys/123``.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.tag_values.ListTagValuesResponse:
+ The ListTagValues response.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/tagValues",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_tag_values(request, metadata)
+ pb_request = tag_values.ListTagValuesRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = tag_values.ListTagValuesResponse()
+ pb_resp = tag_values.ListTagValuesResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_tag_values(resp)
+ return resp
+
+ class _SetIamPolicy(TagValuesRestStub):
+ def __hash__(self):
+ return hash("SetIamPolicy")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: iam_policy_pb2.SetIamPolicyRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> policy_pb2.Policy:
+ r"""Call the set iam policy method over HTTP.
+
+ Args:
+ request (~.iam_policy_pb2.SetIamPolicyRequest):
+ The request object. Request message for ``SetIamPolicy`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.policy_pb2.Policy:
+ An Identity and Access Management (IAM) policy, which
+ specifies access controls for Google Cloud resources.
+
+ A ``Policy`` is a collection of ``bindings``. A
+ ``binding`` binds one or more ``members``, or
+ principals, to a single ``role``. Principals can be user
+ accounts, service accounts, Google groups, and domains
+ (such as G Suite). A ``role`` is a named list of
+ permissions; each ``role`` can be an IAM predefined role
+ or a user-created custom role.
+
+ For some types of Google Cloud resources, a ``binding``
+ can also specify a ``condition``, which is a logical
+ expression that allows access to a resource only if the
+ expression evaluates to ``true``. A condition can add
+ constraints based on attributes of the request, the
+ resource, or both. To learn which resources support
+ conditions in their IAM policies, see the `IAM
+ documentation <https://cloud.google.com/iam/help/conditions/resource-policies>`__.
+
+ **JSON example:**
+
+ ::
+
+ {
+ "bindings": [
+ {
+ "role": "roles/resourcemanager.organizationAdmin",
+ "members": [
+ "user:mike@example.com",
+ "group:admins@example.com",
+ "domain:google.com",
+ "serviceAccount:my-project-id@appspot.gserviceaccount.com"
+ ]
+ },
+ {
+ "role": "roles/resourcemanager.organizationViewer",
+ "members": [
+ "user:eve@example.com"
+ ],
+ "condition": {
+ "title": "expirable access",
+ "description": "Does not grant access after Sep 2020",
+ "expression": "request.time <
+ timestamp('2020-10-01T00:00:00.000Z')",
+ }
+ }
+ ],
+ "etag": "BwWWja0YfJA=",
+ "version": 3
+ }
+
+ **YAML example:**
+
+ ::
+
+ bindings:
+ - members:
+ - user:mike@example.com
+ - group:admins@example.com
+ - domain:google.com
+ - serviceAccount:my-project-id@appspot.gserviceaccount.com
+ role: roles/resourcemanager.organizationAdmin
+ - members:
+ - user:eve@example.com
+ role: roles/resourcemanager.organizationViewer
+ condition:
+ title: expirable access
+ description: Does not grant access after Sep 2020
+ expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
+ etag: BwWWja0YfJA=
+ version: 3
+
+ For a description of IAM and its features, see the `IAM
+ documentation <https://cloud.google.com/iam/docs/>`__.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{resource=tagValues/*}:setIamPolicy",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_set_iam_policy(request, metadata)
+ pb_request = request
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = policy_pb2.Policy()
+ pb_resp = resp
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_set_iam_policy(resp)
+ return resp
+
+ class _TestIamPermissions(TagValuesRestStub):
+ def __hash__(self):
+ return hash("TestIamPermissions")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: iam_policy_pb2.TestIamPermissionsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> iam_policy_pb2.TestIamPermissionsResponse:
+ r"""Call the test iam permissions method over HTTP.
+
+ Args:
+ request (~.iam_policy_pb2.TestIamPermissionsRequest):
+ The request object. Request message for ``TestIamPermissions`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.iam_policy_pb2.TestIamPermissionsResponse:
+ Response message for ``TestIamPermissions`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v3/{resource=tagValues/*}:testIamPermissions",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_test_iam_permissions(
+ request, metadata
+ )
+ pb_request = request
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = iam_policy_pb2.TestIamPermissionsResponse()
+ pb_resp = resp
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_test_iam_permissions(resp)
+ return resp
+
+ class _UpdateTagValue(TagValuesRestStub):
+ def __hash__(self):
+ return hash("UpdateTagValue")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: tag_values.UpdateTagValueRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+ r"""Call the update tag value method over HTTP.
+
+ Args:
+ request (~.tag_values.UpdateTagValueRequest):
+ The request object. The request message for updating a
+ TagValue.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.operations_pb2.Operation:
+ This resource represents a
+ long-running operation that is the
+ result of a network API call.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "patch",
+ "uri": "/v3/{tag_value.name=tagValues/*}",
+ "body": "tag_value",
+ },
+ ]
+ request, metadata = self._interceptor.pre_update_tag_value(
+ request, metadata
+ )
+ pb_request = tag_values.UpdateTagValueRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = operations_pb2.Operation()
+ json_format.Parse(response.content, resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_update_tag_value(resp)
+ return resp
+
+ @property
+ def create_tag_value(
+ self,
+ ) -> Callable[[tag_values.CreateTagValueRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._CreateTagValue(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def delete_tag_value(
+ self,
+ ) -> Callable[[tag_values.DeleteTagValueRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._DeleteTagValue(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.GetIamPolicyRequest], policy_pb2.Policy]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetIamPolicy(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_namespaced_tag_value(
+ self,
+ ) -> Callable[[tag_values.GetNamespacedTagValueRequest], tag_values.TagValue]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetNamespacedTagValue(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_tag_value(
+ self,
+ ) -> Callable[[tag_values.GetTagValueRequest], tag_values.TagValue]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetTagValue(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_tag_values(
+ self,
+ ) -> Callable[[tag_values.ListTagValuesRequest], tag_values.ListTagValuesResponse]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListTagValues(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def set_iam_policy(
+ self,
+ ) -> Callable[[iam_policy_pb2.SetIamPolicyRequest], policy_pb2.Policy]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._SetIamPolicy(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def test_iam_permissions(
+ self,
+ ) -> Callable[
+ [iam_policy_pb2.TestIamPermissionsRequest],
+ iam_policy_pb2.TestIamPermissionsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._TestIamPermissions(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def update_tag_value(
+ self,
+ ) -> Callable[[tag_values.UpdateTagValueRequest], operations_pb2.Operation]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._UpdateTagValue(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_operation(self):
+ return self._GetOperation(self._session, self._host, self._interceptor) # type: ignore
+
+ class _GetOperation(TagValuesRestStub):
+ def __call__(
+ self,
+ request: operations_pb2.GetOperationRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> operations_pb2.Operation:
+
+ r"""Call the get operation method over HTTP.
+
+ Args:
+ request (operations_pb2.GetOperationRequest):
+ The request object for GetOperation method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ operations_pb2.Operation: Response from GetOperation method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v3/{name=operations/**}",
+ },
+ ]
+
+ request, metadata = self._interceptor.pre_get_operation(request, metadata)
+ request_kwargs = json_format.MessageToDict(request)
+ transcoded_request = path_template.transcode(http_options, **request_kwargs)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(json.dumps(transcoded_request["query_params"]))
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ resp = operations_pb2.Operation()
+ resp = json_format.Parse(response.content.decode("utf-8"), resp)
+ resp = self._interceptor.post_get_operation(resp)
+ return resp
+
+ @property
+ def kind(self) -> str:
+ return "rest"
+
+ def close(self):
+ self._session.close()
+
+
+__all__ = ("TagValuesRestTransport",)
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/__init__.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/__init__.py
@@ -0,0 +1,188 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .folders import (
+ CreateFolderMetadata,
+ CreateFolderRequest,
+ DeleteFolderMetadata,
+ DeleteFolderRequest,
+ Folder,
+ GetFolderRequest,
+ ListFoldersRequest,
+ ListFoldersResponse,
+ MoveFolderMetadata,
+ MoveFolderRequest,
+ SearchFoldersRequest,
+ SearchFoldersResponse,
+ UndeleteFolderMetadata,
+ UndeleteFolderRequest,
+ UpdateFolderMetadata,
+ UpdateFolderRequest,
+)
+from .organizations import (
+ DeleteOrganizationMetadata,
+ GetOrganizationRequest,
+ Organization,
+ SearchOrganizationsRequest,
+ SearchOrganizationsResponse,
+ UndeleteOrganizationMetadata,
+)
+from .projects import (
+ CreateProjectMetadata,
+ CreateProjectRequest,
+ DeleteProjectMetadata,
+ DeleteProjectRequest,
+ GetProjectRequest,
+ ListProjectsRequest,
+ ListProjectsResponse,
+ MoveProjectMetadata,
+ MoveProjectRequest,
+ Project,
+ SearchProjectsRequest,
+ SearchProjectsResponse,
+ UndeleteProjectMetadata,
+ UndeleteProjectRequest,
+ UpdateProjectMetadata,
+ UpdateProjectRequest,
+)
+from .tag_bindings import (
+ CreateTagBindingMetadata,
+ CreateTagBindingRequest,
+ DeleteTagBindingMetadata,
+ DeleteTagBindingRequest,
+ EffectiveTag,
+ ListEffectiveTagsRequest,
+ ListEffectiveTagsResponse,
+ ListTagBindingsRequest,
+ ListTagBindingsResponse,
+ TagBinding,
+)
+from .tag_holds import (
+ CreateTagHoldMetadata,
+ CreateTagHoldRequest,
+ DeleteTagHoldMetadata,
+ DeleteTagHoldRequest,
+ ListTagHoldsRequest,
+ ListTagHoldsResponse,
+ TagHold,
+)
+from .tag_keys import (
+ CreateTagKeyMetadata,
+ CreateTagKeyRequest,
+ DeleteTagKeyMetadata,
+ DeleteTagKeyRequest,
+ GetNamespacedTagKeyRequest,
+ GetTagKeyRequest,
+ ListTagKeysRequest,
+ ListTagKeysResponse,
+ Purpose,
+ TagKey,
+ UpdateTagKeyMetadata,
+ UpdateTagKeyRequest,
+)
+from .tag_values import (
+ CreateTagValueMetadata,
+ CreateTagValueRequest,
+ DeleteTagValueMetadata,
+ DeleteTagValueRequest,
+ GetNamespacedTagValueRequest,
+ GetTagValueRequest,
+ ListTagValuesRequest,
+ ListTagValuesResponse,
+ TagValue,
+ UpdateTagValueMetadata,
+ UpdateTagValueRequest,
+)
+
+__all__ = (
+ "CreateFolderMetadata",
+ "CreateFolderRequest",
+ "DeleteFolderMetadata",
+ "DeleteFolderRequest",
+ "Folder",
+ "GetFolderRequest",
+ "ListFoldersRequest",
+ "ListFoldersResponse",
+ "MoveFolderMetadata",
+ "MoveFolderRequest",
+ "SearchFoldersRequest",
+ "SearchFoldersResponse",
+ "UndeleteFolderMetadata",
+ "UndeleteFolderRequest",
+ "UpdateFolderMetadata",
+ "UpdateFolderRequest",
+ "DeleteOrganizationMetadata",
+ "GetOrganizationRequest",
+ "Organization",
+ "SearchOrganizationsRequest",
+ "SearchOrganizationsResponse",
+ "UndeleteOrganizationMetadata",
+ "CreateProjectMetadata",
+ "CreateProjectRequest",
+ "DeleteProjectMetadata",
+ "DeleteProjectRequest",
+ "GetProjectRequest",
+ "ListProjectsRequest",
+ "ListProjectsResponse",
+ "MoveProjectMetadata",
+ "MoveProjectRequest",
+ "Project",
+ "SearchProjectsRequest",
+ "SearchProjectsResponse",
+ "UndeleteProjectMetadata",
+ "UndeleteProjectRequest",
+ "UpdateProjectMetadata",
+ "UpdateProjectRequest",
+ "CreateTagBindingMetadata",
+ "CreateTagBindingRequest",
+ "DeleteTagBindingMetadata",
+ "DeleteTagBindingRequest",
+ "EffectiveTag",
+ "ListEffectiveTagsRequest",
+ "ListEffectiveTagsResponse",
+ "ListTagBindingsRequest",
+ "ListTagBindingsResponse",
+ "TagBinding",
+ "CreateTagHoldMetadata",
+ "CreateTagHoldRequest",
+ "DeleteTagHoldMetadata",
+ "DeleteTagHoldRequest",
+ "ListTagHoldsRequest",
+ "ListTagHoldsResponse",
+ "TagHold",
+ "CreateTagKeyMetadata",
+ "CreateTagKeyRequest",
+ "DeleteTagKeyMetadata",
+ "DeleteTagKeyRequest",
+ "GetNamespacedTagKeyRequest",
+ "GetTagKeyRequest",
+ "ListTagKeysRequest",
+ "ListTagKeysResponse",
+ "TagKey",
+ "UpdateTagKeyMetadata",
+ "UpdateTagKeyRequest",
+ "Purpose",
+ "CreateTagValueMetadata",
+ "CreateTagValueRequest",
+ "DeleteTagValueMetadata",
+ "DeleteTagValueRequest",
+ "GetNamespacedTagValueRequest",
+ "GetTagValueRequest",
+ "ListTagValuesRequest",
+ "ListTagValuesResponse",
+ "TagValue",
+ "UpdateTagValueMetadata",
+ "UpdateTagValueRequest",
+)
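Among the types re-exported above, `Folder` documents a display-name rule via the regular expression ``[\p{L}\p{N}]([\p{L}\p{N}_- ]{0,28}[\p{L}\p{N}])?`` (start and end with a letter or digit, at most 30 characters). As an illustrative client-side pre-check only — the stdlib `re` module lacks ``\p{...}`` Unicode property classes, so this sketch approximates them with ASCII ``[A-Za-z0-9]``; the server remains authoritative:

```python
import re

# ASCII approximation of the documented rule
# [\p{L}\p{N}]([\p{L}\p{N}_- ]{0,28}[\p{L}\p{N}])?
# (use the third-party `regex` package for full \p{L}/\p{N} support).
_DISPLAY_NAME = re.compile(r"[A-Za-z0-9]([A-Za-z0-9_\- ]{0,28}[A-Za-z0-9])?")


def is_valid_display_name(name: str) -> bool:
    """Return True if `name` passes the (ASCII-approximated) folder rule."""
    return _DISPLAY_NAME.fullmatch(name) is not None
```

Such a check can reject obviously invalid names (leading spaces, trailing hyphens, over 30 characters) before issuing a `CreateFolder` or `UpdateFolder` call.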
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/folders.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/folders.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/folders.py
@@ -0,0 +1,501 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.resourcemanager.v3",
+ manifest={
+ "Folder",
+ "GetFolderRequest",
+ "ListFoldersRequest",
+ "ListFoldersResponse",
+ "SearchFoldersRequest",
+ "SearchFoldersResponse",
+ "CreateFolderRequest",
+ "CreateFolderMetadata",
+ "UpdateFolderRequest",
+ "UpdateFolderMetadata",
+ "MoveFolderRequest",
+ "MoveFolderMetadata",
+ "DeleteFolderRequest",
+ "DeleteFolderMetadata",
+ "UndeleteFolderRequest",
+ "UndeleteFolderMetadata",
+ },
+)
+
+
+class Folder(proto.Message):
+ r"""A folder in an organization's resource hierarchy, used to
+ organize that organization's resources.
+
+ Attributes:
+ name (str):
+ Output only. The resource name of the folder. Its format is
+ ``folders/{folder_id}``, for example: "folders/1234".
+ parent (str):
+ Required. The folder's parent's resource name. Updates to
+ the folder's parent must be performed using
+ [MoveFolder][google.cloud.resourcemanager.v3.Folders.MoveFolder].
+ display_name (str):
+ The folder's display name. A folder's display name must be
+ unique amongst its siblings. For example, no two folders
+ with the same parent can share the same display name. The
+ display name must start and end with a letter or digit, may
+ contain letters, digits, spaces, hyphens and underscores and
+ can be no longer than 30 characters. This is captured by the
+ regular expression:
+ ``[\p{L}\p{N}]([\p{L}\p{N}_- ]{0,28}[\p{L}\p{N}])?``.
+ state (google.cloud.resourcemanager_v3.types.Folder.State):
+ Output only. The lifecycle state of the folder. Updates to
+ the state must be performed using
+ [DeleteFolder][google.cloud.resourcemanager.v3.Folders.DeleteFolder]
+ and
+ [UndeleteFolder][google.cloud.resourcemanager.v3.Folders.UndeleteFolder].
+ create_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. Timestamp when the folder was
+ created.
+ update_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. Timestamp when the folder was
+ last modified.
+ delete_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. Timestamp when the folder was
+ requested to be deleted.
+ etag (str):
+ Output only. A checksum computed by the
+ server based on the current value of the folder
+ resource. This may be sent on update and delete
+ requests to ensure the client has an up-to-date
+ value before proceeding.
+ """
+
+ class State(proto.Enum):
+ r"""Folder lifecycle states.
+
+ Values:
+ STATE_UNSPECIFIED (0):
+ Unspecified state.
+ ACTIVE (1):
+ The normal and active state.
+ DELETE_REQUESTED (2):
+ The folder has been marked for deletion by
+ the user.
+ """
+ STATE_UNSPECIFIED = 0
+ ACTIVE = 1
+ DELETE_REQUESTED = 2
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ parent: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ display_name: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ state: State = proto.Field(
+ proto.ENUM,
+ number=4,
+ enum=State,
+ )
+ create_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=5,
+ message=timestamp_pb2.Timestamp,
+ )
+ update_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=6,
+ message=timestamp_pb2.Timestamp,
+ )
+ delete_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=7,
+ message=timestamp_pb2.Timestamp,
+ )
+ etag: str = proto.Field(
+ proto.STRING,
+ number=8,
+ )
+
+
+class GetFolderRequest(proto.Message):
+ r"""The GetFolder request message.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the folder to retrieve. Must
+ be of the form ``folders/{folder_id}``.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListFoldersRequest(proto.Message):
+ r"""The ListFolders request message.
+
+ Attributes:
+ parent (str):
+ Required. The name of the parent resource whose folders are
+ being listed. Only children of this parent resource are
+ listed; descendants are not listed.
+
+ If the parent is a folder, use the value
+ ``folders/{folder_id}``. If the parent is an organization,
+ use the value ``organizations/{org_id}``.
+
+ Access to this method is controlled by checking the
+ ``resourcemanager.folders.list`` permission on the
+ ``parent``.
+ page_size (int):
+ Optional. The maximum number of folders to
+ return in the response. The server can return
+ fewer folders than requested. If unspecified,
+ server picks an appropriate default.
+ page_token (str):
+ Optional. A pagination token returned from a previous call
+ to ``ListFolders`` that indicates where this listing should
+ continue from.
+ show_deleted (bool):
+ Optional. Controls whether folders in the
+ [DELETE_REQUESTED][google.cloud.resourcemanager.v3.Folder.State.DELETE_REQUESTED]
+ state should be returned. Defaults to false.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=2,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ show_deleted: bool = proto.Field(
+ proto.BOOL,
+ number=4,
+ )
+
+
+class ListFoldersResponse(proto.Message):
+ r"""The ListFolders response message.
+
+ Attributes:
+ folders (MutableSequence[google.cloud.resourcemanager_v3.types.Folder]):
+ A possibly paginated list of folders that are
+ direct descendants of the specified parent
+ resource.
+ next_page_token (str):
+ A pagination token returned from a previous call to
+ ``ListFolders`` that indicates from where listing should
+ continue.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ folders: MutableSequence["Folder"] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message="Folder",
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class SearchFoldersRequest(proto.Message):
+ r"""The request message for searching folders.
+
+ Attributes:
+ page_size (int):
+ Optional. The maximum number of folders to
+ return in the response. The server can return
+ fewer folders than requested. If unspecified,
+ server picks an appropriate default.
+ page_token (str):
+ Optional. A pagination token returned from a previous call
+ to ``SearchFolders`` that indicates from where search should
+ continue.
+ query (str):
+ Optional. Search criteria used to select the folders to
+ return. If no search criteria is specified then all
+ accessible folders will be returned.
+
+ Query expressions can be used to restrict results based upon
+ displayName, state and parent, where the operators ``=``
+ (``:``) ``NOT``, ``AND`` and ``OR`` can be used along with
+ the suffix wildcard symbol ``*``.
+
+ The ``displayName`` field in a query expression should use
+ escaped quotes for values that include whitespace to prevent
+ unexpected behavior.
+
+ ::
+
+ | Field | Description |
+ |-------------------------|----------------------------------------|
+ | displayName | Filters by displayName. |
+ | parent | Filters by parent (for example: folders/123). |
+ | state, lifecycleState | Filters by state. |
+
+ Some example queries are:
+
+ - Query ``displayName=Test*`` returns Folder resources
+ whose display name starts with "Test".
+ - Query ``state=ACTIVE`` returns Folder resources with
+ ``state`` set to ``ACTIVE``.
+ - Query ``parent=folders/123`` returns Folder resources
+ that have ``folders/123`` as a parent resource.
+ - Query ``parent=folders/123 AND state=ACTIVE`` returns
+ active Folder resources that have ``folders/123`` as a
+ parent resource.
+ - Query ``displayName=\\"Test String\\"`` returns Folder
+ resources with display names that include both "Test" and
+ "String".
+ """
+
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=1,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ query: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+
+class SearchFoldersResponse(proto.Message):
+ r"""The response message for searching folders.
+
+ Attributes:
+ folders (MutableSequence[google.cloud.resourcemanager_v3.types.Folder]):
+            A possibly paginated list of folder search
+            results.
+ next_page_token (str):
+ A pagination token returned from a previous call to
+ ``SearchFolders`` that indicates from where searching should
+ continue.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ folders: MutableSequence["Folder"] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message="Folder",
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class CreateFolderRequest(proto.Message):
+ r"""The CreateFolder request message.
+
+ Attributes:
+ folder (google.cloud.resourcemanager_v3.types.Folder):
+ Required. The folder being created, only the
+ display name and parent will be consulted. All
+ other fields will be ignored.
+ """
+
+ folder: "Folder" = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message="Folder",
+ )
+
+
+class CreateFolderMetadata(proto.Message):
+ r"""Metadata pertaining to the Folder creation process.
+
+ Attributes:
+ display_name (str):
+ The display name of the folder.
+ parent (str):
+ The resource name of the folder or
+ organization we are creating the folder under.
+ """
+
+ display_name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ parent: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class UpdateFolderRequest(proto.Message):
+ r"""The request sent to the
+    [UpdateFolder][google.cloud.resourcemanager.v3.Folders.UpdateFolder]
+ method.
+
+ Only the ``display_name`` field can be changed. All other fields
+ will be ignored. Use the
+ [MoveFolder][google.cloud.resourcemanager.v3.Folders.MoveFolder]
+ method to change the ``parent`` field.
+
+ Attributes:
+ folder (google.cloud.resourcemanager_v3.types.Folder):
+ Required. The new definition of the Folder. It must include
+ the ``name`` field, which cannot be changed.
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Required. Fields to be updated. Only the ``display_name``
+ can be updated.
+ """
+
+ folder: "Folder" = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message="Folder",
+ )
+ update_mask: field_mask_pb2.FieldMask = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=field_mask_pb2.FieldMask,
+ )
+
+
+class UpdateFolderMetadata(proto.Message):
+ r"""A status object which is used as the ``metadata`` field for the
+ Operation returned by UpdateFolder.
+
+ """
+
+
+class MoveFolderRequest(proto.Message):
+ r"""The MoveFolder request message.
+
+ Attributes:
+ name (str):
+                Required. The resource name of the Folder to move. Must be
+                of the form ``folders/{folder_id}``.
+ destination_parent (str):
+ Required. The resource name of the folder or organization
+ which should be the folder's new parent. Must be of the form
+ ``folders/{folder_id}`` or ``organizations/{org_id}``.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ destination_parent: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class MoveFolderMetadata(proto.Message):
+ r"""Metadata pertaining to the folder move process.
+
+ Attributes:
+ display_name (str):
+ The display name of the folder.
+ source_parent (str):
+ The resource name of the folder's parent.
+ destination_parent (str):
+ The resource name of the folder or
+ organization to move the folder to.
+ """
+
+ display_name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ source_parent: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ destination_parent: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+
+class DeleteFolderRequest(proto.Message):
+ r"""The DeleteFolder request message.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the folder to be deleted.
+ Must be of the form ``folders/{folder_id}``.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class DeleteFolderMetadata(proto.Message):
+ r"""A status object which is used as the ``metadata`` field for the
+ ``Operation`` returned by ``DeleteFolder``.
+
+ """
+
+
+class UndeleteFolderRequest(proto.Message):
+ r"""The UndeleteFolder request message.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the folder to undelete. Must
+ be of the form ``folders/{folder_id}``.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class UndeleteFolderMetadata(proto.Message):
+ r"""A status object which is used as the ``metadata`` field for the
+ ``Operation`` returned by ``UndeleteFolder``.
+
+ """
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
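The `page_token` / `next_page_token` pair on `ListFoldersRequest` and `ListFoldersResponse` follows the standard list-pagination protocol: feed each response's `next_page_token` back as the next request's `page_token` until it comes back empty. A hedged sketch of that loop against a stubbed list call (the stub, its fake data, and the integer tokens are illustrative only — real tokens are opaque server strings):

```python
from typing import Iterator, List, Tuple

PAGE_SIZE = 2
FAKE_FOLDERS: List[str] = ["folders/1", "folders/2", "folders/3", "folders/4", "folders/5"]


def fake_list_folders(page_token: str) -> Tuple[List[str], str]:
    """Stubbed ListFolders: returns one page and the next token ('' = done)."""
    start = int(page_token) if page_token else 0
    page = FAKE_FOLDERS[start : start + PAGE_SIZE]
    next_start = start + PAGE_SIZE
    next_token = str(next_start) if next_start < len(FAKE_FOLDERS) else ""
    return page, next_token


def iter_all_folders() -> Iterator[str]:
    """Drain every page by feeding next_page_token back as page_token."""
    token = ""
    while True:
        page, token = fake_list_folders(token)
        yield from page
        if not token:  # empty next_page_token signals the listing is complete
            break
```

In practice the generated client hides this loop behind a `ListFoldersPager`, but the token handshake it performs is the one sketched here.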
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/organizations.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/organizations.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/organizations.py
@@ -0,0 +1,255 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import timestamp_pb2 # type: ignore
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.resourcemanager.v3",
+ manifest={
+ "Organization",
+ "GetOrganizationRequest",
+ "SearchOrganizationsRequest",
+ "SearchOrganizationsResponse",
+ "DeleteOrganizationMetadata",
+ "UndeleteOrganizationMetadata",
+ },
+)
+
+
+class Organization(proto.Message):
+ r"""The root node in the resource hierarchy to which a particular
+ entity's (a company, for example) resources belong.
+
+
+ .. _oneof: https://proto-plus-python.readthedocs.io/en/stable/fields.html#oneofs-mutually-exclusive-fields
+
+ Attributes:
+ name (str):
+ Output only. The resource name of the organization. This is
+ the organization's relative path in the API. Its format is
+ "organizations/[organization_id]". For example,
+ "organizations/1234".
+ display_name (str):
+ Output only. A human-readable string that
+ refers to the organization in the Google Cloud
+ Console. This string is set by the server and
+ cannot be changed. The string will be set to the
+ primary domain (for example, "google.com") of
+ the Google Workspace customer that owns the
+ organization.
+ directory_customer_id (str):
+ Immutable. The G Suite / Workspace customer
+ id used in the Directory API.
+
+ This field is a member of `oneof`_ ``owner``.
+ state (google.cloud.resourcemanager_v3.types.Organization.State):
+ Output only. The organization's current
+ lifecycle state.
+ create_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. Timestamp when the Organization
+ was created.
+ update_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. Timestamp when the Organization
+ was last modified.
+ delete_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. Timestamp when the Organization
+ was requested for deletion.
+ etag (str):
+ Output only. A checksum computed by the
+ server based on the current value of the
+ Organization resource. This may be sent on
+ update and delete requests to ensure the client
+ has an up-to-date value before proceeding.
+ """
+
+ class State(proto.Enum):
+ r"""Organization lifecycle states.
+
+ Values:
+ STATE_UNSPECIFIED (0):
+ Unspecified state. This is only useful for
+ distinguishing unset values.
+ ACTIVE (1):
+ The normal and active state.
+ DELETE_REQUESTED (2):
+ The organization has been marked for deletion
+ by the user.
+ """
+ STATE_UNSPECIFIED = 0
+ ACTIVE = 1
+ DELETE_REQUESTED = 2
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ display_name: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ directory_customer_id: str = proto.Field(
+ proto.STRING,
+ number=3,
+ oneof="owner",
+ )
+ state: State = proto.Field(
+ proto.ENUM,
+ number=4,
+ enum=State,
+ )
+ create_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=5,
+ message=timestamp_pb2.Timestamp,
+ )
+ update_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=6,
+ message=timestamp_pb2.Timestamp,
+ )
+ delete_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=7,
+ message=timestamp_pb2.Timestamp,
+ )
+ etag: str = proto.Field(
+ proto.STRING,
+ number=8,
+ )
+
+
+class GetOrganizationRequest(proto.Message):
+ r"""The request sent to the ``GetOrganization`` method. The ``name``
+ field is required. ``organization_id`` is no longer accepted.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the Organization to fetch.
+ This is the organization's relative path in the API,
+ formatted as "organizations/[organizationId]". For example,
+ "organizations/1234".
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class SearchOrganizationsRequest(proto.Message):
+ r"""The request sent to the ``SearchOrganizations`` method.
+
+ Attributes:
+ page_size (int):
+ Optional. The maximum number of organizations
+ to return in the response. The server can return
+ fewer organizations than requested. If
+ unspecified, server picks an appropriate
+ default.
+ page_token (str):
+ Optional. A pagination token returned from a previous call
+ to ``SearchOrganizations`` that indicates from where listing
+ should continue.
+ query (str):
+ Optional. An optional query string used to filter the
+ Organizations to return in the response. Query rules are
+ case-insensitive.
+
+ ::
+
+              | Field                                          | Description                       |
+              |------------------------------------------------|-----------------------------------|
+              | directoryCustomerId, owner.directoryCustomerId | Filters by directory customer id. |
+              | domain                                         | Filters by domain.                |
+
+ Organizations may be queried by ``directoryCustomerId`` or
+ by ``domain``, where the domain is a G Suite domain, for
+ example:
+
+ - Query ``directorycustomerid:123456789`` returns
+ Organization resources with
+ ``owner.directory_customer_id`` equal to ``123456789``.
+ - Query ``domain:google.com`` returns Organization
+ resources corresponding to the domain ``google.com``.
+ """
+
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=1,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ query: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+
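The query grammar above can be exercised without the client. A minimal, hypothetical sketch (these helpers are not part of the generated library) that composes the two documented `SearchOrganizations` query forms:

```python
# Hypothetical helpers for composing SearchOrganizations query strings;
# the two query forms come from the docstring above.

def org_query_by_customer_id(customer_id: str) -> str:
    """Match organizations whose owner.directory_customer_id equals customer_id."""
    return f"directorycustomerid:{customer_id}"


def org_query_by_domain(domain: str) -> str:
    """Match organizations owning the given Workspace domain."""
    return f"domain:{domain}"


print(org_query_by_customer_id("123456789"))  # directorycustomerid:123456789
print(org_query_by_domain("google.com"))      # domain:google.com
```

The resulting string would be passed as the `query` field of a `SearchOrganizationsRequest`.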
+class SearchOrganizationsResponse(proto.Message):
+ r"""The response returned from the ``SearchOrganizations`` method.
+
+ Attributes:
+ organizations (MutableSequence[google.cloud.resourcemanager_v3.types.Organization]):
+ The list of Organizations that matched the
+ search query, possibly paginated.
+ next_page_token (str):
+ A pagination token to be used to retrieve the
+ next page of results. If the result is too large
+ to fit within the page size specified in the
+ request, this field will be set with a token
+ that can be used to fetch the next page of
+ results. If this field is empty, it indicates
+ that this response contains the last page of
+ results.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ organizations: MutableSequence["Organization"] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message="Organization",
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class DeleteOrganizationMetadata(proto.Message):
+ r"""A status object which is used as the ``metadata`` field for the
+ operation returned by DeleteOrganization.
+
+ """
+
+
+class UndeleteOrganizationMetadata(proto.Message):
+ r"""A status object which is used as the ``metadata`` field for the
+ Operation returned by UndeleteOrganization.
+
+ """
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/projects.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/projects.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/projects.py
@@ -0,0 +1,584 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.resourcemanager.v3",
+ manifest={
+ "Project",
+ "GetProjectRequest",
+ "ListProjectsRequest",
+ "ListProjectsResponse",
+ "SearchProjectsRequest",
+ "SearchProjectsResponse",
+ "CreateProjectRequest",
+ "CreateProjectMetadata",
+ "UpdateProjectRequest",
+ "UpdateProjectMetadata",
+ "MoveProjectRequest",
+ "MoveProjectMetadata",
+ "DeleteProjectRequest",
+ "DeleteProjectMetadata",
+ "UndeleteProjectRequest",
+ "UndeleteProjectMetadata",
+ },
+)
+
+
+class Project(proto.Message):
+ r"""A project is a high-level Google Cloud entity. It is a
+ container for ACLs, APIs, App Engine Apps, VMs, and other Google
+ Cloud Platform resources.
+
+ Attributes:
+ name (str):
+ Output only. The unique resource name of the project. It is
+ an int64 generated number prefixed by "projects/".
+
+ Example: ``projects/415104041262``
+ parent (str):
+ Optional. A reference to a parent Resource. eg.,
+ ``organizations/123`` or ``folders/876``.
+ project_id (str):
+ Immutable. The unique, user-assigned id of the project. It
+ must be 6 to 30 lowercase ASCII letters, digits, or hyphens.
+ It must start with a letter. Trailing hyphens are
+ prohibited.
+
+ Example: ``tokyo-rain-123``
+ state (google.cloud.resourcemanager_v3.types.Project.State):
+ Output only. The project lifecycle state.
+ display_name (str):
+ Optional. A user-assigned display name of the project. When
+ present it must be between 4 to 30 characters. Allowed
+ characters are: lowercase and uppercase letters, numbers,
+ hyphen, single-quote, double-quote, space, and exclamation
+ point.
+
+ Example: ``My Project``
+ create_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. Creation time.
+ update_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The most recent time this
+ resource was modified.
+ delete_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The time at which this resource
+ was requested for deletion.
+ etag (str):
+ Output only. A checksum computed by the
+ server based on the current value of the Project
+ resource. This may be sent on update and delete
+ requests to ensure the client has an up-to-date
+ value before proceeding.
+ labels (MutableMapping[str, str]):
+ Optional. The labels associated with this project.
+
+ Label keys must be between 1 and 63 characters long and must
+ conform to the following regular expression:
+ [a-z]([-a-z0-9]*[a-z0-9])?.
+
+ Label values must be between 0 and 63 characters long and
+ must conform to the regular expression
+ ([a-z]([-a-z0-9]*[a-z0-9])?)?.
+
+ No more than 64 labels can be associated with a given
+ resource.
+
+ Clients should store labels in a representation such as JSON
+ that does not depend on specific characters being
+ disallowed.
+
+ Example: ``"myBusinessDimension" : "businessValue"``
+ """
+
+ class State(proto.Enum):
+ r"""Project lifecycle states.
+
+ Values:
+ STATE_UNSPECIFIED (0):
+            Unspecified state. This is only useful
+ for distinguishing unset values.
+ ACTIVE (1):
+ The normal and active state.
+ DELETE_REQUESTED (2):
+ The project has been marked for deletion by the user (by
+ invoking
+ [DeleteProject][google.cloud.resourcemanager.v3.Projects.DeleteProject])
+ or by the system (Google Cloud Platform). This can generally
+ be reversed by invoking [UndeleteProject]
+ [google.cloud.resourcemanager.v3.Projects.UndeleteProject].
+ """
+ STATE_UNSPECIFIED = 0
+ ACTIVE = 1
+ DELETE_REQUESTED = 2
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ parent: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ project_id: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ state: State = proto.Field(
+ proto.ENUM,
+ number=4,
+ enum=State,
+ )
+ display_name: str = proto.Field(
+ proto.STRING,
+ number=5,
+ )
+ create_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=6,
+ message=timestamp_pb2.Timestamp,
+ )
+ update_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=7,
+ message=timestamp_pb2.Timestamp,
+ )
+ delete_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=8,
+ message=timestamp_pb2.Timestamp,
+ )
+ etag: str = proto.Field(
+ proto.STRING,
+ number=9,
+ )
+ labels: MutableMapping[str, str] = proto.MapField(
+ proto.STRING,
+ proto.STRING,
+ number=10,
+ )
+
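The `project_id` and `labels` constraints documented above can be checked client-side. A validation sketch (illustrative only, not part of the generated client) whose regexes are taken directly from the docstring:

```python
import re

# 6-30 chars total: a letter, then 4-28 letters/digits/hyphens,
# then a letter or digit (so no trailing hyphen).
_PROJECT_ID_RE = re.compile(r"^[a-z][a-z0-9-]{4,28}[a-z0-9]$")
# Label key/value patterns quoted from the Project docstring.
_LABEL_KEY_RE = re.compile(r"^[a-z]([-a-z0-9]*[a-z0-9])?$")
_LABEL_VALUE_RE = re.compile(r"^([a-z]([-a-z0-9]*[a-z0-9])?)?$")


def is_valid_project_id(project_id: str) -> bool:
    return bool(_PROJECT_ID_RE.match(project_id))


def is_valid_label(key: str, value: str) -> bool:
    return (1 <= len(key) <= 63 and bool(_LABEL_KEY_RE.match(key))
            and len(value) <= 63 and bool(_LABEL_VALUE_RE.match(value)))
```

For example, `is_valid_project_id("tokyo-rain-123")` is true, while an id starting with a digit or ending in a hyphen is rejected.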
+
+class GetProjectRequest(proto.Message):
+ r"""The request sent to the
+ [GetProject][google.cloud.resourcemanager.v3.Projects.GetProject]
+ method.
+
+ Attributes:
+ name (str):
+ Required. The name of the project (for example,
+ ``projects/415104041262``).
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListProjectsRequest(proto.Message):
+ r"""The request sent to the
+ [ListProjects][google.cloud.resourcemanager.v3.Projects.ListProjects]
+ method.
+
+ Attributes:
+ parent (str):
+ Required. The name of the parent resource whose projects are
+ being listed. Only children of this parent resource are
+ listed; descendants are not listed.
+
+ If the parent is a folder, use the value
+ ``folders/{folder_id}``. If the parent is an organization,
+ use the value ``organizations/{org_id}``.
+ page_token (str):
+ Optional. A pagination token returned from a previous call
+ to [ListProjects]
+ [google.cloud.resourcemanager.v3.Projects.ListProjects] that
+ indicates from where listing should continue.
+ page_size (int):
+ Optional. The maximum number of projects to
+ return in the response. The server can return
+ fewer projects than requested. If unspecified,
+ server picks an appropriate default.
+ show_deleted (bool):
+ Optional. Indicate that projects in the ``DELETE_REQUESTED``
+ state should also be returned. Normally only ``ACTIVE``
+ projects are returned.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=3,
+ )
+ show_deleted: bool = proto.Field(
+ proto.BOOL,
+ number=4,
+ )
+
+
+class ListProjectsResponse(proto.Message):
+ r"""A page of the response received from the
+ [ListProjects][google.cloud.resourcemanager.v3.Projects.ListProjects]
+ method.
+
+ A paginated response where more pages are available has
+ ``next_page_token`` set. This token can be used in a subsequent
+ request to retrieve the next request page.
+
+ NOTE: A response may contain fewer elements than the request
+ ``page_size`` and still have a ``next_page_token``.
+
+ Attributes:
+ projects (MutableSequence[google.cloud.resourcemanager_v3.types.Project]):
+ The list of Projects under the parent. This
+ list can be paginated.
+ next_page_token (str):
+ Pagination token.
+
+ If the result set is too large to fit in a single response,
+ this token is returned. It encodes the position of the
+ current result cursor. Feeding this value into a new list
+ request with the ``page_token`` parameter gives the next
+ page of the results.
+
+ When ``next_page_token`` is not filled in, there is no next
+ page and the list returned is the last page in the result
+ set.
+
+ Pagination tokens have a limited lifetime.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ projects: MutableSequence["Project"] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message="Project",
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
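The `next_page_token` contract described above follows the standard drain-until-empty pattern. A sketch with a stubbed list call standing in for the real `ProjectsClient` (whose pagers handle this automatically):

```python
# Sketch of the next_page_token pagination contract; `fetch_page` is a
# stand-in for a ListProjects call and is not part of the library.

def list_all(fetch_page):
    """Drain a paginated list API. fetch_page(token) must return
    (items, next_page_token), with an empty token on the last page."""
    items, token = [], ""
    while True:
        page_items, token = fetch_page(token)
        items.extend(page_items)
        if not token:  # empty token: that was the last page
            return items


# Stub: two pages of fake project names keyed by page token.
_PAGES = {"": (["projects/1", "projects/2"], "tok-2"),
          "tok-2": (["projects/3"], "")}

all_projects = list_all(lambda token: _PAGES[token])
print(all_projects)  # ['projects/1', 'projects/2', 'projects/3']
```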
+
+class SearchProjectsRequest(proto.Message):
+ r"""The request sent to the
+ [SearchProjects][google.cloud.resourcemanager.v3.Projects.SearchProjects]
+ method.
+
+ Attributes:
+ query (str):
+ Optional. A query string for searching for projects that the
+ caller has ``resourcemanager.projects.get`` permission to.
+ If multiple fields are included in the query, then it will
+ return results that match any of the fields. Some eligible
+ fields are:
+
+ - **``displayName``, ``name``**: Filters by displayName.
+ - **``parent``**: Project's parent (for example:
+ ``folders/123``, ``organizations/*``). Prefer ``parent``
+ field over ``parent.type`` and ``parent.id``.
+ - **``parent.type``**: Parent's type: ``folder`` or
+ ``organization``.
+ - **``parent.id``**: Parent's id number (for example:
+ ``123``).
+ - **``id``, ``projectId``**: Filters by projectId.
+ - **``state``, ``lifecycleState``**: Filters by state.
+ - **``labels``**: Filters by label name or value.
+ - **``labels.<key>`` (where ``<key>`` is the name of a
+ label)**: Filters by label name.
+
+ Search expressions are case insensitive.
+
+ Some examples queries:
+
+ - **``name:how*``**: The project's name starts with "how".
+ - **``name:Howl``**: The project's name is ``Howl`` or
+ ``howl``.
+ - **``name:HOWL``**: Equivalent to above.
+ - **``NAME:howl``**: Equivalent to above.
+ - **``labels.color:*``**: The project has the label
+ ``color``.
+ - **``labels.color:red``**: The project's label ``color``
+ has the value ``red``.
+ - **``labels.color:red labels.size:big``**: The project's
+ label ``color`` has the value ``red`` or its label
+ ``size`` has the value ``big``.
+
+ If no query is specified, the call will return projects for
+ which the user has the ``resourcemanager.projects.get``
+ permission.
+ page_token (str):
+ Optional. A pagination token returned from a previous call
+ to [ListProjects]
+ [google.cloud.resourcemanager.v3.Projects.ListProjects] that
+ indicates from where listing should continue.
+ page_size (int):
+ Optional. The maximum number of projects to
+ return in the response. The server can return
+ fewer projects than requested. If unspecified,
+ server picks an appropriate default.
+ """
+
+ query: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=3,
+ )
+
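Per the docstring above, space-separated terms in a `SearchProjects` query match projects satisfying any of them. A hypothetical query-builder (not part of the library) for the label terms:

```python
# Hypothetical builder for SearchProjects label terms, following the
# labels.<key>:<value> form in the docstring; ":*" tests for key presence.

def label_term(key: str, value: str = "*") -> str:
    return f"labels.{key}:{value}"


# Space-separated terms are OR'ed by the service.
query = " ".join([label_term("color", "red"), label_term("size", "big")])
print(query)  # labels.color:red labels.size:big
```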
+
+class SearchProjectsResponse(proto.Message):
+ r"""A page of the response received from the
+ [SearchProjects][google.cloud.resourcemanager.v3.Projects.SearchProjects]
+ method.
+
+ A paginated response where more pages are available has
+ ``next_page_token`` set. This token can be used in a subsequent
+ request to retrieve the next request page.
+
+ Attributes:
+ projects (MutableSequence[google.cloud.resourcemanager_v3.types.Project]):
+ The list of Projects that matched the list
+ filter query. This list can be paginated.
+ next_page_token (str):
+ Pagination token.
+
+ If the result set is too large to fit in a single response,
+ this token is returned. It encodes the position of the
+ current result cursor. Feeding this value into a new list
+ request with the ``page_token`` parameter gives the next
+ page of the results.
+
+ When ``next_page_token`` is not filled in, there is no next
+ page and the list returned is the last page in the result
+ set.
+
+ Pagination tokens have a limited lifetime.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ projects: MutableSequence["Project"] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message="Project",
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class CreateProjectRequest(proto.Message):
+ r"""The request sent to the
+ [CreateProject][google.cloud.resourcemanager.v3.Projects.CreateProject]
+ method.
+
+ Attributes:
+ project (google.cloud.resourcemanager_v3.types.Project):
+ Required. The Project to create.
+
+ Project ID is required. If the requested ID is unavailable,
+ the request fails.
+
+ If the ``parent`` field is set, the
+ ``resourcemanager.projects.create`` permission is checked on
+ the parent resource. If no parent is set and the
+ authorization credentials belong to an Organization, the
+ parent will be set to that Organization.
+ """
+
+ project: "Project" = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message="Project",
+ )
+
+
+class CreateProjectMetadata(proto.Message):
+ r"""A status object which is used as the ``metadata`` field for the
+ Operation returned by CreateProject. It provides insight for when
+ significant phases of Project creation have completed.
+
+ Attributes:
+ create_time (google.protobuf.timestamp_pb2.Timestamp):
+ Creation time of the project creation
+ workflow.
+ gettable (bool):
+ True if the project can be retrieved using ``GetProject``.
+ No other operations on the project are guaranteed to work
+ until the project creation is complete.
+ ready (bool):
+ True if the project creation process is
+ complete.
+ """
+
+ create_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message=timestamp_pb2.Timestamp,
+ )
+ gettable: bool = proto.Field(
+ proto.BOOL,
+ number=2,
+ )
+ ready: bool = proto.Field(
+ proto.BOOL,
+ number=3,
+ )
+
+
+class UpdateProjectRequest(proto.Message):
+ r"""The request sent to the
+ [UpdateProject][google.cloud.resourcemanager.v3.Projects.UpdateProject]
+ method.
+
+    Only the ``display_name`` and ``labels`` fields can be changed. Use
+ the
+ [MoveProject][google.cloud.resourcemanager.v3.Projects.MoveProject]
+ method to change the ``parent`` field.
+
+ Attributes:
+ project (google.cloud.resourcemanager_v3.types.Project):
+ Required. The new definition of the project.
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Optional. An update mask to selectively
+ update fields.
+ """
+
+ project: "Project" = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message="Project",
+ )
+ update_mask: field_mask_pb2.FieldMask = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=field_mask_pb2.FieldMask,
+ )
+
+
+class UpdateProjectMetadata(proto.Message):
+ r"""A status object which is used as the ``metadata`` field for the
+ Operation returned by UpdateProject.
+
+ """
+
+
+class MoveProjectRequest(proto.Message):
+    r"""The request sent to the
+ [MoveProject][google.cloud.resourcemanager.v3.Projects.MoveProject]
+ method.
+
+ Attributes:
+ name (str):
+ Required. The name of the project to move.
+ destination_parent (str):
+ Required. The new parent to move the Project
+ under.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ destination_parent: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class MoveProjectMetadata(proto.Message):
+ r"""A status object which is used as the ``metadata`` field for the
+ Operation returned by MoveProject.
+
+ """
+
+
+class DeleteProjectRequest(proto.Message):
+    r"""The request sent to the
+    [DeleteProject][google.cloud.resourcemanager.v3.Projects.DeleteProject]
+    method.
+
+ Attributes:
+ name (str):
+ Required. The name of the Project (for example,
+ ``projects/415104041262``).
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class DeleteProjectMetadata(proto.Message):
+ r"""A status object which is used as the ``metadata`` field for the
+ Operation returned by ``DeleteProject``.
+
+ """
+
+
+class UndeleteProjectRequest(proto.Message):
+ r"""The request sent to the [UndeleteProject]
+ [google.cloud.resourcemanager.v3.Projects.UndeleteProject] method.
+
+ Attributes:
+ name (str):
+ Required. The name of the project (for example,
+ ``projects/415104041262``).
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class UndeleteProjectMetadata(proto.Message):
+ r"""A status object which is used as the ``metadata`` field for the
+ Operation returned by ``UndeleteProject``.
+
+ """
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/tag_bindings.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/tag_bindings.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/tag_bindings.py
@@ -0,0 +1,343 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.resourcemanager.v3",
+ manifest={
+ "TagBinding",
+ "CreateTagBindingMetadata",
+ "CreateTagBindingRequest",
+ "DeleteTagBindingMetadata",
+ "DeleteTagBindingRequest",
+ "ListTagBindingsRequest",
+ "ListTagBindingsResponse",
+ "ListEffectiveTagsRequest",
+ "ListEffectiveTagsResponse",
+ "EffectiveTag",
+ },
+)
+
+
+class TagBinding(proto.Message):
+ r"""A TagBinding represents a connection between a TagValue and a
+    cloud resource. Once a TagBinding is created, the TagValue is
+ applied to all the descendants of the Google Cloud resource.
+
+ Attributes:
+ name (str):
+ Output only. The name of the TagBinding. This is a String of
+ the form:
+ ``tagBindings/{full-resource-name}/{tag-value-name}`` (e.g.
+ ``tagBindings/%2F%2Fcloudresourcemanager.googleapis.com%2Fprojects%2F123/tagValues/456``).
+ parent (str):
+ The full resource name of the resource the TagValue is bound
+ to. E.g.
+ ``//cloudresourcemanager.googleapis.com/projects/123``
+ tag_value (str):
+ The TagValue of the TagBinding. Must be of the form
+ ``tagValues/456``.
+ tag_value_namespaced_name (str):
+ The namespaced name for the TagValue of the TagBinding. Must
+ be in the format
+ ``{parent_id}/{tag_key_short_name}/{short_name}``.
+
+ For methods that support TagValue namespaced name, only one
+ of tag_value_namespaced_name or tag_value may be filled.
+ Requests with both fields will be rejected.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ parent: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ tag_value: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ tag_value_namespaced_name: str = proto.Field(
+ proto.STRING,
+ number=4,
+ )
+
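The TagBinding `name` format above percent-encodes the parent's full resource name so its slashes become `%2F`. A small illustrative helper (not part of the client) using only the standard library:

```python
from urllib.parse import quote

# Illustrative: build the documented TagBinding name
# tagBindings/{encoded-full-resource-name}/{tag-value-name}.

def tag_binding_name(parent_full_resource_name: str, tag_value: str) -> str:
    # safe='' forces '/' to be encoded as %2F.
    return f"tagBindings/{quote(parent_full_resource_name, safe='')}/{tag_value}"


name = tag_binding_name(
    "//cloudresourcemanager.googleapis.com/projects/123", "tagValues/456")
print(name)
# tagBindings/%2F%2Fcloudresourcemanager.googleapis.com%2Fprojects%2F123/tagValues/456
```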
+
+class CreateTagBindingMetadata(proto.Message):
+ r"""Runtime operation information for creating a TagValue."""
+
+
+class CreateTagBindingRequest(proto.Message):
+ r"""The request message to create a TagBinding.
+
+ Attributes:
+ tag_binding (google.cloud.resourcemanager_v3.types.TagBinding):
+ Required. The TagBinding to be created.
+ validate_only (bool):
+ Optional. Set to true to perform the
+ validations necessary for creating the resource,
+ but not actually perform the action.
+ """
+
+ tag_binding: "TagBinding" = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message="TagBinding",
+ )
+ validate_only: bool = proto.Field(
+ proto.BOOL,
+ number=2,
+ )
+
+
+class DeleteTagBindingMetadata(proto.Message):
+ r"""Runtime operation information for deleting a TagBinding."""
+
+
+class DeleteTagBindingRequest(proto.Message):
+ r"""The request message to delete a TagBinding.
+
+ Attributes:
+ name (str):
+ Required. The name of the TagBinding. This is a String of
+ the form: ``tagBindings/{id}`` (e.g.
+ ``tagBindings/%2F%2Fcloudresourcemanager.googleapis.com%2Fprojects%2F123/tagValues/456``).
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListTagBindingsRequest(proto.Message):
+ r"""The request message to list all TagBindings for a parent.
+
+ Attributes:
+ parent (str):
+ Required. The full resource name of a
+ resource for which you want to list existing
+ TagBindings. E.g.
+ "//cloudresourcemanager.googleapis.com/projects/123".
+ page_size (int):
+ Optional. The maximum number of TagBindings
+ to return in the response. The server allows a
+ maximum of 300 TagBindings to return. If
+ unspecified, the server will use 100 as the
+ default.
+ page_token (str):
+ Optional. A pagination token returned from a previous call
+ to ``ListTagBindings`` that indicates where this listing
+ should continue from.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=2,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+
+class ListTagBindingsResponse(proto.Message):
+ r"""The ListTagBindings response.
+
+ Attributes:
+ tag_bindings (MutableSequence[google.cloud.resourcemanager_v3.types.TagBinding]):
+ A possibly paginated list of TagBindings for
+ the specified resource.
+ next_page_token (str):
+ Pagination token.
+
+ If the result set is too large to fit in a single response,
+ this token is returned. It encodes the position of the
+ current result cursor. Feeding this value into a new list
+ request with the ``page_token`` parameter gives the next
+ page of the results.
+
+ When ``next_page_token`` is not filled in, there is no next
+ page and the list returned is the last page in the result
+ set.
+
+ Pagination tokens have a limited lifetime.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ tag_bindings: MutableSequence["TagBinding"] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message="TagBinding",
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class ListEffectiveTagsRequest(proto.Message):
+    r"""The request message to ListEffectiveTags.
+
+ Attributes:
+ parent (str):
+ Required. The full resource name of a
+ resource for which you want to list the
+ effective tags. E.g.
+ "//cloudresourcemanager.googleapis.com/projects/123".
+ page_size (int):
+ Optional. The maximum number of effective
+ tags to return in the response. The server
+ allows a maximum of 300 effective tags to return
+ in a single page. If unspecified, the server
+ will use 100 as the default.
+ page_token (str):
+ Optional. A pagination token returned from a previous call
+ to ``ListEffectiveTags`` that indicates from where this
+ listing should continue.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=2,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+
+class ListEffectiveTagsResponse(proto.Message):
+ r"""The response of ListEffectiveTags.
+
+ Attributes:
+ effective_tags (MutableSequence[google.cloud.resourcemanager_v3.types.EffectiveTag]):
+ A possibly paginated list of effective tags
+ for the specified resource.
+ next_page_token (str):
+ Pagination token.
+
+ If the result set is too large to fit in a single response,
+ this token is returned. It encodes the position of the
+ current result cursor. Feeding this value into a new list
+ request with the ``page_token`` parameter gives the next
+ page of the results.
+
+ When ``next_page_token`` is not filled in, there is no next
+ page and the list returned is the last page in the result
+ set.
+
+ Pagination tokens have a limited lifetime.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ effective_tags: MutableSequence["EffectiveTag"] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message="EffectiveTag",
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class EffectiveTag(proto.Message):
+ r"""An EffectiveTag represents a tag that applies to a resource during
+ policy evaluation. Tags can be either directly bound to a resource
+ or inherited from its ancestor. EffectiveTag contains the name and
+ namespaced_name of the tag value and tag key, with additional fields
+ of ``inherited`` to indicate the inheritance status of the effective
+ tag.
+
+ Attributes:
+ tag_value (str):
+ Resource name for TagValue in the format ``tagValues/456``.
+ namespaced_tag_value (str):
+ The namespaced name of the TagValue. Can be in the form
+ ``{organization_id}/{tag_key_short_name}/{tag_value_short_name}``
+ or
+ ``{project_id}/{tag_key_short_name}/{tag_value_short_name}``
+ or
+ ``{project_number}/{tag_key_short_name}/{tag_value_short_name}``.
+ tag_key (str):
+ The name of the TagKey, in the format ``tagKeys/{id}``, such
+ as ``tagKeys/123``.
+ namespaced_tag_key (str):
+ The namespaced name of the TagKey. Can be in the form
+ ``{organization_id}/{tag_key_short_name}`` or
+ ``{project_id}/{tag_key_short_name}`` or
+ ``{project_number}/{tag_key_short_name}``.
+ tag_key_parent_name (str):
+ The parent name of the tag key. Must be in the format
+ ``organizations/{organization_id}`` or
+ ``projects/{project_number}``
+ inherited (bool):
+ Indicates the inheritance status of a tag
+            value attached to the given resource. If the tag
+            value is inherited from one of the resource's
+            ancestors, inherited will be true; if the tag
+            value is directly attached to the resource,
+            inherited will be false.
+ """
+
+ tag_value: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ namespaced_tag_value: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ tag_key: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ namespaced_tag_key: str = proto.Field(
+ proto.STRING,
+ number=4,
+ )
+ tag_key_parent_name: str = proto.Field(
+ proto.STRING,
+ number=6,
+ )
+ inherited: bool = proto.Field(
+ proto.BOOL,
+ number=5,
+ )
+
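The namespaced name formats above are plain slash-separated triples. An illustrative helper (not part of the client) splitting a namespaced TagValue name into its components:

```python
# Illustrative: split '{parent_id}/{tag_key_short_name}/{tag_value_short_name}'
# into its three parts; raises ValueError if the shape is wrong.

def parse_namespaced_tag_value(namespaced: str):
    parent_id, key_short_name, value_short_name = namespaced.split("/")
    return parent_id, key_short_name, value_short_name


parts = parse_namespaced_tag_value("123456789/env/prod")
print(parts)  # ('123456789', 'env', 'prod')
```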
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/tag_holds.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/tag_holds.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/tag_holds.py
@@ -0,0 +1,246 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import timestamp_pb2 # type: ignore
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.resourcemanager.v3",
+ manifest={
+ "TagHold",
+ "CreateTagHoldRequest",
+ "CreateTagHoldMetadata",
+ "DeleteTagHoldRequest",
+ "DeleteTagHoldMetadata",
+ "ListTagHoldsRequest",
+ "ListTagHoldsResponse",
+ },
+)
+
+
+class TagHold(proto.Message):
+ r"""A TagHold represents the use of a TagValue that is not captured by
+ TagBindings. If a TagValue has any TagHolds, deletion will be
+ blocked. This resource is intended to be created in the same cloud
+ location as the ``holder``.
+
+ Attributes:
+ name (str):
+ Output only. The resource name of a TagHold. This is a
+ String of the form:
+ ``tagValues/{tag-value-id}/tagHolds/{tag-hold-id}`` (e.g.
+ ``tagValues/123/tagHolds/456``). This resource name is
+ generated by the server.
+ holder (str):
+ Required. The name of the resource where the TagValue is
+ being used. Must be less than 200 characters. E.g.
+ ``//compute.googleapis.com/compute/projects/myproject/regions/us-east-1/instanceGroupManagers/instance-group``
+ origin (str):
+ Optional. An optional string representing the origin of this
+ request. This field should include human-understandable
+ information to distinguish origins from each other. Must be
+ less than 200 characters. E.g. ``migs-35678234``
+ help_link (str):
+ Optional. A URL where an end user can learn more about
+ removing this hold. E.g.
+ ``https://cloud.google.com/resource-manager/docs/tags/tags-creating-and-managing``
+ create_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The time this TagHold was
+ created.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ holder: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ origin: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ help_link: str = proto.Field(
+ proto.STRING,
+ number=4,
+ )
+ create_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=5,
+ message=timestamp_pb2.Timestamp,
+ )
+
+
+class CreateTagHoldRequest(proto.Message):
+ r"""The request message to create a TagHold.
+
+ Attributes:
+ parent (str):
+ Required. The resource name of the TagHold's parent
+ TagValue. Must be of the form: ``tagValues/{tag-value-id}``.
+ tag_hold (google.cloud.resourcemanager_v3.types.TagHold):
+ Required. The TagHold to be created.
+ validate_only (bool):
+ Optional. Set to true to perform the
+ validations necessary for creating the resource,
+ but not actually perform the action.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ tag_hold: "TagHold" = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message="TagHold",
+ )
+ validate_only: bool = proto.Field(
+ proto.BOOL,
+ number=3,
+ )
+
+
+class CreateTagHoldMetadata(proto.Message):
+ r"""Runtime operation information for creating a TagHold.
+ (-- The metadata is currently empty, but may include information
+ in the future. --)
+
+ """
+
+
+class DeleteTagHoldRequest(proto.Message):
+ r"""The request message to delete a TagHold.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the TagHold to delete. Must
+ be of the form:
+ ``tagValues/{tag-value-id}/tagHolds/{tag-hold-id}``.
+ validate_only (bool):
+ Optional. Set to true to perform the
+ validations necessary for deleting the resource,
+ but not actually perform the action.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ validate_only: bool = proto.Field(
+ proto.BOOL,
+ number=2,
+ )
+
+
+class DeleteTagHoldMetadata(proto.Message):
+ r"""Runtime operation information for deleting a TagHold.
+ (-- The metadata is currently empty, but may include information
+ in the future. --)
+
+ """
+
+
+class ListTagHoldsRequest(proto.Message):
+ r"""The request message for listing the TagHolds under a
+ TagValue.
+
+ Attributes:
+ parent (str):
+ Required. The resource name of the parent TagValue. Must be
+ of the form: ``tagValues/{tag-value-id}``.
+ page_size (int):
+ Optional. The maximum number of TagHolds to
+ return in the response. The server allows a
+ maximum of 300 TagHolds to return. If
+ unspecified, the server will use 100 as the
+ default.
+ page_token (str):
+ Optional. A pagination token returned from a previous call
+ to ``ListTagHolds`` that indicates where this listing should
+ continue from.
+ filter (str):
+ Optional. Criteria used to select a subset of TagHolds
+ parented by the TagValue to return. This field follows the
+ syntax defined by aip.dev/160; the ``holder`` and ``origin``
+ fields are supported for filtering. Currently only ``AND``
+ syntax is supported. Some example queries are:
+
+ - ``holder = //compute.googleapis.com/compute/projects/myproject/regions/us-east-1/instanceGroupManagers/instance-group``
+ - ``origin = 35678234``
+ - ``holder = //compute.googleapis.com/compute/projects/myproject/regions/us-east-1/instanceGroupManagers/instance-group AND origin = 35678234``
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=2,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ filter: str = proto.Field(
+ proto.STRING,
+ number=4,
+ )
+
+
+class ListTagHoldsResponse(proto.Message):
+ r"""The ListTagHolds response.
+
+ Attributes:
+ tag_holds (MutableSequence[google.cloud.resourcemanager_v3.types.TagHold]):
+ A possibly paginated list of TagHolds.
+ next_page_token (str):
+ Pagination token.
+
+ If the result set is too large to fit in a single response,
+ this token is returned. It encodes the position of the
+ current result cursor. Feeding this value into a new list
+ request with the ``page_token`` parameter gives the next
+ page of the results.
+
+ When ``next_page_token`` is not filled in, there is no next
+ page and the list returned is the last page in the result
+ set.
+
+ Pagination tokens have a limited lifetime.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ tag_holds: MutableSequence["TagHold"] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message="TagHold",
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
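`ListTagHoldsResponse` documents the usual token-based pagination contract: feed `next_page_token` back as `page_token` until it comes back empty. A runnable sketch of that loop using hypothetical stand-in stubs (the real client wraps this in a pager, so this is only an illustration of the token flow):

```python
from dataclasses import dataclass, field


# Minimal stand-in for the generated response message, so the pagination
# pattern can run without the real client (hypothetical stub).
@dataclass
class FakeListTagHoldsResponse:
    tag_holds: list = field(default_factory=list)
    next_page_token: str = ""


def list_all_tag_holds(call, parent: str, page_size: int = 100):
    """Drain every page by feeding next_page_token back into the request.

    ``call`` stands in for a list_tag_holds RPC; an empty
    ``next_page_token`` marks the last page of the result set.
    """
    token = ""
    while True:
        resp = call(parent=parent, page_size=page_size, page_token=token)
        yield from resp.tag_holds
        token = resp.next_page_token
        if not token:
            return


# Fake server returning two pages keyed by page token:
pages = {
    "": FakeListTagHoldsResponse(["hold-1", "hold-2"], "p2"),
    "p2": FakeListTagHoldsResponse(["hold-3"], ""),
}
holds = list(
    list_all_tag_holds(
        lambda parent, page_size, page_token: pages[page_token],
        parent="tagValues/123",
    )
)
print(holds)  # ['hold-1', 'hold-2', 'hold-3']
```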
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/tag_keys.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/tag_keys.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/tag_keys.py
@@ -0,0 +1,381 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.resourcemanager.v3",
+ manifest={
+ "Purpose",
+ "TagKey",
+ "ListTagKeysRequest",
+ "ListTagKeysResponse",
+ "GetTagKeyRequest",
+ "GetNamespacedTagKeyRequest",
+ "CreateTagKeyRequest",
+ "CreateTagKeyMetadata",
+ "UpdateTagKeyRequest",
+ "UpdateTagKeyMetadata",
+ "DeleteTagKeyRequest",
+ "DeleteTagKeyMetadata",
+ },
+)
+
+
+class Purpose(proto.Enum):
+ r"""A purpose for each policy engine requiring such an
+ integration. A single policy engine may have multiple purposes
+ defined, however a TagKey may only specify a single purpose.
+
+ Values:
+ PURPOSE_UNSPECIFIED (0):
+ Unspecified purpose.
+ GCE_FIREWALL (1):
+ Purpose for Compute Engine firewalls. A corresponding
+ ``purpose_data`` should be set for the network the tag is
+ intended for. The key should be ``network`` and the value
+            should be in either of these two formats:
+
+            -  ``https://www.googleapis.com/compute/{compute_version}/projects/{project_id}/global/networks/{network_id}``
+            -  ``{project_id}/{network_name}``
+
+            Examples:
+
+            -  ``https://www.googleapis.com/compute/staging_v1/projects/fail-closed-load-testing/global/networks/6992953698831725600``
+            -  ``fail-closed-load-testing/load-testing-network``
+ """
+ PURPOSE_UNSPECIFIED = 0
+ GCE_FIREWALL = 1
+
+
+class TagKey(proto.Message):
+ r"""A TagKey, used to group a set of TagValues.
+
+ Attributes:
+ name (str):
+ Immutable. The resource name for a TagKey. Must be in the
+ format ``tagKeys/{tag_key_id}``, where ``tag_key_id`` is the
+ generated numeric id for the TagKey.
+ parent (str):
+ Immutable. The resource name of the TagKey's parent. A
+ TagKey can be parented by an Organization or a Project. For
+ a TagKey parented by an Organization, its parent must be in
+ the form ``organizations/{org_id}``. For a TagKey parented
+ by a Project, its parent can be in the form
+ ``projects/{project_id}`` or ``projects/{project_number}``.
+ short_name (str):
+ Required. Immutable. The user friendly name for a TagKey.
+ The short name should be unique for TagKeys within the same
+ tag namespace.
+
+ The short name must be 1-63 characters, beginning and ending
+ with an alphanumeric character ([a-z0-9A-Z]) with dashes
+ (-), underscores (_), dots (.), and alphanumerics between.
+ namespaced_name (str):
+ Output only. Immutable. Namespaced name of
+ the TagKey.
+ description (str):
+ Optional. User-assigned description of the
+ TagKey. Must not exceed 256 characters.
+
+ Read-write.
+ create_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. Creation time.
+ update_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. Update time.
+ etag (str):
+ Optional. Entity tag which users can pass to
+ prevent race conditions. This field is always
+ set in server responses. See UpdateTagKeyRequest
+ for details.
+ purpose (google.cloud.resourcemanager_v3.types.Purpose):
+ Optional. A purpose denotes that this Tag is
+ intended for use in policies of a specific
+ policy engine, and will involve that policy
+ engine in management operations involving this
+ Tag. A purpose does not grant a policy engine
+ exclusive rights to the Tag, and it may be
+ referenced by other policy engines.
+
+ A purpose cannot be changed once set.
+ purpose_data (MutableMapping[str, str]):
+ Optional. Purpose data corresponds to the policy system that
+ the tag is intended for. See documentation for ``Purpose``
+ for formatting of this field.
+
+ Purpose data cannot be changed once set.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ parent: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ short_name: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ namespaced_name: str = proto.Field(
+ proto.STRING,
+ number=4,
+ )
+ description: str = proto.Field(
+ proto.STRING,
+ number=5,
+ )
+ create_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=6,
+ message=timestamp_pb2.Timestamp,
+ )
+ update_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=7,
+ message=timestamp_pb2.Timestamp,
+ )
+ etag: str = proto.Field(
+ proto.STRING,
+ number=8,
+ )
+ purpose: "Purpose" = proto.Field(
+ proto.ENUM,
+ number=11,
+ enum="Purpose",
+ )
+ purpose_data: MutableMapping[str, str] = proto.MapField(
+ proto.STRING,
+ proto.STRING,
+ number=12,
+ )
+
+
+class ListTagKeysRequest(proto.Message):
+ r"""The request message for listing all TagKeys under a parent
+ resource.
+
+ Attributes:
+ parent (str):
+ Required. The resource name of the TagKey's parent. Must be
+ of the form ``organizations/{org_id}`` or
+ ``projects/{project_id}`` or ``projects/{project_number}``
+ page_size (int):
+ Optional. The maximum number of TagKeys to
+ return in the response. The server allows a
+ maximum of 300 TagKeys to return. If
+ unspecified, the server will use 100 as the
+ default.
+ page_token (str):
+ Optional. A pagination token returned from a previous call
+ to ``ListTagKey`` that indicates where this listing should
+ continue from.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=2,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+
+class ListTagKeysResponse(proto.Message):
+ r"""The ListTagKeys response message.
+
+ Attributes:
+ tag_keys (MutableSequence[google.cloud.resourcemanager_v3.types.TagKey]):
+ List of TagKeys that live under the specified
+ parent in the request.
+ next_page_token (str):
+ A pagination token returned from a previous call to
+ ``ListTagKeys`` that indicates from where listing should
+ continue.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ tag_keys: MutableSequence["TagKey"] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message="TagKey",
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class GetTagKeyRequest(proto.Message):
+ r"""The request message for getting a TagKey.
+
+ Attributes:
+ name (str):
+ Required. A resource name in the format ``tagKeys/{id}``,
+ such as ``tagKeys/123``.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class GetNamespacedTagKeyRequest(proto.Message):
+ r"""The request message for getting a TagKey by its namespaced
+ name.
+
+ Attributes:
+ name (str):
+ Required. A namespaced tag key name in the format
+ ``{parentId}/{tagKeyShort}``, such as ``42/foo`` for a key
+ with short name "foo" under the organization with ID 42 or
+ ``r2-d2/bar`` for a key with short name "bar" under the
+ project ``r2-d2``.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class CreateTagKeyRequest(proto.Message):
+ r"""The request message for creating a TagKey.
+
+ Attributes:
+ tag_key (google.cloud.resourcemanager_v3.types.TagKey):
+ Required. The TagKey to be created. Only fields
+ ``short_name``, ``description``, and ``parent`` are
+ considered during the creation request.
+ validate_only (bool):
+ Optional. Set to true to perform validations
+ necessary for creating the resource, but not
+ actually perform the action.
+ """
+
+ tag_key: "TagKey" = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message="TagKey",
+ )
+ validate_only: bool = proto.Field(
+ proto.BOOL,
+ number=2,
+ )
+
+
+class CreateTagKeyMetadata(proto.Message):
+ r"""Runtime operation information for creating a TagKey."""
+
+
+class UpdateTagKeyRequest(proto.Message):
+ r"""The request message for updating a TagKey.
+
+ Attributes:
+ tag_key (google.cloud.resourcemanager_v3.types.TagKey):
+ Required. The new definition of the TagKey. Only the
+ ``description`` and ``etag`` fields can be updated by this
+ request. If the ``etag`` field is not empty, it must match
+ the ``etag`` field of the existing tag key. Otherwise,
+ ``ABORTED`` will be returned.
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Fields to be updated. The mask may only contain
+ ``description`` or ``etag``. If omitted entirely, both
+ ``description`` and ``etag`` are assumed to be significant.
+ validate_only (bool):
+ Set as true to perform validations necessary
+ for updating the resource, but not actually
+ perform the action.
+ """
+
+ tag_key: "TagKey" = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message="TagKey",
+ )
+ update_mask: field_mask_pb2.FieldMask = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=field_mask_pb2.FieldMask,
+ )
+ validate_only: bool = proto.Field(
+ proto.BOOL,
+ number=3,
+ )
+
+
+class UpdateTagKeyMetadata(proto.Message):
+ r"""Runtime operation information for updating a TagKey."""
+
+
+class DeleteTagKeyRequest(proto.Message):
+ r"""The request message for deleting a TagKey.
+
+ Attributes:
+ name (str):
+ Required. The resource name of a TagKey to be deleted in the
+ format ``tagKeys/123``. The TagKey cannot be a parent of any
+ existing TagValues or it will not be deleted successfully.
+ validate_only (bool):
+ Optional. Set as true to perform validations
+ necessary for deletion, but not actually perform
+ the action.
+ etag (str):
+ Optional. The etag known to the client for
+ the expected state of the TagKey. This is to be
+ used for optimistic concurrency.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ validate_only: bool = proto.Field(
+ proto.BOOL,
+ number=2,
+ )
+ etag: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+
+class DeleteTagKeyMetadata(proto.Message):
+ r"""Runtime operation information for deleting a TagKey."""
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
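`GetNamespacedTagKeyRequest` expects a name of the form `{parentId}/{tagKeyShort}`, where the parent ID may be an organization ID (`42/foo`) or a project ID (`r2-d2/bar`). A hypothetical client-side helper splitting that format, included only to illustrate the shape of the name:

```python
def split_namespaced_tag_key(name: str) -> tuple:
    """Split ``{parentId}/{tagKeyShort}`` into its two components.

    Hypothetical helper; the parent ID may be an organization ID
    (e.g. ``42``) or a project ID (e.g. ``r2-d2``).
    """
    parent_id, _, short_name = name.partition("/")
    if not parent_id or not short_name:
        raise ValueError(f"not a namespaced TagKey name: {name!r}")
    return parent_id, short_name


print(split_namespaced_tag_key("42/foo"))     # ('42', 'foo')
print(split_namespaced_tag_key("r2-d2/bar"))  # ('r2-d2', 'bar')
```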
diff --git a/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/tag_values.py b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/tag_values.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/google/cloud/resourcemanager_v3/types/tag_values.py
@@ -0,0 +1,331 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.resourcemanager.v3",
+ manifest={
+ "TagValue",
+ "ListTagValuesRequest",
+ "ListTagValuesResponse",
+ "GetTagValueRequest",
+ "GetNamespacedTagValueRequest",
+ "CreateTagValueRequest",
+ "CreateTagValueMetadata",
+ "UpdateTagValueRequest",
+ "UpdateTagValueMetadata",
+ "DeleteTagValueRequest",
+ "DeleteTagValueMetadata",
+ },
+)
+
+
+class TagValue(proto.Message):
+ r"""A TagValue is a child of a particular TagKey. This is used to
+ group cloud resources for the purpose of controlling them using
+ policies.
+
+ Attributes:
+ name (str):
+ Immutable. Resource name for TagValue in the format
+ ``tagValues/456``.
+ parent (str):
+ Immutable. The resource name of the new TagValue's parent
+ TagKey. Must be of the form ``tagKeys/{tag_key_id}``.
+ short_name (str):
+ Required. Immutable. User-assigned short name for TagValue.
+ The short name should be unique for TagValues within the
+ same parent TagKey.
+
+ The short name must be 63 characters or less, beginning and
+ ending with an alphanumeric character ([a-z0-9A-Z]) with
+ dashes (-), underscores (_), dots (.), and alphanumerics
+ between.
+ namespaced_name (str):
+ Output only. The namespaced name of the TagValue. Can be in
+ the form
+ ``{organization_id}/{tag_key_short_name}/{tag_value_short_name}``
+ or
+ ``{project_id}/{tag_key_short_name}/{tag_value_short_name}``
+ or
+ ``{project_number}/{tag_key_short_name}/{tag_value_short_name}``.
+ description (str):
+ Optional. User-assigned description of the
+ TagValue. Must not exceed 256 characters.
+
+ Read-write.
+ create_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. Creation time.
+ update_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. Update time.
+ etag (str):
+ Optional. Entity tag which users can pass to
+ prevent race conditions. This field is always
+ set in server responses. See
+ UpdateTagValueRequest for details.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ parent: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ short_name: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ namespaced_name: str = proto.Field(
+ proto.STRING,
+ number=4,
+ )
+ description: str = proto.Field(
+ proto.STRING,
+ number=5,
+ )
+ create_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=6,
+ message=timestamp_pb2.Timestamp,
+ )
+ update_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=7,
+ message=timestamp_pb2.Timestamp,
+ )
+ etag: str = proto.Field(
+ proto.STRING,
+ number=8,
+ )
+
+
+class ListTagValuesRequest(proto.Message):
+    r"""The request message for listing TagValues for the specified
+    TagKey.
+
+    Attributes:
+        parent (str):
+            Required. Resource name for the TagKey, parent of the
+            TagValues to be listed, in the format ``tagKeys/123``.
+ page_size (int):
+ Optional. The maximum number of TagValues to
+ return in the response. The server allows a
+ maximum of 300 TagValues to return. If
+ unspecified, the server will use 100 as the
+ default.
+ page_token (str):
+ Optional. A pagination token returned from a previous call
+ to ``ListTagValues`` that indicates where this listing
+ should continue from.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=2,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+
+class ListTagValuesResponse(proto.Message):
+ r"""The ListTagValues response.
+
+ Attributes:
+ tag_values (MutableSequence[google.cloud.resourcemanager_v3.types.TagValue]):
+ A possibly paginated list of TagValues that
+ are direct descendants of the specified parent
+ TagKey.
+ next_page_token (str):
+ A pagination token returned from a previous call to
+ ``ListTagValues`` that indicates from where listing should
+ continue. This is currently not used, but the server may at
+ any point start supplying a valid token.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ tag_values: MutableSequence["TagValue"] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message="TagValue",
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class GetTagValueRequest(proto.Message):
+ r"""The request message for getting a TagValue.
+
+ Attributes:
+ name (str):
+ Required. Resource name for TagValue to be fetched in the
+ format ``tagValues/456``.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class GetNamespacedTagValueRequest(proto.Message):
+ r"""The request message for getting a TagValue by its namespaced
+ name.
+
+ Attributes:
+ name (str):
+ Required. A namespaced tag value name in the following
+ format:
+
+ ``{parentId}/{tagKeyShort}/{tagValueShort}``
+
+ Examples:
+
+ - ``42/foo/abc`` for a value with short name "abc" under
+ the key with short name "foo" under the organization with
+ ID 42
+ - ``r2-d2/bar/xyz`` for a value with short name "xyz" under
+ the key with short name "bar" under the project with ID
+ "r2-d2".
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class CreateTagValueRequest(proto.Message):
+ r"""The request message for creating a TagValue.
+
+ Attributes:
+ tag_value (google.cloud.resourcemanager_v3.types.TagValue):
+ Required. The TagValue to be created. Only fields
+ ``short_name``, ``description``, and ``parent`` are
+ considered during the creation request.
+ validate_only (bool):
+ Optional. Set as true to perform the
+ validations necessary for creating the resource,
+ but not actually perform the action.
+ """
+
+ tag_value: "TagValue" = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message="TagValue",
+ )
+ validate_only: bool = proto.Field(
+ proto.BOOL,
+ number=2,
+ )
+
+
+class CreateTagValueMetadata(proto.Message):
+ r"""Runtime operation information for creating a TagValue."""
+
+
+class UpdateTagValueRequest(proto.Message):
+ r"""The request message for updating a TagValue.
+
+ Attributes:
+ tag_value (google.cloud.resourcemanager_v3.types.TagValue):
+            Required. The new definition of the TagValue. Only the
+            ``description`` and ``etag`` fields can be updated by this
+            request. If the ``etag`` field is nonempty, it must match
+            the ``etag`` field of the existing TagValue. Otherwise,
+            ``ABORTED`` will be returned.
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Optional. Fields to be updated.
+ validate_only (bool):
+ Optional. True to perform validations
+ necessary for updating the resource, but not
+ actually perform the action.
+ """
+
+ tag_value: "TagValue" = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message="TagValue",
+ )
+ update_mask: field_mask_pb2.FieldMask = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=field_mask_pb2.FieldMask,
+ )
+ validate_only: bool = proto.Field(
+ proto.BOOL,
+ number=3,
+ )
+
+
+class UpdateTagValueMetadata(proto.Message):
+ r"""Runtime operation information for updating a TagValue."""
+
+
+class DeleteTagValueRequest(proto.Message):
+ r"""The request message for deleting a TagValue.
+
+ Attributes:
+ name (str):
+ Required. Resource name for TagValue to be
+ deleted in the format tagValues/456.
+ validate_only (bool):
+ Optional. Set as true to perform the
+ validations necessary for deletion, but not
+ actually perform the action.
+ etag (str):
+ Optional. The etag known to the client for
+ the expected state of the TagValue. This is to
+ be used for optimistic concurrency.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ validate_only: bool = proto.Field(
+ proto.BOOL,
+ number=2,
+ )
+ etag: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+
+class DeleteTagValueMetadata(proto.Message):
+ r"""Runtime operation information for deleting a TagValue."""
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
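The `TagValue.short_name` docstring above spells out the naming rules: at most 63 characters, beginning and ending with an alphanumeric character, with dashes, underscores, dots, and alphanumerics between. A hypothetical client-side pre-check mirroring those rules (the server remains authoritative):

```python
import re

# Regex transcription of the documented short_name rules: one alphanumeric
# character, optionally followed by up to 61 chars of [a-zA-Z0-9._-] and a
# final alphanumeric, for a maximum total length of 63.
_SHORT_NAME_RE = re.compile(r"[a-zA-Z0-9]([a-zA-Z0-9._-]{0,61}[a-zA-Z0-9])?")


def is_valid_short_name(short_name: str) -> bool:
    """Return True if short_name satisfies the documented format."""
    return bool(_SHORT_NAME_RE.fullmatch(short_name))


print(is_valid_short_name("prod"))           # True
print(is_valid_short_name("env.tier-1"))     # True
print(is_valid_short_name("-leading-dash"))  # False
print(is_valid_short_name("a" * 64))         # False: too long
```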
diff --git a/packages/google-cloud-resource-manager/noxfile.py b/packages/google-cloud-resource-manager/noxfile.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/noxfile.py
@@ -0,0 +1,428 @@
+# -*- coding: utf-8 -*-
+#
+# Copyright 2018 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# Generated by synthtool. DO NOT EDIT!
+
+from __future__ import absolute_import
+
+import os
+import pathlib
+import re
+import shutil
+import warnings
+
+import nox
+
+BLACK_VERSION = "black==22.3.0"
+ISORT_VERSION = "isort==5.10.1"
+LINT_PATHS = ["docs", "google", "tests", "noxfile.py", "setup.py"]
+
+DEFAULT_PYTHON_VERSION = "3.9"
+
+UNIT_TEST_PYTHON_VERSIONS = ["3.7", "3.8", "3.9", "3.10", "3.11"]
+UNIT_TEST_STANDARD_DEPENDENCIES = [
+ "mock",
+ "asyncmock",
+ "pytest",
+ "pytest-cov",
+ "pytest-asyncio",
+]
+UNIT_TEST_EXTERNAL_DEPENDENCIES = []
+UNIT_TEST_LOCAL_DEPENDENCIES = []
+UNIT_TEST_DEPENDENCIES = []
+UNIT_TEST_EXTRAS = []
+UNIT_TEST_EXTRAS_BY_PYTHON = {}
+
+SYSTEM_TEST_PYTHON_VERSIONS = []
+SYSTEM_TEST_STANDARD_DEPENDENCIES = [
+ "mock",
+ "pytest",
+ "google-cloud-testutils",
+]
+SYSTEM_TEST_EXTERNAL_DEPENDENCIES = []
+SYSTEM_TEST_LOCAL_DEPENDENCIES = []
+SYSTEM_TEST_DEPENDENCIES = []
+SYSTEM_TEST_EXTRAS = []
+SYSTEM_TEST_EXTRAS_BY_PYTHON = {}
+
+CURRENT_DIRECTORY = pathlib.Path(__file__).parent.absolute()
+
+# 'docfx' is excluded since it only needs to run in 'docs-presubmit'
+nox.options.sessions = [
+ "unit",
+ "system",
+ "cover",
+ "lint",
+ "lint_setup_py",
+ "blacken",
+ "docs",
+]
+
+# Error if a python version is missing
+nox.options.error_on_missing_interpreters = True
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def lint(session):
+ """Run linters.
+
+ Returns a failure if the linters find linting errors or sufficiently
+ serious code quality issues.
+ """
+ session.install("flake8", BLACK_VERSION)
+ session.run(
+ "black",
+ "--check",
+ *LINT_PATHS,
+ )
+ session.run("flake8", "google", "tests")
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def blacken(session):
+ """Run black. Format code to uniform standard."""
+ session.install(BLACK_VERSION)
+ session.run(
+ "black",
+ *LINT_PATHS,
+ )
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def format(session):
+ """
+ Run isort to sort imports. Then run black
+ to format code to uniform standard.
+ """
+ session.install(BLACK_VERSION, ISORT_VERSION)
+ # Use the --fss option to sort imports using strict alphabetical order.
+ # See https://pycqa.github.io/isort/docs/configuration/options.html#force-sort-within-sections
+ session.run(
+ "isort",
+ "--fss",
+ *LINT_PATHS,
+ )
+ session.run(
+ "black",
+ *LINT_PATHS,
+ )
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def lint_setup_py(session):
+ """Verify that setup.py is valid (including RST check)."""
+ session.install("docutils", "pygments")
+ session.run("python", "setup.py", "check", "--restructuredtext", "--strict")
+
+
+def install_unittest_dependencies(session, *constraints):
+ standard_deps = UNIT_TEST_STANDARD_DEPENDENCIES + UNIT_TEST_DEPENDENCIES
+ session.install(*standard_deps, *constraints)
+
+ if UNIT_TEST_EXTERNAL_DEPENDENCIES:
+ warnings.warn(
+ "'unit_test_external_dependencies' is deprecated. Instead, please "
+ "use 'unit_test_dependencies' or 'unit_test_local_dependencies'.",
+ DeprecationWarning,
+ )
+ session.install(*UNIT_TEST_EXTERNAL_DEPENDENCIES, *constraints)
+
+ if UNIT_TEST_LOCAL_DEPENDENCIES:
+ session.install(*UNIT_TEST_LOCAL_DEPENDENCIES, *constraints)
+
+ if UNIT_TEST_EXTRAS_BY_PYTHON:
+ extras = UNIT_TEST_EXTRAS_BY_PYTHON.get(session.python, [])
+ elif UNIT_TEST_EXTRAS:
+ extras = UNIT_TEST_EXTRAS
+ else:
+ extras = []
+
+ if extras:
+ session.install("-e", f".[{','.join(extras)}]", *constraints)
+ else:
+ session.install("-e", ".", *constraints)
+
+
+def default(session):
+ # Install all test dependencies, then install this package in-place.
+
+ constraints_path = str(
+ CURRENT_DIRECTORY / "testing" / f"constraints-{session.python}.txt"
+ )
+ install_unittest_dependencies(session, "-c", constraints_path)
+
+ # Run py.test against the unit tests.
+ session.run(
+ "py.test",
+ "--quiet",
+ f"--junitxml=unit_{session.python}_sponge_log.xml",
+ "--cov=google",
+ "--cov=tests/unit",
+ "--cov-append",
+ "--cov-config=.coveragerc",
+ "--cov-report=",
+ "--cov-fail-under=0",
+ os.path.join("tests", "unit"),
+ *session.posargs,
+ )
+
+
+@nox.session(python=UNIT_TEST_PYTHON_VERSIONS)
+def unit(session):
+ """Run the unit test suite."""
+ default(session)
+
+
+def install_systemtest_dependencies(session, *constraints):
+
+ # Use pre-release gRPC for system tests.
+ # Exclude version 1.52.0rc1 which has a known issue.
+ # See https://github.com/grpc/grpc/issues/32163
+ session.install("--pre", "grpcio!=1.52.0rc1")
+
+ session.install(*SYSTEM_TEST_STANDARD_DEPENDENCIES, *constraints)
+
+ if SYSTEM_TEST_EXTERNAL_DEPENDENCIES:
+ session.install(*SYSTEM_TEST_EXTERNAL_DEPENDENCIES, *constraints)
+
+ if SYSTEM_TEST_LOCAL_DEPENDENCIES:
+ session.install("-e", *SYSTEM_TEST_LOCAL_DEPENDENCIES, *constraints)
+
+ if SYSTEM_TEST_DEPENDENCIES:
+ session.install("-e", *SYSTEM_TEST_DEPENDENCIES, *constraints)
+
+ if SYSTEM_TEST_EXTRAS_BY_PYTHON:
+ extras = SYSTEM_TEST_EXTRAS_BY_PYTHON.get(session.python, [])
+ elif SYSTEM_TEST_EXTRAS:
+ extras = SYSTEM_TEST_EXTRAS
+ else:
+ extras = []
+
+ if extras:
+ session.install("-e", f".[{','.join(extras)}]", *constraints)
+ else:
+ session.install("-e", ".", *constraints)
+
+
+@nox.session(python=SYSTEM_TEST_PYTHON_VERSIONS)
+def system(session):
+ """Run the system test suite."""
+ constraints_path = str(
+ CURRENT_DIRECTORY / "testing" / f"constraints-{session.python}.txt"
+ )
+ system_test_path = os.path.join("tests", "system.py")
+ system_test_folder_path = os.path.join("tests", "system")
+
+ # Check the value of `RUN_SYSTEM_TESTS` env var. It defaults to true.
+ if os.environ.get("RUN_SYSTEM_TESTS", "true") == "false":
+ session.skip("RUN_SYSTEM_TESTS is set to false, skipping")
+ # Install pyopenssl for mTLS testing.
+ if os.environ.get("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false") == "true":
+ session.install("pyopenssl")
+
+ system_test_exists = os.path.exists(system_test_path)
+ system_test_folder_exists = os.path.exists(system_test_folder_path)
+ # Sanity check: only run tests if found.
+ if not system_test_exists and not system_test_folder_exists:
+ session.skip("System tests were not found")
+
+ install_systemtest_dependencies(session, "-c", constraints_path)
+
+ # Run py.test against the system tests.
+ if system_test_exists:
+ session.run(
+ "py.test",
+ "--quiet",
+ f"--junitxml=system_{session.python}_sponge_log.xml",
+ system_test_path,
+ *session.posargs,
+ )
+ if system_test_folder_exists:
+ session.run(
+ "py.test",
+ "--quiet",
+ f"--junitxml=system_{session.python}_sponge_log.xml",
+ system_test_folder_path,
+ *session.posargs,
+ )
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def cover(session):
+ """Run the final coverage report.
+
+ This outputs the coverage report aggregating coverage from the unit
+ test runs (not system test runs), and then erases coverage data.
+ """
+ session.install("coverage", "pytest-cov")
+ session.run("coverage", "report", "--show-missing", "--fail-under=100")
+
+ session.run("coverage", "erase")
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def docs(session):
+ """Build the docs for this library."""
+
+ session.install("-e", ".")
+ session.install(
+ "sphinx==4.0.1",
+ "alabaster",
+ "recommonmark",
+ )
+
+ shutil.rmtree(os.path.join("docs", "_build"), ignore_errors=True)
+ session.run(
+ "sphinx-build",
+ "-W", # warnings as errors
+ "-T", # show full traceback on exception
+ "-N", # no colors
+ "-b",
+ "html",
+ "-d",
+ os.path.join("docs", "_build", "doctrees", ""),
+ os.path.join("docs", ""),
+ os.path.join("docs", "_build", "html", ""),
+ )
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def docfx(session):
+ """Build the docfx yaml files for this library."""
+
+ session.install("-e", ".")
+ session.install(
+ "sphinx==4.0.1",
+ "alabaster",
+ "recommonmark",
+ "gcp-sphinx-docfx-yaml",
+ )
+
+ shutil.rmtree(os.path.join("docs", "_build"), ignore_errors=True)
+ session.run(
+ "sphinx-build",
+ "-T", # show full traceback on exception
+ "-N", # no colors
+ "-D",
+ (
+ "extensions=sphinx.ext.autodoc,"
+ "sphinx.ext.autosummary,"
+ "docfx_yaml.extension,"
+ "sphinx.ext.intersphinx,"
+ "sphinx.ext.coverage,"
+ "sphinx.ext.napoleon,"
+ "sphinx.ext.todo,"
+ "sphinx.ext.viewcode,"
+ "recommonmark"
+ ),
+ "-b",
+ "html",
+ "-d",
+ os.path.join("docs", "_build", "doctrees", ""),
+ os.path.join("docs", ""),
+ os.path.join("docs", "_build", "html", ""),
+ )
+
+
+@nox.session(python="3.11")
+def prerelease_deps(session):
+ """Run all tests with prerelease versions of dependencies installed."""
+
+ # Install all dependencies
+ session.install("-e", ".[all, tests, tracing]")
+ unit_deps_all = UNIT_TEST_STANDARD_DEPENDENCIES + UNIT_TEST_EXTERNAL_DEPENDENCIES
+ session.install(*unit_deps_all)
+ system_deps_all = (
+ SYSTEM_TEST_STANDARD_DEPENDENCIES
+ + SYSTEM_TEST_EXTERNAL_DEPENDENCIES
+ + SYSTEM_TEST_EXTRAS
+ )
+ session.install(*system_deps_all)
+
+ # Because we test minimum dependency versions on the minimum Python
+ # version, the first version we test with in the unit tests sessions has a
+ # constraints file containing all dependencies and extras.
+ with open(
+ CURRENT_DIRECTORY
+ / "testing"
+ / f"constraints-{UNIT_TEST_PYTHON_VERSIONS[0]}.txt",
+ encoding="utf-8",
+ ) as constraints_file:
+ constraints_text = constraints_file.read()
+
+ # Ignore leading whitespace and comment lines.
+ constraints_deps = [
+ match.group(1)
+ for match in re.finditer(
+ r"^\s*(\S+)(?===\S+)", constraints_text, flags=re.MULTILINE
+ )
+ ]
+
+ session.install(*constraints_deps)
+
+ prerel_deps = [
+ "protobuf",
+ # dependency of grpc
+ "six",
+ "googleapis-common-protos",
+ # Exclude version 1.52.0rc1 which has a known issue. See https://github.com/grpc/grpc/issues/32163
+ "grpcio!=1.52.0rc1",
+ "grpcio-status",
+ "google-api-core",
+ "proto-plus",
+ "google-cloud-testutils",
+ # dependencies of google-cloud-testutils
+ "click",
+ ]
+
+ for dep in prerel_deps:
+ session.install("--pre", "--no-deps", "--upgrade", dep)
+
+ # Remaining dependencies
+ other_deps = [
+ "requests",
+ "google-auth",
+ ]
+ session.install(*other_deps)
+
+ # Print out prerelease package versions
+ session.run(
+ "python", "-c", "import google.protobuf; print(google.protobuf.__version__)"
+ )
+ session.run("python", "-c", "import grpc; print(grpc.__version__)")
+
+ session.run("py.test", "tests/unit")
+
+ system_test_path = os.path.join("tests", "system.py")
+ system_test_folder_path = os.path.join("tests", "system")
+
+ # Only run system tests if found.
+ if os.path.exists(system_test_path):
+ session.run(
+ "py.test",
+ "--verbose",
+ f"--junitxml=system_{session.python}_sponge_log.xml",
+ system_test_path,
+ *session.posargs,
+ )
+ if os.path.exists(system_test_folder_path):
+ session.run(
+ "py.test",
+ "--verbose",
+ f"--junitxml=system_{session.python}_sponge_log.xml",
+ system_test_folder_path,
+ *session.posargs,
+ )
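As an aside on the `prerelease_deps` session above: the pinned dependency names are pulled out of the constraints file with a lookahead regex. A small standalone sketch of that parsing (the sample constraints content below is made up for illustration; only `name==version` pins should match, while comments and non-`==` specifiers are skipped):

```python
import re

# Hypothetical constraints-file content for demonstration.
constraints_text = """\
# pinned deps
google-api-core==1.34.0
  proto-plus==1.22.0
requests>=2.0
"""

# Same pattern as the noxfile uses: a non-whitespace token at the start
# of a line (leading whitespace allowed), followed by "==<version>".
constraints_deps = [
    match.group(1)
    for match in re.finditer(
        r"^\s*(\S+)(?===\S+)", constraints_text, flags=re.MULTILINE
    )
]
```

The lookahead `(?===\S+)` keeps the `==<version>` part out of the captured group, so only the bare package names are collected.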
diff --git a/packages/google-cloud-resource-manager/pylint.config.py b/packages/google-cloud-resource-manager/pylint.config.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/pylint.config.py
@@ -0,0 +1,25 @@
+# Copyright 2017 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""This module is used to configure gcp-devrel-py-tools run-pylint."""
+
+# Library configuration
+
+# library_additions = {}
+# library_replacements = {}
+
+# Test configuration
+
+# test_additions = copy.deepcopy(library_additions)
+# test_replacements = copy.deepcopy(library_replacements)
diff --git a/packages/google-cloud-resource-manager/scripts/fixup_resourcemanager_v3_keywords.py b/packages/google-cloud-resource-manager/scripts/fixup_resourcemanager_v3_keywords.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/scripts/fixup_resourcemanager_v3_keywords.py
@@ -0,0 +1,215 @@
+#! /usr/bin/env python3
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import argparse
+import os
+import libcst as cst
+import pathlib
+import sys
+from typing import (Any, Callable, Dict, List, Sequence, Tuple)
+
+
+def partition(
+ predicate: Callable[[Any], bool],
+ iterator: Sequence[Any]
+) -> Tuple[List[Any], List[Any]]:
+ """A stable, out-of-place partition."""
+ results = ([], [])
+
+ for i in iterator:
+ results[int(predicate(i))].append(i)
+
+ # Returns trueList, falseList
+ return results[1], results[0]
+
+
+class resourcemanagerCallTransformer(cst.CSTTransformer):
+ CTRL_PARAMS: Tuple[str] = ('retry', 'timeout', 'metadata')
+ METHOD_TO_PARAMS: Dict[str, Tuple[str]] = {
+ 'create_folder': ('folder', ),
+ 'create_project': ('project', ),
+ 'create_tag_binding': ('tag_binding', 'validate_only', ),
+ 'create_tag_hold': ('parent', 'tag_hold', 'validate_only', ),
+ 'create_tag_key': ('tag_key', 'validate_only', ),
+ 'create_tag_value': ('tag_value', 'validate_only', ),
+ 'delete_folder': ('name', ),
+ 'delete_project': ('name', ),
+ 'delete_tag_binding': ('name', ),
+ 'delete_tag_hold': ('name', 'validate_only', ),
+ 'delete_tag_key': ('name', 'validate_only', 'etag', ),
+ 'delete_tag_value': ('name', 'validate_only', 'etag', ),
+ 'get_folder': ('name', ),
+ 'get_iam_policy': ('resource', 'options', ),
+ 'get_namespaced_tag_key': ('name', ),
+ 'get_namespaced_tag_value': ('name', ),
+ 'get_organization': ('name', ),
+ 'get_project': ('name', ),
+ 'get_tag_key': ('name', ),
+ 'get_tag_value': ('name', ),
+ 'list_effective_tags': ('parent', 'page_size', 'page_token', ),
+ 'list_folders': ('parent', 'page_size', 'page_token', 'show_deleted', ),
+ 'list_projects': ('parent', 'page_token', 'page_size', 'show_deleted', ),
+ 'list_tag_bindings': ('parent', 'page_size', 'page_token', ),
+ 'list_tag_holds': ('parent', 'page_size', 'page_token', 'filter', ),
+ 'list_tag_keys': ('parent', 'page_size', 'page_token', ),
+ 'list_tag_values': ('parent', 'page_size', 'page_token', ),
+ 'move_folder': ('name', 'destination_parent', ),
+ 'move_project': ('name', 'destination_parent', ),
+ 'search_folders': ('page_size', 'page_token', 'query', ),
+ 'search_organizations': ('page_size', 'page_token', 'query', ),
+ 'search_projects': ('query', 'page_token', 'page_size', ),
+ 'set_iam_policy': ('resource', 'policy', 'update_mask', ),
+ 'test_iam_permissions': ('resource', 'permissions', ),
+ 'undelete_folder': ('name', ),
+ 'undelete_project': ('name', ),
+ 'update_folder': ('folder', 'update_mask', ),
+ 'update_project': ('project', 'update_mask', ),
+ 'update_tag_key': ('tag_key', 'update_mask', 'validate_only', ),
+ 'update_tag_value': ('tag_value', 'update_mask', 'validate_only', ),
+ }
+
+ def leave_Call(self, original: cst.Call, updated: cst.Call) -> cst.CSTNode:
+ try:
+ key = original.func.attr.value
+ kword_params = self.METHOD_TO_PARAMS[key]
+ except (AttributeError, KeyError):
+ # Either not a method from the API or too convoluted to be sure.
+ return updated
+
+ # If the existing code is valid, keyword args come after positional args.
+ # Therefore, all positional args must map to the first parameters.
+ args, kwargs = partition(lambda a: not bool(a.keyword), updated.args)
+ if any(k.keyword.value == "request" for k in kwargs):
+ # We've already fixed this file, don't fix it again.
+ return updated
+
+ kwargs, ctrl_kwargs = partition(
+ lambda a: a.keyword.value not in self.CTRL_PARAMS,
+ kwargs
+ )
+
+ args, ctrl_args = args[:len(kword_params)], args[len(kword_params):]
+ ctrl_kwargs.extend(cst.Arg(value=a.value, keyword=cst.Name(value=ctrl))
+ for a, ctrl in zip(ctrl_args, self.CTRL_PARAMS))
+
+ request_arg = cst.Arg(
+ value=cst.Dict([
+ cst.DictElement(
+ cst.SimpleString("'{}'".format(name)),
+ cst.Element(value=arg.value)
+ )
+ # Note: the args + kwargs looks silly, but keep in mind that
+ # the control parameters had to be stripped out, and that
+ # those could have been passed positionally or by keyword.
+ for name, arg in zip(kword_params, args + kwargs)]),
+ keyword=cst.Name("request")
+ )
+
+ return updated.with_changes(
+ args=[request_arg] + ctrl_kwargs
+ )
+
+
+def fix_files(
+ in_dir: pathlib.Path,
+ out_dir: pathlib.Path,
+ *,
+ transformer=resourcemanagerCallTransformer(),
+):
+ """Duplicate the input dir to the output dir, fixing file method calls.
+
+ Preconditions:
+ * in_dir is a real directory
+ * out_dir is a real, empty directory
+ """
+ pyfile_gen = (
+ pathlib.Path(os.path.join(root, f))
+ for root, _, files in os.walk(in_dir)
+ for f in files if os.path.splitext(f)[1] == ".py"
+ )
+
+ for fpath in pyfile_gen:
+ with open(fpath, 'r') as f:
+ src = f.read()
+
+ # Parse the code and insert method call fixes.
+ tree = cst.parse_module(src)
+ updated = tree.visit(transformer)
+
+ # Create the path and directory structure for the new file.
+ updated_path = out_dir.joinpath(fpath.relative_to(in_dir))
+ updated_path.parent.mkdir(parents=True, exist_ok=True)
+
+ # Generate the updated source file at the corresponding path.
+ with open(updated_path, 'w') as f:
+ f.write(updated.code)
+
+
+if __name__ == '__main__':
+ parser = argparse.ArgumentParser(
+ description="""Fix up source that uses the resourcemanager client library.
+
+The existing sources are NOT overwritten but are copied to output_dir with changes made.
+
+Note: This tool operates at a best-effort level at converting positional
+ parameters in client method calls to keyword based parameters.
+ Cases where it WILL FAIL include
+ A) * or ** expansion in a method call.
+ B) Calls via function or method alias (includes free function calls)
+ C) Indirect or dispatched calls (e.g. the method is looked up dynamically)
+
+ These all constitute false negatives. The tool will also detect false
+ positives when an API method shares a name with another method.
+""")
+ parser.add_argument(
+ '-d',
+ '--input-directory',
+ required=True,
+ dest='input_dir',
+ help='the input directory to walk for python files to fix up',
+ )
+ parser.add_argument(
+ '-o',
+ '--output-directory',
+ required=True,
+ dest='output_dir',
+ help='the directory to output files fixed via un-flattening',
+ )
+ args = parser.parse_args()
+ input_dir = pathlib.Path(args.input_dir)
+ output_dir = pathlib.Path(args.output_dir)
+ if not input_dir.is_dir():
+ print(
+ f"input directory '{input_dir}' does not exist or is not a directory",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ if not output_dir.is_dir():
+ print(
+ f"output directory '{output_dir}' does not exist or is not a directory",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ if os.listdir(output_dir):
+ print(
+ f"output directory '{output_dir}' is not empty",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ fix_files(input_dir, output_dir)
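To make the transform in the script above concrete: the transformer maps flattened call arguments onto a single `request` dict while keeping the control parameters (`retry`, `timeout`, `metadata`) outside it. Here is a plain-Python sketch of that argument handling (this mimics the logic only, not the actual libcst rewrite; the sample method and values are invented):

```python
CTRL_PARAMS = ("retry", "timeout", "metadata")

def split_request_args(kword_params, args, kwargs):
    # Positional args map onto the leading keyword parameters; control
    # params stay outside the request dict, everything else goes inside.
    request = dict(zip(kword_params, args))
    ctrl = {}
    for name, value in kwargs.items():
        (ctrl if name in CTRL_PARAMS else request)[name] = value
    return request, ctrl

# e.g. client.delete_folder("folders/123", timeout=30.0) would become
# client.delete_folder(request={"name": "folders/123"}, timeout=30.0)
request, ctrl = split_request_args(("name",), ("folders/123",), {"timeout": 30.0})
```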
diff --git a/packages/google-cloud-resource-manager/scripts/readme-gen/readme_gen.py b/packages/google-cloud-resource-manager/scripts/readme-gen/readme_gen.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/scripts/readme-gen/readme_gen.py
@@ -0,0 +1,69 @@
+#!/usr/bin/env python
+
+# Copyright 2016 Google Inc
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Generates READMEs using configuration defined in yaml."""
+
+import argparse
+import io
+import os
+import subprocess
+
+import jinja2
+import yaml
+
+
+jinja_env = jinja2.Environment(
+ trim_blocks=True,
+ loader=jinja2.FileSystemLoader(
+ os.path.abspath(os.path.join(os.path.dirname(__file__), "templates"))
+ ),
+ autoescape=True,
+)
+
+README_TMPL = jinja_env.get_template('README.tmpl.rst')
+
+
+def get_help(file):
+ return subprocess.check_output(['python', file, '--help']).decode()
+
+
+def main():
+ parser = argparse.ArgumentParser()
+ parser.add_argument('source')
+ parser.add_argument('--destination', default='README.rst')
+
+ args = parser.parse_args()
+
+ source = os.path.abspath(args.source)
+ root = os.path.dirname(source)
+ destination = os.path.join(root, args.destination)
+
+ jinja_env.globals['get_help'] = get_help
+
+ with io.open(source, 'r') as f:
+ config = yaml.safe_load(f)
+
+ # This allows get_help to execute in the right directory.
+ os.chdir(root)
+
+ output = README_TMPL.render(config)
+
+ with io.open(destination, 'w') as f:
+ f.write(output)
+
+
+if __name__ == '__main__':
+ main()
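The `get_help` helper above shells out to capture a sample script's `--help` text for the template. The same capture pattern can be sketched with the stdlib alone (using `sys.executable` and a trivial `-c` script here instead of a real sample file, so the sketch is self-contained):

```python
import subprocess
import sys

def get_help(argv):
    # Run a child Python process and return its decoded stdout,
    # mirroring the readme generator's get_help().
    return subprocess.check_output([sys.executable, *argv]).decode()

output = get_help(["-c", "print('usage: demo [-h]')"])
```

Using `sys.executable` instead of the bare `'python'` makes the sketch independent of what is on `PATH`.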
diff --git a/packages/google-cloud-resource-manager/setup.py b/packages/google-cloud-resource-manager/setup.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-resource-manager/setup.py
@@ -0,0 +1,93 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import io
+import os
+
+import setuptools # type: ignore
+
+package_root = os.path.abspath(os.path.dirname(__file__))
+
+name = "google-cloud-resource-manager"
+
+
+description = "Google Cloud Resource Manager API client library"
+
+version = {}
+with open(
+ os.path.join(package_root, "google/cloud/resourcemanager/gapic_version.py")
+) as fp:
+ exec(fp.read(), version)
+version = version["__version__"]
+
+if version[0] == "0":
+ release_status = "Development Status :: 4 - Beta"
+else:
+ release_status = "Development Status :: 5 - Production/Stable"
+
+dependencies = [
+ "google-api-core[grpc] >= 1.34.0, <3.0.0dev,!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,!=2.10.*",
+ "proto-plus >= 1.22.0, <2.0.0dev",
+ "proto-plus >= 1.22.2, <2.0.0dev; python_version>='3.11'",
+ "protobuf>=3.19.5,<5.0.0dev,!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5",
+ "grpc-google-iam-v1 >= 0.12.4, <1.0.0dev",
+]
+url = "https://github.com/googleapis/google-cloud-python"
+
+package_root = os.path.abspath(os.path.dirname(__file__))
+
+readme_filename = os.path.join(package_root, "README.rst")
+with io.open(readme_filename, encoding="utf-8") as readme_file:
+ readme = readme_file.read()
+
+packages = [
+ package
+ for package in setuptools.PEP420PackageFinder.find()
+ if package.startswith("google")
+]
+
+namespaces = ["google", "google.cloud"]
+
+setuptools.setup(
+ name=name,
+ version=version,
+ description=description,
+ long_description=readme,
+ author="Google LLC",
+ author_email="googleapis-packages@google.com",
+ license="Apache 2.0",
+ url=url,
+ classifiers=[
+ release_status,
+ "Intended Audience :: Developers",
+ "License :: OSI Approved :: Apache Software License",
+ "Programming Language :: Python",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3.7",
+ "Programming Language :: Python :: 3.8",
+ "Programming Language :: Python :: 3.9",
+ "Programming Language :: Python :: 3.10",
+ "Programming Language :: Python :: 3.11",
+ "Operating System :: OS Independent",
+ "Topic :: Internet",
+ ],
+ platforms="Posix; MacOS X; Windows",
+ packages=packages,
+ python_requires=">=3.7",
+ namespace_packages=namespaces,
+ install_requires=dependencies,
+ include_package_data=True,
+ zip_safe=False,
+)
| Silent Failure Uploading file to Bucket Configured as a Website
While testing a file upload to a bucket, the upload silently fails if the bucket is configured as a website. Here is the sample code being used:
``` py
import gcloud.storage
connection = gcloud.storage.get_connection('project-id', 'email@developer.gserviceaccount.com', 'key-file.key')
bucket = connection.get_bucket('some.website.com')
path = 'answer.html'
file_key = bucket.new_key(path)
file_key.set_contents_from_string('42')
key = bucket.get_key(path)
print key.get_contents_as_string()
```
When I run this against a normal bucket, the file is saved and read correctly. When run against a bucket that is configured as a website, the upload silently fails, and the subsequent read is unable to retrieve the value from the bucket to print.
BigQuery TIMESTAMP field off by factor of 1000
Using the master branch. In BigQuery, TIMESTAMP fields are off by a factor of 1000 when using `fetch_data`.
See my in-line comment - https://github.com/GoogleCloudPlatform/gcloud-python/commit/41e64395661a31af8352fb2358e5c95ff5437830#diff-c56e349a3e511bfabb4e8dfcfe686bc8R739
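A factor-of-1000 error like this typically comes from mixing up milliseconds and microseconds when converting an epoch value. A hedged sketch of the correct conversion (assuming, as the linked commit suggests, that the API returns the timestamp as seconds since the Unix epoch; the helper name is illustrative, not the library's actual code):

```python
import datetime

_EPOCH = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)

def timestamp_from_seconds(value):
    # Seconds -> microseconds needs a factor of 1e6; using 1e3
    # (milliseconds) instead yields results off by exactly 1000x.
    micros = int(round(float(value) * 1_000_000))
    return _EPOCH + datetime.timedelta(microseconds=micros)

ts = timestamp_from_seconds("1.5")  # 1970-01-01 00:00:01.500000+00:00
```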
Scheduler needs to shutdown in a way compatible with SimpleQueue and Queue types for worker queue.
Currently one test is failing for the scheduler. The issue is that Python 3.7 changed the executor's `_work_queue` to be a `SimpleQueue`.
```
Traceback (most recent call last):
  File "/Users/crwilcox/workspace/google-cloud-python/pubsub/tests/unit/pubsub_v1/subscriber/test_scheduler.py", line 51, in test_schedule
    scheduler_.shutdown()
  File "/Users/crwilcox/workspace/google-cloud-python/pubsub/google/cloud/pubsub_v1/subscriber/scheduler.py", line 121, in shutdown
    self._executor._work_queue.
AttributeError: '_queue.SimpleQueue' object has no attribute 'queue'
```
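The failure is that `ThreadPoolExecutor._work_queue` became a `queue.SimpleQueue` in Python 3.7, and `SimpleQueue` has no `.queue` attribute to inspect. A version-agnostic shutdown has to drain pending items through the interface both types share instead; a minimal sketch (the drain helper is illustrative, not the library's actual fix):

```python
import queue

def drain_work_queue(work_queue):
    # Pull all currently queued items via get_nowait(), which both
    # queue.Queue and queue.SimpleQueue support; the .queue deque is
    # only available on queue.Queue, so it cannot be relied on.
    dropped = []
    while True:
        try:
            dropped.append(work_queue.get_nowait())
        except queue.Empty:
            return dropped
```

Draining this way works identically for either queue type, which is what the issue title asks the scheduler's `shutdown()` to guarantee.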
| I can't seem to reproduce it. I created a bucket and configured it as a website using bucket.configure_website('index.html'), but even then the key.get_contents_as_string() call succeeded. Am I missing something or is there any chance this was fixed in the meantime?
I'm doing some more testing on this to make sure, but it may have something to do with using a secondary key for the service account. I had generated a new key for the service account since I didn't have access to the original key, and it was giving me some forbidden errors when I tried to create a new bucket.
Should there be a problem using multiple keys for a service account? Or should any of the generated keys work correctly?
I tested again with a fresh service account, but trying to call `bucket.configure_website('index.html')` gives a permission error on an existing bucket that is already serving as a website:
```
{
"error": {
"errors": [
{
"domain": "global",
"reason": "forbidden",
"message": "Forbidden"
}
],
"code": 403,
"message": "Forbidden"
}
}
```
Trying to create a new bucket is giving this error:
```
{
"error": {
"errors": [
{
"domain": "global",
"reason": "forbidden",
"message": "The bucket you tried to create is a domain name owned by another user."
}
],
"code": 403,
"message": "The bucket you tried to create is a domain name owned by another user."
}
}
```
This is the same service account I am able to use to create the same domain buckets in the Cloud Console just fine, so there shouldn't be any issue with domain ownership.
Closing this out since we could not reproduce.
@gsalgado Please re-open if you are still having this issue.
Sorry just saw this. Closing in favor of #1125 (a bit more detail there)
| 2023-06-05T16:34:41Z | [] | [] |
Traceback (most recent call last):
File "/Users/crwilcox/workspace/google-cloud-python/pubsub/tests/unit/pubsub_v1/subscriber/test_scheduler.py", line 51, in test_schedule
scheduler_.shutdown()
File "/Users/crwilcox/workspace/google-cloud-python/pubsub/google/cloud/pubsub_v1/subscriber/scheduler.py", line 121, in shutdown
self._executor._work_queue.
AttributeError: '_queue.SimpleQueue' object has no attribute 'queue'
| 5,725 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-11356 | 3a6894e831350094a2b4bf12bdca63b484b3da15 | diff --git a/packages/google-cloud-websecurityscanner/docs/conf.py b/packages/google-cloud-websecurityscanner/docs/conf.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/docs/conf.py
@@ -0,0 +1,384 @@
+# -*- coding: utf-8 -*-
+# Copyright 2021 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+# google-cloud-websecurityscanner documentation build configuration file
+#
+# This file is execfile()d with the current directory set to its
+# containing dir.
+#
+# Note that not all possible configuration values are present in this
+# autogenerated file.
+#
+# All configuration values have a default; values that are commented out
+# serve to show the default.
+
+import os
+import shlex
+import sys
+
+# If extensions (or modules to document with autodoc) are in another directory,
+# add these directories to sys.path here. If the directory is relative to the
+# documentation root, use os.path.abspath to make it absolute, like shown here.
+sys.path.insert(0, os.path.abspath(".."))
+
+# For plugins that can not read conf.py.
+# See also: https://github.com/docascode/sphinx-docfx-yaml/issues/85
+sys.path.insert(0, os.path.abspath("."))
+
+__version__ = ""
+
+# -- General configuration ------------------------------------------------
+
+# If your documentation needs a minimal Sphinx version, state it here.
+needs_sphinx = "1.5.5"
+
+# Add any Sphinx extension module names here, as strings. They can be
+# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
+# ones.
+extensions = [
+ "sphinx.ext.autodoc",
+ "sphinx.ext.autosummary",
+ "sphinx.ext.intersphinx",
+ "sphinx.ext.coverage",
+ "sphinx.ext.doctest",
+ "sphinx.ext.napoleon",
+ "sphinx.ext.todo",
+ "sphinx.ext.viewcode",
+ "recommonmark",
+]
+
+# autodoc/autosummary flags
+autoclass_content = "both"
+autodoc_default_options = {"members": True}
+autosummary_generate = True
+
+
+# Add any paths that contain templates here, relative to this directory.
+templates_path = ["_templates"]
+
+# The suffix(es) of source filenames.
+# You can specify multiple suffix as a list of string:
+# source_suffix = ['.rst', '.md']
+source_suffix = [".rst", ".md"]
+
+# The encoding of source files.
+# source_encoding = 'utf-8-sig'
+
+# The root toctree document.
+root_doc = "index"
+
+# General information about the project.
+project = "google-cloud-websecurityscanner"
+copyright = "2019, Google"
+author = "Google APIs"
+
+# The version info for the project you're documenting, acts as replacement for
+# |version| and |release|, also used in various other places throughout the
+# built documents.
+#
+# The full version, including alpha/beta/rc tags.
+release = __version__
+# The short X.Y version.
+version = ".".join(release.split(".")[0:2])
+
+# The language for content autogenerated by Sphinx. Refer to documentation
+# for a list of supported languages.
+#
+# This is also used if you do content translation via gettext catalogs.
+# Usually you set "language" from the command line for these cases.
+language = None
+
+# There are two options for replacing |today|: either, you set today to some
+# non-false value, then it is used:
+# today = ''
+# Else, today_fmt is used as the format for a strftime call.
+# today_fmt = '%B %d, %Y'
+
+# List of patterns, relative to source directory, that match files and
+# directories to ignore when looking for source files.
+exclude_patterns = [
+ "_build",
+ "**/.nox/**/*",
+ "samples/AUTHORING_GUIDE.md",
+ "samples/CONTRIBUTING.md",
+ "samples/snippets/README.rst",
+]
+
+# The reST default role (used for this markup: `text`) to use for all
+# documents.
+# default_role = None
+
+# If true, '()' will be appended to :func: etc. cross-reference text.
+# add_function_parentheses = True
+
+# If true, the current module name will be prepended to all description
+# unit titles (such as .. function::).
+# add_module_names = True
+
+# If true, sectionauthor and moduleauthor directives will be shown in the
+# output. They are ignored by default.
+# show_authors = False
+
+# The name of the Pygments (syntax highlighting) style to use.
+pygments_style = "sphinx"
+
+# A list of ignored prefixes for module index sorting.
+# modindex_common_prefix = []
+
+# If true, keep warnings as "system message" paragraphs in the built documents.
+# keep_warnings = False
+
+# If true, `todo` and `todoList` produce output, else they produce nothing.
+todo_include_todos = True
+
+
+# -- Options for HTML output ----------------------------------------------
+
+# The theme to use for HTML and HTML Help pages. See the documentation for
+# a list of builtin themes.
+html_theme = "alabaster"
+
+# Theme options are theme-specific and customize the look and feel of a theme
+# further. For a list of options available for each theme, see the
+# documentation.
+html_theme_options = {
+ "description": "Google Cloud Client Libraries for google-cloud-websecurityscanner",
+ "github_user": "googleapis",
+ "github_repo": "python-websecurityscanner",
+ "github_banner": True,
+ "font_family": "'Roboto', Georgia, sans",
+ "head_font_family": "'Roboto', Georgia, serif",
+ "code_font_family": "'Roboto Mono', 'Consolas', monospace",
+}
+
+# Add any paths that contain custom themes here, relative to this directory.
+# html_theme_path = []
+
+# The name for this set of Sphinx documents. If None, it defaults to
+# "<project> v<release> documentation".
+# html_title = None
+
+# A shorter title for the navigation bar. Default is the same as html_title.
+# html_short_title = None
+
+# The name of an image file (relative to this directory) to place at the top
+# of the sidebar.
+# html_logo = None
+
+# The name of an image file (within the static path) to use as favicon of the
+# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
+# pixels large.
+# html_favicon = None
+
+# Add any paths that contain custom static files (such as style sheets) here,
+# relative to this directory. They are copied after the builtin static files,
+# so a file named "default.css" will overwrite the builtin "default.css".
+html_static_path = ["_static"]
+
+# Add any extra paths that contain custom files (such as robots.txt or
+# .htaccess) here, relative to this directory. These files are copied
+# directly to the root of the documentation.
+# html_extra_path = []
+
+# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
+# using the given strftime format.
+# html_last_updated_fmt = '%b %d, %Y'
+
+# If true, SmartyPants will be used to convert quotes and dashes to
+# typographically correct entities.
+# html_use_smartypants = True
+
+# Custom sidebar templates, maps document names to template names.
+# html_sidebars = {}
+
+# Additional templates that should be rendered to pages, maps page names to
+# template names.
+# html_additional_pages = {}
+
+# If false, no module index is generated.
+# html_domain_indices = True
+
+# If false, no index is generated.
+# html_use_index = True
+
+# If true, the index is split into individual pages for each letter.
+# html_split_index = False
+
+# If true, links to the reST sources are added to the pages.
+# html_show_sourcelink = True
+
+# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
+# html_show_sphinx = True
+
+# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
+# html_show_copyright = True
+
+# If true, an OpenSearch description file will be output, and all pages will
+# contain a <link> tag referring to it. The value of this option must be the
+# base URL from which the finished HTML is served.
+# html_use_opensearch = ''
+
+# This is the file name suffix for HTML files (e.g. ".xhtml").
+# html_file_suffix = None
+
+# Language to be used for generating the HTML full-text search index.
+# Sphinx supports the following languages:
+# 'da', 'de', 'en', 'es', 'fi', 'fr', 'hu', 'it', 'ja'
+# 'nl', 'no', 'pt', 'ro', 'ru', 'sv', 'tr'
+# html_search_language = 'en'
+
+# A dictionary with options for the search language support, empty by default.
+# Now only 'ja' uses this config value
+# html_search_options = {'type': 'default'}
+
+# The name of a javascript file (relative to the configuration directory) that
+# implements a search results scorer. If empty, the default will be used.
+# html_search_scorer = 'scorer.js'
+
+# Output file base name for HTML help builder.
+htmlhelp_basename = "google-cloud-websecurityscanner-doc"
+
+# -- Options for warnings ------------------------------------------------------
+
+
+suppress_warnings = [
+ # Temporarily suppress this to avoid "more than one target found for
+ # cross-reference" warning, which are intractable for us to avoid while in
+ # a mono-repo.
+ # See https://github.com/sphinx-doc/sphinx/blob
+ # /2a65ffeef5c107c19084fabdd706cdff3f52d93c/sphinx/domains/python.py#L843
+ "ref.python"
+]
+
+# -- Options for LaTeX output ---------------------------------------------
+
+latex_elements = {
+ # The paper size ('letterpaper' or 'a4paper').
+ #'papersize': 'letterpaper',
+ # The font size ('10pt', '11pt' or '12pt').
+ #'pointsize': '10pt',
+ # Additional stuff for the LaTeX preamble.
+ #'preamble': '',
+ # Latex figure (float) alignment
+ #'figure_align': 'htbp',
+}
+
+# Grouping the document tree into LaTeX files. List of tuples
+# (source start file, target name, title,
+# author, documentclass [howto, manual, or own class]).
+latex_documents = [
+ (
+ root_doc,
+ "google-cloud-websecurityscanner.tex",
+ "google-cloud-websecurityscanner Documentation",
+ author,
+ "manual",
+ )
+]
+
+# The name of an image file (relative to this directory) to place at the top of
+# the title page.
+# latex_logo = None
+
+# For "manual" documents, if this is true, then toplevel headings are parts,
+# not chapters.
+# latex_use_parts = False
+
+# If true, show page references after internal links.
+# latex_show_pagerefs = False
+
+# If true, show URL addresses after external links.
+# latex_show_urls = False
+
+# Documents to append as an appendix to all manuals.
+# latex_appendices = []
+
+# If false, no module index is generated.
+# latex_domain_indices = True
+
+
+# -- Options for manual page output ---------------------------------------
+
+# One entry per manual page. List of tuples
+# (source start file, name, description, authors, manual section).
+man_pages = [
+ (
+ root_doc,
+ "google-cloud-websecurityscanner",
+ "google-cloud-websecurityscanner Documentation",
+ [author],
+ 1,
+ )
+]
+
+# If true, show URL addresses after external links.
+# man_show_urls = False
+
+
+# -- Options for Texinfo output -------------------------------------------
+
+# Grouping the document tree into Texinfo files. List of tuples
+# (source start file, target name, title, author,
+# dir menu entry, description, category)
+texinfo_documents = [
+ (
+ root_doc,
+ "google-cloud-websecurityscanner",
+ "google-cloud-websecurityscanner Documentation",
+ author,
+ "google-cloud-websecurityscanner",
+ "google-cloud-websecurityscanner Library",
+ "APIs",
+ )
+]
+
+# Documents to append as an appendix to all manuals.
+# texinfo_appendices = []
+
+# If false, no module index is generated.
+# texinfo_domain_indices = True
+
+# How to display URL addresses: 'footnote', 'no', or 'inline'.
+# texinfo_show_urls = 'footnote'
+
+# If true, do not generate a @detailmenu in the "Top" node's menu.
+# texinfo_no_detailmenu = False
+
+
+# Example configuration for intersphinx: refer to the Python standard library.
+intersphinx_mapping = {
+ "python": ("https://python.readthedocs.org/en/latest/", None),
+ "google-auth": ("https://googleapis.dev/python/google-auth/latest/", None),
+ "google.api_core": (
+ "https://googleapis.dev/python/google-api-core/latest/",
+ None,
+ ),
+ "grpc": ("https://grpc.github.io/grpc/python/", None),
+ "proto-plus": ("https://proto-plus-python.readthedocs.io/en/latest/", None),
+ "protobuf": ("https://googleapis.dev/python/protobuf/latest/", None),
+}
+
+
+# Napoleon settings
+napoleon_google_docstring = True
+napoleon_numpy_docstring = True
+napoleon_include_private_with_doc = False
+napoleon_include_special_with_doc = True
+napoleon_use_admonition_for_examples = False
+napoleon_use_admonition_for_notes = False
+napoleon_use_admonition_for_references = False
+napoleon_use_ivar = False
+napoleon_use_param = True
+napoleon_use_rtype = True
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner/__init__.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner/__init__.py
@@ -0,0 +1,107 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from google.cloud.websecurityscanner import gapic_version as package_version
+
+__version__ = package_version.__version__
+
+
+from google.cloud.websecurityscanner_v1.services.web_security_scanner.async_client import (
+ WebSecurityScannerAsyncClient,
+)
+from google.cloud.websecurityscanner_v1.services.web_security_scanner.client import (
+ WebSecurityScannerClient,
+)
+from google.cloud.websecurityscanner_v1.types.crawled_url import CrawledUrl
+from google.cloud.websecurityscanner_v1.types.finding import Finding
+from google.cloud.websecurityscanner_v1.types.finding_addon import (
+ Form,
+ OutdatedLibrary,
+ ViolatingResource,
+ VulnerableHeaders,
+ VulnerableParameters,
+ Xss,
+ Xxe,
+)
+from google.cloud.websecurityscanner_v1.types.finding_type_stats import FindingTypeStats
+from google.cloud.websecurityscanner_v1.types.scan_config import ScanConfig
+from google.cloud.websecurityscanner_v1.types.scan_config_error import ScanConfigError
+from google.cloud.websecurityscanner_v1.types.scan_run import ScanRun
+from google.cloud.websecurityscanner_v1.types.scan_run_error_trace import (
+ ScanRunErrorTrace,
+)
+from google.cloud.websecurityscanner_v1.types.scan_run_log import ScanRunLog
+from google.cloud.websecurityscanner_v1.types.scan_run_warning_trace import (
+ ScanRunWarningTrace,
+)
+from google.cloud.websecurityscanner_v1.types.web_security_scanner import (
+ CreateScanConfigRequest,
+ DeleteScanConfigRequest,
+ GetFindingRequest,
+ GetScanConfigRequest,
+ GetScanRunRequest,
+ ListCrawledUrlsRequest,
+ ListCrawledUrlsResponse,
+ ListFindingsRequest,
+ ListFindingsResponse,
+ ListFindingTypeStatsRequest,
+ ListFindingTypeStatsResponse,
+ ListScanConfigsRequest,
+ ListScanConfigsResponse,
+ ListScanRunsRequest,
+ ListScanRunsResponse,
+ StartScanRunRequest,
+ StopScanRunRequest,
+ UpdateScanConfigRequest,
+)
+
+__all__ = (
+ "WebSecurityScannerClient",
+ "WebSecurityScannerAsyncClient",
+ "CrawledUrl",
+ "Finding",
+ "Form",
+ "OutdatedLibrary",
+ "ViolatingResource",
+ "VulnerableHeaders",
+ "VulnerableParameters",
+ "Xss",
+ "Xxe",
+ "FindingTypeStats",
+ "ScanConfig",
+ "ScanConfigError",
+ "ScanRun",
+ "ScanRunErrorTrace",
+ "ScanRunLog",
+ "ScanRunWarningTrace",
+ "CreateScanConfigRequest",
+ "DeleteScanConfigRequest",
+ "GetFindingRequest",
+ "GetScanConfigRequest",
+ "GetScanRunRequest",
+ "ListCrawledUrlsRequest",
+ "ListCrawledUrlsResponse",
+ "ListFindingsRequest",
+ "ListFindingsResponse",
+ "ListFindingTypeStatsRequest",
+ "ListFindingTypeStatsResponse",
+ "ListScanConfigsRequest",
+ "ListScanConfigsResponse",
+ "ListScanRunsRequest",
+ "ListScanRunsResponse",
+ "StartScanRunRequest",
+ "StopScanRunRequest",
+ "UpdateScanConfigRequest",
+)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner/gapic_version.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner/gapic_version.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner/gapic_version.py
@@ -0,0 +1,16 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+__version__ = "1.12.1" # {x-release-please-version}
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/__init__.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/__init__.py
@@ -0,0 +1,101 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from google.cloud.websecurityscanner_v1 import gapic_version as package_version
+
+__version__ = package_version.__version__
+
+
+from .services.web_security_scanner import (
+ WebSecurityScannerAsyncClient,
+ WebSecurityScannerClient,
+)
+from .types.crawled_url import CrawledUrl
+from .types.finding import Finding
+from .types.finding_addon import (
+ Form,
+ OutdatedLibrary,
+ ViolatingResource,
+ VulnerableHeaders,
+ VulnerableParameters,
+ Xss,
+ Xxe,
+)
+from .types.finding_type_stats import FindingTypeStats
+from .types.scan_config import ScanConfig
+from .types.scan_config_error import ScanConfigError
+from .types.scan_run import ScanRun
+from .types.scan_run_error_trace import ScanRunErrorTrace
+from .types.scan_run_log import ScanRunLog
+from .types.scan_run_warning_trace import ScanRunWarningTrace
+from .types.web_security_scanner import (
+ CreateScanConfigRequest,
+ DeleteScanConfigRequest,
+ GetFindingRequest,
+ GetScanConfigRequest,
+ GetScanRunRequest,
+ ListCrawledUrlsRequest,
+ ListCrawledUrlsResponse,
+ ListFindingsRequest,
+ ListFindingsResponse,
+ ListFindingTypeStatsRequest,
+ ListFindingTypeStatsResponse,
+ ListScanConfigsRequest,
+ ListScanConfigsResponse,
+ ListScanRunsRequest,
+ ListScanRunsResponse,
+ StartScanRunRequest,
+ StopScanRunRequest,
+ UpdateScanConfigRequest,
+)
+
+__all__ = (
+ "WebSecurityScannerAsyncClient",
+ "CrawledUrl",
+ "CreateScanConfigRequest",
+ "DeleteScanConfigRequest",
+ "Finding",
+ "FindingTypeStats",
+ "Form",
+ "GetFindingRequest",
+ "GetScanConfigRequest",
+ "GetScanRunRequest",
+ "ListCrawledUrlsRequest",
+ "ListCrawledUrlsResponse",
+ "ListFindingTypeStatsRequest",
+ "ListFindingTypeStatsResponse",
+ "ListFindingsRequest",
+ "ListFindingsResponse",
+ "ListScanConfigsRequest",
+ "ListScanConfigsResponse",
+ "ListScanRunsRequest",
+ "ListScanRunsResponse",
+ "OutdatedLibrary",
+ "ScanConfig",
+ "ScanConfigError",
+ "ScanRun",
+ "ScanRunErrorTrace",
+ "ScanRunLog",
+ "ScanRunWarningTrace",
+ "StartScanRunRequest",
+ "StopScanRunRequest",
+ "UpdateScanConfigRequest",
+ "ViolatingResource",
+ "VulnerableHeaders",
+ "VulnerableParameters",
+ "WebSecurityScannerClient",
+ "Xss",
+ "Xxe",
+)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/gapic_version.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/gapic_version.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/gapic_version.py
@@ -0,0 +1,16 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+__version__ = "1.12.1" # {x-release-please-version}
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/__init__.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/__init__.py
@@ -0,0 +1,15 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/__init__.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/__init__.py
@@ -0,0 +1,22 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .async_client import WebSecurityScannerAsyncClient
+from .client import WebSecurityScannerClient
+
+__all__ = (
+ "WebSecurityScannerClient",
+ "WebSecurityScannerAsyncClient",
+)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/async_client.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/async_client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/async_client.py
@@ -0,0 +1,1407 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import functools
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+)
+
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.api_core.client_options import ClientOptions
+from google.auth import credentials as ga_credentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.websecurityscanner_v1 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.websecurityscanner_v1.services.web_security_scanner import pagers
+from google.cloud.websecurityscanner_v1.types import (
+ crawled_url,
+ finding,
+ finding_addon,
+ finding_type_stats,
+ scan_config,
+ scan_run,
+ scan_run_error_trace,
+ scan_run_warning_trace,
+ web_security_scanner,
+)
+
+from .client import WebSecurityScannerClient
+from .transports.base import DEFAULT_CLIENT_INFO, WebSecurityScannerTransport
+from .transports.grpc_asyncio import WebSecurityScannerGrpcAsyncIOTransport
+
+
+class WebSecurityScannerAsyncClient:
+ """Web Security Scanner Service identifies security
+ vulnerabilities in web applications hosted on Google Cloud. It
+ crawls your application, and attempts to exercise as many user
+ inputs and event handlers as possible.
+ """
+
+ _client: WebSecurityScannerClient
+
+ DEFAULT_ENDPOINT = WebSecurityScannerClient.DEFAULT_ENDPOINT
+ DEFAULT_MTLS_ENDPOINT = WebSecurityScannerClient.DEFAULT_MTLS_ENDPOINT
+
+ finding_path = staticmethod(WebSecurityScannerClient.finding_path)
+ parse_finding_path = staticmethod(WebSecurityScannerClient.parse_finding_path)
+ common_billing_account_path = staticmethod(
+ WebSecurityScannerClient.common_billing_account_path
+ )
+ parse_common_billing_account_path = staticmethod(
+ WebSecurityScannerClient.parse_common_billing_account_path
+ )
+ common_folder_path = staticmethod(WebSecurityScannerClient.common_folder_path)
+ parse_common_folder_path = staticmethod(
+ WebSecurityScannerClient.parse_common_folder_path
+ )
+ common_organization_path = staticmethod(
+ WebSecurityScannerClient.common_organization_path
+ )
+ parse_common_organization_path = staticmethod(
+ WebSecurityScannerClient.parse_common_organization_path
+ )
+ common_project_path = staticmethod(WebSecurityScannerClient.common_project_path)
+ parse_common_project_path = staticmethod(
+ WebSecurityScannerClient.parse_common_project_path
+ )
+ common_location_path = staticmethod(WebSecurityScannerClient.common_location_path)
+ parse_common_location_path = staticmethod(
+ WebSecurityScannerClient.parse_common_location_path
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ WebSecurityScannerAsyncClient: The constructed client.
+ """
+ return WebSecurityScannerClient.from_service_account_info.__func__(WebSecurityScannerAsyncClient, info, *args, **kwargs) # type: ignore
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ WebSecurityScannerAsyncClient: The constructed client.
+ """
+ return WebSecurityScannerClient.from_service_account_file.__func__(WebSecurityScannerAsyncClient, filename, *args, **kwargs) # type: ignore
+
+ from_service_account_json = from_service_account_file
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+        (2) if `GOOGLE_API_USE_MTLS_ENDPOINT` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ return WebSecurityScannerClient.get_mtls_endpoint_and_cert_source(client_options) # type: ignore
+
+ @property
+ def transport(self) -> WebSecurityScannerTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ WebSecurityScannerTransport: The transport used by the client instance.
+ """
+ return self._client.transport
+
+ get_transport_class = functools.partial(
+ type(WebSecurityScannerClient).get_transport_class,
+ type(WebSecurityScannerClient),
+ )
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Union[str, WebSecurityScannerTransport] = "grpc_asyncio",
+ client_options: Optional[ClientOptions] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the web security scanner client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, ~.WebSecurityScannerTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (ClientOptions): Custom options for the client. It
+ won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+
+ Raises:
+            google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ self._client = WebSecurityScannerClient(
+ credentials=credentials,
+ transport=transport,
+ client_options=client_options,
+ client_info=client_info,
+ )
+
+ async def create_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.CreateScanConfigRequest, dict]
+ ] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_config.ScanConfig:
+ r"""Creates a new ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ async def sample_create_scan_config():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.CreateScanConfigRequest(
+ )
+
+ # Make the request
+ response = await client.create_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1.types.CreateScanConfigRequest, dict]]):
+ The request object. Request for the ``CreateScanConfig`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.types.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ request = web_security_scanner.CreateScanConfigRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.create_scan_config,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def delete_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.DeleteScanConfigRequest, dict]
+ ] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> None:
+ r"""Deletes an existing ScanConfig and its child
+ resources.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ async def sample_delete_scan_config():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.DeleteScanConfigRequest(
+ )
+
+ # Make the request
+ await client.delete_scan_config(request=request)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1.types.DeleteScanConfigRequest, dict]]):
+ The request object. Request for the ``DeleteScanConfig`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ # Create or coerce a protobuf request object.
+ request = web_security_scanner.DeleteScanConfigRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.delete_scan_config,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ async def get_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.GetScanConfigRequest, dict]
+ ] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_config.ScanConfig:
+ r"""Gets a ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ async def sample_get_scan_config():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.GetScanConfigRequest(
+ )
+
+ # Make the request
+ response = await client.get_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1.types.GetScanConfigRequest, dict]]):
+ The request object. Request for the ``GetScanConfig`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.types.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ request = web_security_scanner.GetScanConfigRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_scan_config,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_scan_configs(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListScanConfigsRequest, dict]
+ ] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListScanConfigsAsyncPager:
+ r"""Lists ScanConfigs under a given project.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ async def sample_list_scan_configs():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.ListScanConfigsRequest(
+ )
+
+ # Make the request
+ page_result = client.list_scan_configs(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1.types.ListScanConfigsRequest, dict]]):
+ The request object. Request for the ``ListScanConfigs`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.services.web_security_scanner.pagers.ListScanConfigsAsyncPager:
+ Response for the ListScanConfigs method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ request = web_security_scanner.ListScanConfigsRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_scan_configs,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListScanConfigsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def update_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.UpdateScanConfigRequest, dict]
+ ] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_config.ScanConfig:
+        r"""Updates a ScanConfig. This method supports partial
+        update of a ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ async def sample_update_scan_config():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.UpdateScanConfigRequest(
+ )
+
+ # Make the request
+ response = await client.update_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1.types.UpdateScanConfigRequest, dict]]):
+ The request object. Request for the ``UpdateScanConfigRequest`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.types.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ request = web_security_scanner.UpdateScanConfigRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.update_scan_config,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("scan_config.name", request.scan_config.name),)
+ ),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def start_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.StartScanRunRequest, dict]] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Start a ScanRun according to the given ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ async def sample_start_scan_run():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.StartScanRunRequest(
+ )
+
+ # Make the request
+ response = await client.start_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1.types.StartScanRunRequest, dict]]):
+ The request object. Request for the ``StartScanRun`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.types.ScanRun:
+                A ScanRun is an output-only resource
+                representing an actual run of the scan.
+ Next id: 12
+
+ """
+ # Create or coerce a protobuf request object.
+ request = web_security_scanner.StartScanRunRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.start_scan_run,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.GetScanRunRequest, dict]] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Gets a ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ async def sample_get_scan_run():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.GetScanRunRequest(
+ )
+
+ # Make the request
+ response = await client.get_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1.types.GetScanRunRequest, dict]]):
+ The request object. Request for the ``GetScanRun`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.types.ScanRun:
+                A ScanRun is an output-only resource
+                representing an actual run of the scan.
+ Next id: 12
+
+ """
+ # Create or coerce a protobuf request object.
+ request = web_security_scanner.GetScanRunRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_scan_run,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_scan_runs(
+ self,
+ request: Optional[Union[web_security_scanner.ListScanRunsRequest, dict]] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListScanRunsAsyncPager:
+ r"""Lists ScanRuns under a given ScanConfig, in
+ descending order of ScanRun stop time.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ async def sample_list_scan_runs():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.ListScanRunsRequest(
+ )
+
+ # Make the request
+ page_result = client.list_scan_runs(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1.types.ListScanRunsRequest, dict]]):
+ The request object. Request for the ``ListScanRuns`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.services.web_security_scanner.pagers.ListScanRunsAsyncPager:
+ Response for the ListScanRuns method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ request = web_security_scanner.ListScanRunsRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_scan_runs,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListScanRunsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def stop_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.StopScanRunRequest, dict]] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Stops a ScanRun. The stopped ScanRun is returned.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ async def sample_stop_scan_run():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.StopScanRunRequest(
+ )
+
+ # Make the request
+ response = await client.stop_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1.types.StopScanRunRequest, dict]]):
+ The request object. Request for the ``StopScanRun`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.types.ScanRun:
+                A ScanRun is an output-only resource
+                representing an actual run of the scan.
+ Next id: 12
+
+ """
+ # Create or coerce a protobuf request object.
+ request = web_security_scanner.StopScanRunRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.stop_scan_run,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_crawled_urls(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListCrawledUrlsRequest, dict]
+ ] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListCrawledUrlsAsyncPager:
+ r"""List CrawledUrls under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ async def sample_list_crawled_urls():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.ListCrawledUrlsRequest(
+ )
+
+ # Make the request
+ page_result = client.list_crawled_urls(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1.types.ListCrawledUrlsRequest, dict]]):
+ The request object. Request for the ``ListCrawledUrls`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.services.web_security_scanner.pagers.ListCrawledUrlsAsyncPager:
+ Response for the ListCrawledUrls method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ request = web_security_scanner.ListCrawledUrlsRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_crawled_urls,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListCrawledUrlsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_finding(
+ self,
+ request: Optional[Union[web_security_scanner.GetFindingRequest, dict]] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> finding.Finding:
+ r"""Gets a Finding.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ async def sample_get_finding():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.GetFindingRequest(
+ )
+
+ # Make the request
+ response = await client.get_finding(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1.types.GetFindingRequest, dict]]):
+ The request object. Request for the ``GetFinding`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.types.Finding:
+ A Finding resource represents a
+ vulnerability instance identified during
+ a ScanRun.
+
+ """
+ # Create or coerce a protobuf request object.
+ request = web_security_scanner.GetFindingRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_finding,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_findings(
+ self,
+ request: Optional[Union[web_security_scanner.ListFindingsRequest, dict]] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListFindingsAsyncPager:
+ r"""List Findings under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ async def sample_list_findings():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.ListFindingsRequest(
+ )
+
+ # Make the request
+ page_result = client.list_findings(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1.types.ListFindingsRequest, dict]]):
+ The request object. Request for the ``ListFindings`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.services.web_security_scanner.pagers.ListFindingsAsyncPager:
+ Response for the ListFindings method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ request = web_security_scanner.ListFindingsRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_findings,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListFindingsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_finding_type_stats(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListFindingTypeStatsRequest, dict]
+ ] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListFindingTypeStatsResponse:
+ r"""List all FindingTypeStats under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ async def sample_list_finding_type_stats():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.ListFindingTypeStatsRequest(
+ )
+
+ # Make the request
+ response = await client.list_finding_type_stats(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1.types.ListFindingTypeStatsRequest, dict]]):
+ The request object. Request for the ``ListFindingTypeStats`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.types.ListFindingTypeStatsResponse:
+ Response for the ListFindingTypeStats method.
+ """
+ # Create or coerce a protobuf request object.
+ request = web_security_scanner.ListFindingTypeStatsRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_finding_type_stats,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def __aenter__(self):
+ return self
+
+ async def __aexit__(self, exc_type, exc, tb):
+ await self.transport.close()
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("WebSecurityScannerAsyncClient",)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/client.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/client.py
@@ -0,0 +1,1567 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import os
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+ cast,
+)
+
+from google.api_core import client_options as client_options_lib
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.exceptions import MutualTLSChannelError # type: ignore
+from google.auth.transport import mtls # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.websecurityscanner_v1 import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.websecurityscanner_v1.services.web_security_scanner import pagers
+from google.cloud.websecurityscanner_v1.types import (
+ crawled_url,
+ finding,
+ finding_addon,
+ finding_type_stats,
+ scan_config,
+ scan_run,
+ scan_run_error_trace,
+ scan_run_warning_trace,
+ web_security_scanner,
+)
+
+from .transports.base import DEFAULT_CLIENT_INFO, WebSecurityScannerTransport
+from .transports.grpc import WebSecurityScannerGrpcTransport
+from .transports.grpc_asyncio import WebSecurityScannerGrpcAsyncIOTransport
+from .transports.rest import WebSecurityScannerRestTransport
+
+
+class WebSecurityScannerClientMeta(type):
+ """Metaclass for the WebSecurityScanner client.
+
+ This provides class-level methods for building and retrieving
+ support objects (e.g. transport) without polluting the client instance
+ objects.
+ """
+
+ _transport_registry = (
+ OrderedDict()
+ ) # type: Dict[str, Type[WebSecurityScannerTransport]]
+ _transport_registry["grpc"] = WebSecurityScannerGrpcTransport
+ _transport_registry["grpc_asyncio"] = WebSecurityScannerGrpcAsyncIOTransport
+ _transport_registry["rest"] = WebSecurityScannerRestTransport
+
+ def get_transport_class(
+ cls,
+ label: Optional[str] = None,
+ ) -> Type[WebSecurityScannerTransport]:
+ """Returns an appropriate transport class.
+
+ Args:
+ label: The name of the desired transport. If none is
+ provided, then the first transport in the registry is used.
+
+ Returns:
+ The transport class to use.
+ """
+ # If a specific transport is requested, return that one.
+ if label:
+ return cls._transport_registry[label]
+
+ # No transport is requested; return the default (that is, the first one
+ # in the dictionary).
+ return next(iter(cls._transport_registry.values()))
+
+
+class WebSecurityScannerClient(metaclass=WebSecurityScannerClientMeta):
+ """Web Security Scanner Service identifies security
+ vulnerabilities in web applications hosted on Google Cloud. It
+ crawls your application, and attempts to exercise as many user
+ inputs and event handlers as possible.
+ """
+
+ @staticmethod
+ def _get_default_mtls_endpoint(api_endpoint):
+ """Converts api endpoint to mTLS endpoint.
+
+ Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to
+ "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively.
+ Args:
+ api_endpoint (Optional[str]): the api endpoint to convert.
+ Returns:
+ str: converted mTLS api endpoint.
+ """
+ if not api_endpoint:
+ return api_endpoint
+
+ mtls_endpoint_re = re.compile(
+ r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?"
+ )
+
+ m = mtls_endpoint_re.match(api_endpoint)
+ name, mtls, sandbox, googledomain = m.groups()
+ if mtls or not googledomain:
+ return api_endpoint
+
+ if sandbox:
+ return api_endpoint.replace(
+ "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
+ )
+
+ return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")
+
+ DEFAULT_ENDPOINT = "websecurityscanner.googleapis.com"
+ DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__( # type: ignore
+ DEFAULT_ENDPOINT
+ )
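The `_get_default_mtls_endpoint` helper above derives `DEFAULT_MTLS_ENDPOINT` from `DEFAULT_ENDPOINT` with a small regex. The same logic can be exercised standalone as a sketch (the function name `to_mtls_endpoint` here is illustrative, not part of the library):

```python
import re

def to_mtls_endpoint(api_endpoint: str) -> str:
    """Convert a googleapis.com endpoint to its mTLS variant.

    Mirrors the conversion in _get_default_mtls_endpoint:
    "*.sandbox.googleapis.com" -> "*.mtls.sandbox.googleapis.com",
    "*.googleapis.com"         -> "*.mtls.googleapis.com".
    Endpoints that are already mTLS, or not on googleapis.com,
    are returned unchanged.
    """
    if not api_endpoint:
        return api_endpoint
    m = re.match(
        r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?"
        r"(?P<googledomain>\.googleapis\.com)?",
        api_endpoint,
    )
    name, mtls, sandbox, googledomain = m.groups()
    # Already mTLS, or a custom domain: nothing to rewrite.
    if mtls or not googledomain:
        return api_endpoint
    if sandbox:
        return api_endpoint.replace(
            "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
        )
    return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")

print(to_mtls_endpoint("websecurityscanner.googleapis.com"))
# websecurityscanner.mtls.googleapis.com
```

Because every optional group is independent, the regex also cleanly passes through endpoints that already contain `.mtls` or live on a non-Google domain.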
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ WebSecurityScannerClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_info(info)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ WebSecurityScannerClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_file(filename)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ from_service_account_json = from_service_account_file
+
+ @property
+ def transport(self) -> WebSecurityScannerTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ WebSecurityScannerTransport: The transport used by the client
+ instance.
+ """
+ return self._transport
+
+ @staticmethod
+ def finding_path(
+ project: str,
+ scan_config: str,
+ scan_run: str,
+ finding: str,
+ ) -> str:
+ """Returns a fully-qualified finding string."""
+ return "projects/{project}/scanConfigs/{scan_config}/scanRuns/{scan_run}/findings/{finding}".format(
+ project=project,
+ scan_config=scan_config,
+ scan_run=scan_run,
+ finding=finding,
+ )
+
+ @staticmethod
+ def parse_finding_path(path: str) -> Dict[str, str]:
+ """Parses a finding path into its component segments."""
+ m = re.match(
+ r"^projects/(?P<project>.+?)/scanConfigs/(?P<scan_config>.+?)/scanRuns/(?P<scan_run>.+?)/findings/(?P<finding>.+?)$",
+ path,
+ )
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_billing_account_path(
+ billing_account: str,
+ ) -> str:
+ """Returns a fully-qualified billing_account string."""
+ return "billingAccounts/{billing_account}".format(
+ billing_account=billing_account,
+ )
+
+ @staticmethod
+ def parse_common_billing_account_path(path: str) -> Dict[str, str]:
+ """Parse a billing_account path into its component segments."""
+ m = re.match(r"^billingAccounts/(?P<billing_account>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_folder_path(
+ folder: str,
+ ) -> str:
+ """Returns a fully-qualified folder string."""
+ return "folders/{folder}".format(
+ folder=folder,
+ )
+
+ @staticmethod
+ def parse_common_folder_path(path: str) -> Dict[str, str]:
+ """Parse a folder path into its component segments."""
+ m = re.match(r"^folders/(?P<folder>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_organization_path(
+ organization: str,
+ ) -> str:
+ """Returns a fully-qualified organization string."""
+ return "organizations/{organization}".format(
+ organization=organization,
+ )
+
+ @staticmethod
+ def parse_common_organization_path(path: str) -> Dict[str, str]:
+        """Parse an organization path into its component segments."""
+ m = re.match(r"^organizations/(?P<organization>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_project_path(
+ project: str,
+ ) -> str:
+ """Returns a fully-qualified project string."""
+ return "projects/{project}".format(
+ project=project,
+ )
+
+ @staticmethod
+ def parse_common_project_path(path: str) -> Dict[str, str]:
+ """Parse a project path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_location_path(
+ project: str,
+ location: str,
+ ) -> str:
+ """Returns a fully-qualified location string."""
+ return "projects/{project}/locations/{location}".format(
+ project=project,
+ location=location,
+ )
+
+ @staticmethod
+ def parse_common_location_path(path: str) -> Dict[str, str]:
+ """Parse a location path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)$", path)
+ return m.groupdict() if m else {}
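The path helpers above all follow one pattern: `str.format` to build a resource name, and an anchored regex with non-greedy named groups to parse one back. A self-contained round-trip sketch of the `finding` pair (standalone functions here, rather than the static methods in the class):

```python
import re

def finding_path(project, scan_config, scan_run, finding):
    # Build the resource name with simple string formatting.
    return (
        "projects/{project}/scanConfigs/{scan_config}"
        "/scanRuns/{scan_run}/findings/{finding}"
    ).format(project=project, scan_config=scan_config,
             scan_run=scan_run, finding=finding)

def parse_finding_path(path):
    # Non-greedy named groups recover each segment; the ^...$ anchors
    # reject strings that are not complete resource names.
    m = re.match(
        r"^projects/(?P<project>.+?)/scanConfigs/(?P<scan_config>.+?)"
        r"/scanRuns/(?P<scan_run>.+?)/findings/(?P<finding>.+?)$",
        path,
    )
    return m.groupdict() if m else {}

path = finding_path("p1", "cfg", "run", "f42")
print(parse_finding_path(path))
# {'project': 'p1', 'scan_config': 'cfg', 'scan_run': 'run', 'finding': 'f42'}
```

Parsing a malformed path returns an empty dict rather than raising, which matches the `m.groupdict() if m else {}` convention used throughout the class.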
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[client_options_lib.ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ use_client_cert = os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false")
+ use_mtls_endpoint = os.getenv("GOOGLE_API_USE_MTLS_ENDPOINT", "auto")
+ if use_client_cert not in ("true", "false"):
+ raise ValueError(
+ "Environment variable `GOOGLE_API_USE_CLIENT_CERTIFICATE` must be either `true` or `false`"
+ )
+ if use_mtls_endpoint not in ("auto", "never", "always"):
+ raise MutualTLSChannelError(
+ "Environment variable `GOOGLE_API_USE_MTLS_ENDPOINT` must be `never`, `auto` or `always`"
+ )
+
+ # Figure out the client cert source to use.
+ client_cert_source = None
+ if use_client_cert == "true":
+ if client_options.client_cert_source:
+ client_cert_source = client_options.client_cert_source
+ elif mtls.has_default_client_cert_source():
+ client_cert_source = mtls.default_client_cert_source()
+
+ # Figure out which api endpoint to use.
+ if client_options.api_endpoint is not None:
+ api_endpoint = client_options.api_endpoint
+ elif use_mtls_endpoint == "always" or (
+ use_mtls_endpoint == "auto" and client_cert_source
+ ):
+ api_endpoint = cls.DEFAULT_MTLS_ENDPOINT
+ else:
+ api_endpoint = cls.DEFAULT_ENDPOINT
+
+ return api_endpoint, client_cert_source
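The endpoint half of `get_mtls_endpoint_and_cert_source` follows a strict precedence order: explicit override, then `GOOGLE_API_USE_MTLS_ENDPOINT=always`, then `auto` combined with a client certificate, then the plain default. A minimal sketch of just that decision (the `choose_endpoint` name and the injectable `env` parameter are illustrative simplifications):

```python
import os

DEFAULT_ENDPOINT = "websecurityscanner.googleapis.com"
DEFAULT_MTLS_ENDPOINT = "websecurityscanner.mtls.googleapis.com"

def choose_endpoint(api_endpoint_override=None, has_client_cert=False, env=None):
    """Replicate the endpoint-selection order documented above:
    explicit override > MTLS_ENDPOINT=always > auto + cert > default."""
    env = env if env is not None else os.environ
    use_mtls = env.get("GOOGLE_API_USE_MTLS_ENDPOINT", "auto")
    if use_mtls not in ("auto", "never", "always"):
        raise ValueError(
            "GOOGLE_API_USE_MTLS_ENDPOINT must be `never`, `auto` or `always`"
        )
    if api_endpoint_override is not None:
        return api_endpoint_override
    if use_mtls == "always" or (use_mtls == "auto" and has_client_cert):
        return DEFAULT_MTLS_ENDPOINT
    return DEFAULT_ENDPOINT

print(choose_endpoint(env={}))                        # plain default
print(choose_endpoint(has_client_cert=True, env={}))  # mTLS default
```

Passing `env={}` makes the behavior deterministic in examples; the real method reads the process environment.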
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Optional[Union[str, WebSecurityScannerTransport]] = None,
+ client_options: Optional[Union[client_options_lib.ClientOptions, dict]] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the web security scanner client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, WebSecurityScannerTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (Optional[Union[google.api_core.client_options.ClientOptions, dict]]): Custom options for the
+ client. It won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ if isinstance(client_options, dict):
+ client_options = client_options_lib.from_dict(client_options)
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ client_options = cast(client_options_lib.ClientOptions, client_options)
+
+ api_endpoint, client_cert_source_func = self.get_mtls_endpoint_and_cert_source(
+ client_options
+ )
+
+ api_key_value = getattr(client_options, "api_key", None)
+ if api_key_value and credentials:
+ raise ValueError(
+ "client_options.api_key and credentials are mutually exclusive"
+ )
+
+ # Save or instantiate the transport.
+ # Ordinarily, we provide the transport, but allowing a custom transport
+ # instance provides an extensibility point for unusual situations.
+ if isinstance(transport, WebSecurityScannerTransport):
+ # transport is a WebSecurityScannerTransport instance.
+ if credentials or client_options.credentials_file or api_key_value:
+ raise ValueError(
+ "When providing a transport instance, "
+ "provide its credentials directly."
+ )
+ if client_options.scopes:
+ raise ValueError(
+ "When providing a transport instance, provide its scopes "
+ "directly."
+ )
+ self._transport = transport
+ else:
+ import google.auth._default # type: ignore
+
+ if api_key_value and hasattr(
+ google.auth._default, "get_api_key_credentials"
+ ):
+ credentials = google.auth._default.get_api_key_credentials(
+ api_key_value
+ )
+
+ Transport = type(self).get_transport_class(transport)
+ self._transport = Transport(
+ credentials=credentials,
+ credentials_file=client_options.credentials_file,
+ host=api_endpoint,
+ scopes=client_options.scopes,
+ client_cert_source_for_mtls=client_cert_source_func,
+ quota_project_id=client_options.quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=True,
+ api_audience=client_options.api_audience,
+ )
+
+ def create_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.CreateScanConfigRequest, dict]
+ ] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_config.ScanConfig:
+ r"""Creates a new ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ def sample_create_scan_config():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.CreateScanConfigRequest(
+ )
+
+ # Make the request
+ response = client.create_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1.types.CreateScanConfigRequest, dict]):
+ The request object. Request for the ``CreateScanConfig`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.types.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.CreateScanConfigRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.CreateScanConfigRequest):
+ request = web_security_scanner.CreateScanConfigRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.create_scan_config]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
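Each RPC method attaches routing fields via `gapic_v1.routing_header.to_grpc_metadata`, which (simplified) folds field/value pairs into a single URL-encoded `x-goog-request-params` metadata entry. A hedged sketch of that step, as an assumption about the helper's essential behavior rather than its full implementation:

```python
from urllib.parse import quote

def to_routing_metadata(params):
    """Simplified sketch of the routing-header step: field/value pairs
    become one x-goog-request-params entry with URL-encoded values."""
    encoded = "&".join(
        "{}={}".format(key, quote(str(value), safe="")) for key, value in params
    )
    return ("x-goog-request-params", encoded)

# The client code appends this tuple to the user-supplied metadata:
metadata = () + (to_routing_metadata((("parent", "projects/p1"),)),)
print(metadata)
# (('x-goog-request-params', 'parent=projects%2Fp1'),)
```

The `/` in the resource name is percent-encoded so the whole value survives as a single header parameter.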
+
+ def delete_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.DeleteScanConfigRequest, dict]
+ ] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> None:
+ r"""Deletes an existing ScanConfig and its child
+ resources.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ def sample_delete_scan_config():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.DeleteScanConfigRequest(
+ )
+
+ # Make the request
+ client.delete_scan_config(request=request)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1.types.DeleteScanConfigRequest, dict]):
+ The request object. Request for the ``DeleteScanConfig`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ # Create or coerce a protobuf request object.
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.DeleteScanConfigRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.DeleteScanConfigRequest):
+ request = web_security_scanner.DeleteScanConfigRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.delete_scan_config]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ def get_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.GetScanConfigRequest, dict]
+ ] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_config.ScanConfig:
+ r"""Gets a ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ def sample_get_scan_config():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.GetScanConfigRequest(
+ )
+
+ # Make the request
+ response = client.get_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1.types.GetScanConfigRequest, dict]):
+ The request object. Request for the ``GetScanConfig`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.types.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.GetScanConfigRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.GetScanConfigRequest):
+ request = web_security_scanner.GetScanConfigRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_scan_config]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_scan_configs(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListScanConfigsRequest, dict]
+ ] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListScanConfigsPager:
+ r"""Lists ScanConfigs under a given project.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ def sample_list_scan_configs():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.ListScanConfigsRequest(
+ )
+
+ # Make the request
+ page_result = client.list_scan_configs(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1.types.ListScanConfigsRequest, dict]):
+ The request object. Request for the ``ListScanConfigs`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.services.web_security_scanner.pagers.ListScanConfigsPager:
+ Response for the ListScanConfigs method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.ListScanConfigsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.ListScanConfigsRequest):
+ request = web_security_scanner.ListScanConfigsRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_scan_configs]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListScanConfigsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
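`ListScanConfigsPager` wraps the raw response so that iteration yields items and transparently fetches subsequent pages. A minimal standalone sketch of that pager pattern (the `MiniPager` class and the dict-based fake transport are illustrative, not the library's types):

```python
class MiniPager:
    """Minimal sketch of the GAPIC pager pattern: iterating yields
    items across pages, calling the RPC again for each next page."""

    def __init__(self, method, request):
        self._method = method        # callable taking a request, returning a page
        self._request = dict(request)

    def __iter__(self):
        while True:
            page = self._method(self._request)
            for item in page["items"]:
                yield item
            token = page.get("next_page_token")
            if not token:
                return
            self._request["page_token"] = token

# Fake transport returning two pages.
def fake_rpc(request):
    if request.get("page_token") != "t2":
        return {"items": [1, 2], "next_page_token": "t2"}
    return {"items": [3]}

print(list(MiniPager(fake_rpc, {})))
# [1, 2, 3]
```

This is why the sample in the docstring can simply write `for response in page_result:`; page boundaries are resolved inside `__iter__`.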
+
+ def update_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.UpdateScanConfigRequest, dict]
+ ] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_config.ScanConfig:
+        r"""Updates a ScanConfig. This method supports partial
+        update of a ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ def sample_update_scan_config():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.UpdateScanConfigRequest(
+ )
+
+ # Make the request
+ response = client.update_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1.types.UpdateScanConfigRequest, dict]):
+ The request object. Request for the ``UpdateScanConfigRequest`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.types.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.UpdateScanConfigRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.UpdateScanConfigRequest):
+ request = web_security_scanner.UpdateScanConfigRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.update_scan_config]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("scan_config.name", request.scan_config.name),)
+ ),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def start_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.StartScanRunRequest, dict]] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Start a ScanRun according to the given ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ def sample_start_scan_run():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.StartScanRunRequest(
+ )
+
+ # Make the request
+ response = client.start_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1.types.StartScanRunRequest, dict]):
+ The request object. Request for the ``StartScanRun`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.types.ScanRun:
+                A ScanRun is an output-only resource
+ representing an actual run of the scan.
+ Next id: 12
+
+ """
+ # Create or coerce a protobuf request object.
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.StartScanRunRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.StartScanRunRequest):
+ request = web_security_scanner.StartScanRunRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.start_scan_run]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.GetScanRunRequest, dict]] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Gets a ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ def sample_get_scan_run():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.GetScanRunRequest(
+ )
+
+ # Make the request
+ response = client.get_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1.types.GetScanRunRequest, dict]):
+ The request object. Request for the ``GetScanRun`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.types.ScanRun:
+                A ScanRun is an output-only resource
+ representing an actual run of the scan.
+ Next id: 12
+
+ """
+ # Create or coerce a protobuf request object.
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.GetScanRunRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.GetScanRunRequest):
+ request = web_security_scanner.GetScanRunRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_scan_run]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_scan_runs(
+ self,
+ request: Optional[Union[web_security_scanner.ListScanRunsRequest, dict]] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListScanRunsPager:
+ r"""Lists ScanRuns under a given ScanConfig, in
+ descending order of ScanRun stop time.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ def sample_list_scan_runs():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.ListScanRunsRequest(
+ )
+
+ # Make the request
+ page_result = client.list_scan_runs(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1.types.ListScanRunsRequest, dict]):
+ The request object. Request for the ``ListScanRuns`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.services.web_security_scanner.pagers.ListScanRunsPager:
+ Response for the ListScanRuns method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.ListScanRunsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.ListScanRunsRequest):
+ request = web_security_scanner.ListScanRunsRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_scan_runs]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListScanRunsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def stop_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.StopScanRunRequest, dict]] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Stops a ScanRun. The stopped ScanRun is returned.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ def sample_stop_scan_run():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.StopScanRunRequest(
+ )
+
+ # Make the request
+ response = client.stop_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1.types.StopScanRunRequest, dict]):
+ The request object. Request for the ``StopScanRun`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.types.ScanRun:
+                A ScanRun is an output-only resource
+                representing an actual run of the scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.StopScanRunRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.StopScanRunRequest):
+ request = web_security_scanner.StopScanRunRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.stop_scan_run]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_crawled_urls(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListCrawledUrlsRequest, dict]
+ ] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListCrawledUrlsPager:
+ r"""List CrawledUrls under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ def sample_list_crawled_urls():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.ListCrawledUrlsRequest(
+ )
+
+ # Make the request
+ page_result = client.list_crawled_urls(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1.types.ListCrawledUrlsRequest, dict]):
+ The request object. Request for the ``ListCrawledUrls`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.services.web_security_scanner.pagers.ListCrawledUrlsPager:
+ Response for the ListCrawledUrls method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.ListCrawledUrlsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.ListCrawledUrlsRequest):
+ request = web_security_scanner.ListCrawledUrlsRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_crawled_urls]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListCrawledUrlsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_finding(
+ self,
+ request: Optional[Union[web_security_scanner.GetFindingRequest, dict]] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> finding.Finding:
+ r"""Gets a Finding.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ def sample_get_finding():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.GetFindingRequest(
+ )
+
+ # Make the request
+ response = client.get_finding(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1.types.GetFindingRequest, dict]):
+ The request object. Request for the ``GetFinding`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.types.Finding:
+ A Finding resource represents a
+ vulnerability instance identified during
+ a ScanRun.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.GetFindingRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.GetFindingRequest):
+ request = web_security_scanner.GetFindingRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_finding]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_findings(
+ self,
+ request: Optional[Union[web_security_scanner.ListFindingsRequest, dict]] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListFindingsPager:
+ r"""List Findings under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ def sample_list_findings():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.ListFindingsRequest(
+ )
+
+ # Make the request
+ page_result = client.list_findings(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1.types.ListFindingsRequest, dict]):
+ The request object. Request for the ``ListFindings`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.services.web_security_scanner.pagers.ListFindingsPager:
+ Response for the ListFindings method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.ListFindingsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.ListFindingsRequest):
+ request = web_security_scanner.ListFindingsRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_findings]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListFindingsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_finding_type_stats(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListFindingTypeStatsRequest, dict]
+ ] = None,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListFindingTypeStatsResponse:
+ r"""List all FindingTypeStats under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1
+
+ def sample_list_finding_type_stats():
+ # Create a client
+ client = websecurityscanner_v1.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1.ListFindingTypeStatsRequest(
+ )
+
+ # Make the request
+ response = client.list_finding_type_stats(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1.types.ListFindingTypeStatsRequest, dict]):
+ The request object. Request for the ``ListFindingTypeStats`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1.types.ListFindingTypeStatsResponse:
+ Response for the ListFindingTypeStats method.
+ """
+ # Create or coerce a protobuf request object.
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.ListFindingTypeStatsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.ListFindingTypeStatsRequest):
+ request = web_security_scanner.ListFindingTypeStatsRequest(request)
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_finding_type_stats]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def __enter__(self) -> "WebSecurityScannerClient":
+ return self
+
+ def __exit__(self, type, value, traceback):
+ """Releases underlying transport's resources.
+
+ .. warning::
+ ONLY use as a context manager if the transport is NOT shared
+ with other clients! Exiting the with block will CLOSE the transport
+ and may cause errors in other clients!
+ """
+ self.transport.close()
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("WebSecurityScannerClient",)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/pagers.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/pagers.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/pagers.py
@@ -0,0 +1,549 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import (
+ Any,
+ AsyncIterator,
+ Awaitable,
+ Callable,
+ Iterator,
+ Optional,
+ Sequence,
+ Tuple,
+)
+
+from google.cloud.websecurityscanner_v1.types import (
+ crawled_url,
+ finding,
+ scan_config,
+ scan_run,
+ web_security_scanner,
+)
+
+
+class ListScanConfigsPager:
+ """A pager for iterating through ``list_scan_configs`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1.types.ListScanConfigsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``scan_configs`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListScanConfigs`` requests and continue to iterate
+ through the ``scan_configs`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1.types.ListScanConfigsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., web_security_scanner.ListScanConfigsResponse],
+ request: web_security_scanner.ListScanConfigsRequest,
+ response: web_security_scanner.ListScanConfigsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1.types.ListScanConfigsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1.types.ListScanConfigsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListScanConfigsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[web_security_scanner.ListScanConfigsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[scan_config.ScanConfig]:
+ for page in self.pages:
+ yield from page.scan_configs
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListScanConfigsAsyncPager:
+ """A pager for iterating through ``list_scan_configs`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1.types.ListScanConfigsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``scan_configs`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListScanConfigs`` requests and continue to iterate
+ through the ``scan_configs`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1.types.ListScanConfigsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[web_security_scanner.ListScanConfigsResponse]],
+ request: web_security_scanner.ListScanConfigsRequest,
+ response: web_security_scanner.ListScanConfigsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+        """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1.types.ListScanConfigsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1.types.ListScanConfigsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListScanConfigsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(
+ self,
+ ) -> AsyncIterator[web_security_scanner.ListScanConfigsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[scan_config.ScanConfig]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.scan_configs:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListScanRunsPager:
+ """A pager for iterating through ``list_scan_runs`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1.types.ListScanRunsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``scan_runs`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListScanRuns`` requests and continue to iterate
+ through the ``scan_runs`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1.types.ListScanRunsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., web_security_scanner.ListScanRunsResponse],
+ request: web_security_scanner.ListScanRunsRequest,
+ response: web_security_scanner.ListScanRunsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1.types.ListScanRunsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1.types.ListScanRunsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListScanRunsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[web_security_scanner.ListScanRunsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[scan_run.ScanRun]:
+ for page in self.pages:
+ yield from page.scan_runs
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListScanRunsAsyncPager:
+ """A pager for iterating through ``list_scan_runs`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1.types.ListScanRunsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``scan_runs`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListScanRuns`` requests and continue to iterate
+ through the ``scan_runs`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1.types.ListScanRunsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[web_security_scanner.ListScanRunsResponse]],
+ request: web_security_scanner.ListScanRunsRequest,
+ response: web_security_scanner.ListScanRunsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+        """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1.types.ListScanRunsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1.types.ListScanRunsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListScanRunsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[web_security_scanner.ListScanRunsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[scan_run.ScanRun]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.scan_runs:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListCrawledUrlsPager:
+ """A pager for iterating through ``list_crawled_urls`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1.types.ListCrawledUrlsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``crawled_urls`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListCrawledUrls`` requests and continue to iterate
+ through the ``crawled_urls`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1.types.ListCrawledUrlsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., web_security_scanner.ListCrawledUrlsResponse],
+ request: web_security_scanner.ListCrawledUrlsRequest,
+ response: web_security_scanner.ListCrawledUrlsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1.types.ListCrawledUrlsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1.types.ListCrawledUrlsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListCrawledUrlsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[web_security_scanner.ListCrawledUrlsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[crawled_url.CrawledUrl]:
+ for page in self.pages:
+ yield from page.crawled_urls
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListCrawledUrlsAsyncPager:
+ """A pager for iterating through ``list_crawled_urls`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1.types.ListCrawledUrlsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``crawled_urls`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListCrawledUrls`` requests and continue to iterate
+ through the ``crawled_urls`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1.types.ListCrawledUrlsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[web_security_scanner.ListCrawledUrlsResponse]],
+ request: web_security_scanner.ListCrawledUrlsRequest,
+ response: web_security_scanner.ListCrawledUrlsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+        """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1.types.ListCrawledUrlsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1.types.ListCrawledUrlsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListCrawledUrlsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(
+ self,
+ ) -> AsyncIterator[web_security_scanner.ListCrawledUrlsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[crawled_url.CrawledUrl]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.crawled_urls:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListFindingsPager:
+ """A pager for iterating through ``list_findings`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1.types.ListFindingsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``findings`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListFindings`` requests and continue to iterate
+ through the ``findings`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1.types.ListFindingsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., web_security_scanner.ListFindingsResponse],
+ request: web_security_scanner.ListFindingsRequest,
+ response: web_security_scanner.ListFindingsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1.types.ListFindingsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1.types.ListFindingsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListFindingsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[web_security_scanner.ListFindingsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[finding.Finding]:
+ for page in self.pages:
+ yield from page.findings
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListFindingsAsyncPager:
+ """A pager for iterating through ``list_findings`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1.types.ListFindingsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``findings`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListFindings`` requests and continue to iterate
+ through the ``findings`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1.types.ListFindingsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[web_security_scanner.ListFindingsResponse]],
+ request: web_security_scanner.ListFindingsRequest,
+ response: web_security_scanner.ListFindingsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1.types.ListFindingsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1.types.ListFindingsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListFindingsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[web_security_scanner.ListFindingsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[finding.Finding]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.findings:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
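The sync and async pagers above all share the same page-token loop: yield the response they were constructed with, then re-issue the request with `page_token` set to `next_page_token` until it comes back empty. A minimal self-contained sketch of that mechanic, using hypothetical stand-in request/response types rather than the real generated ones:

```python
from dataclasses import dataclass
from typing import Callable, Iterator, List


@dataclass
class FakeRequest:
    page_token: str = ""


@dataclass
class FakeResponse:
    items: List[str]
    next_page_token: str = ""


class SimplePager:
    """Mimics the generated pager: wraps the first response plus the method
    that produced it, and refetches while next_page_token is non-empty."""

    def __init__(self, method: Callable[[FakeRequest], FakeResponse],
                 request: FakeRequest, response: FakeResponse):
        self._method = method
        self._request = request
        self._response = response

    @property
    def pages(self) -> Iterator[FakeResponse]:
        yield self._response
        while self._response.next_page_token:
            self._request.page_token = self._response.next_page_token
            self._response = self._method(self._request)
            yield self._response

    def __iter__(self) -> Iterator[str]:
        for page in self.pages:
            yield from page.items


# Two fake pages of results, keyed by the page token that requests them.
_pages = {"": FakeResponse(["a", "b"], next_page_token="t1"),
          "t1": FakeResponse(["c"])}


def fake_method(request: FakeRequest) -> FakeResponse:
    return _pages[request.page_token]


first = fake_method(FakeRequest())
pager = SimplePager(fake_method, FakeRequest(), first)
print(list(pager))  # -> ['a', 'b', 'c']
```

Note the documented caveat above also holds here: only the most recent response is retained, so attribute lookups delegated via `__getattr__` in the real pagers always reflect the last page fetched.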
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/transports/__init__.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/transports/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/transports/__init__.py
@@ -0,0 +1,38 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+from typing import Dict, Type
+
+from .base import WebSecurityScannerTransport
+from .grpc import WebSecurityScannerGrpcTransport
+from .grpc_asyncio import WebSecurityScannerGrpcAsyncIOTransport
+from .rest import WebSecurityScannerRestInterceptor, WebSecurityScannerRestTransport
+
+# Compile a registry of transports.
+_transport_registry = (
+ OrderedDict()
+) # type: Dict[str, Type[WebSecurityScannerTransport]]
+_transport_registry["grpc"] = WebSecurityScannerGrpcTransport
+_transport_registry["grpc_asyncio"] = WebSecurityScannerGrpcAsyncIOTransport
+_transport_registry["rest"] = WebSecurityScannerRestTransport
+
+__all__ = (
+ "WebSecurityScannerTransport",
+ "WebSecurityScannerGrpcTransport",
+ "WebSecurityScannerGrpcAsyncIOTransport",
+ "WebSecurityScannerRestTransport",
+ "WebSecurityScannerRestInterceptor",
+)
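The `OrderedDict` above is a plain string-to-class registry: client code resolves a transport label such as `"grpc"` or `"rest"` to its class before instantiating it, falling back to the first registered entry when no label is given. The dispatch pattern, sketched with placeholder classes (the names below are illustrative, not the real transports):

```python
from collections import OrderedDict
from typing import Dict, Type


class BaseTransport:
    kind = "base"


class GrpcTransport(BaseTransport):
    kind = "grpc"


class RestTransport(BaseTransport):
    kind = "rest"


# Same shape as the generated registry: transport label -> transport class.
_registry: Dict[str, Type[BaseTransport]] = OrderedDict()
_registry["grpc"] = GrpcTransport
_registry["rest"] = RestTransport


def get_transport_class(label: str = "") -> Type[BaseTransport]:
    """Resolve a transport label to its class; default to the first entry."""
    if label:
        return _registry[label]
    return next(iter(_registry.values()))


print(get_transport_class("rest").kind)  # -> rest
```

Using an `OrderedDict` keeps the insertion order meaningful, so the default transport is simply whichever class was registered first (here, gRPC).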
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/transports/base.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/transports/base.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/transports/base.py
@@ -0,0 +1,434 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import abc
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Union
+
+import google.api_core
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+from google.protobuf import empty_pb2 # type: ignore
+
+from google.cloud.websecurityscanner_v1 import gapic_version as package_version
+from google.cloud.websecurityscanner_v1.types import (
+ finding,
+ scan_config,
+ scan_run,
+ web_security_scanner,
+)
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+class WebSecurityScannerTransport(abc.ABC):
+ """Abstract transport class for WebSecurityScanner."""
+
+ AUTH_SCOPES = ("https://www.googleapis.com/auth/cloud-platform",)
+
+ DEFAULT_HOST: str = "websecurityscanner.googleapis.com"
+
+ def __init__(
+ self,
+ *,
+ host: str = DEFAULT_HOST,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ **kwargs,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+ scopes (Optional[Sequence[str]]): A list of scopes.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ """
+
+ scopes_kwargs = {"scopes": scopes, "default_scopes": self.AUTH_SCOPES}
+
+ # Save the scopes.
+ self._scopes = scopes
+
+ # If no credentials are provided, then determine the appropriate
+ # defaults.
+ if credentials and credentials_file:
+ raise core_exceptions.DuplicateCredentialArgs(
+ "'credentials_file' and 'credentials' are mutually exclusive"
+ )
+
+ if credentials_file is not None:
+ credentials, _ = google.auth.load_credentials_from_file(
+ credentials_file, **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ elif credentials is None:
+ credentials, _ = google.auth.default(
+ **scopes_kwargs, quota_project_id=quota_project_id
+ )
+            # Don't apply the audience if the user passed their own credentials file.
+ if hasattr(credentials, "with_gdch_audience"):
+ credentials = credentials.with_gdch_audience(
+ api_audience if api_audience else host
+ )
+
+ # If the credentials are service account credentials, then always try to use self signed JWT.
+ if (
+ always_use_jwt_access
+ and isinstance(credentials, service_account.Credentials)
+ and hasattr(service_account.Credentials, "with_always_use_jwt_access")
+ ):
+ credentials = credentials.with_always_use_jwt_access(True)
+
+ # Save the credentials.
+ self._credentials = credentials
+
+ # Save the hostname. Default to port 443 (HTTPS) if none is specified.
+ if ":" not in host:
+ host += ":443"
+ self._host = host
+
+ def _prep_wrapped_messages(self, client_info):
+ # Precompute the wrapped methods.
+ self._wrapped_methods = {
+ self.create_scan_config: gapic_v1.method.wrap_method(
+ self.create_scan_config,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.delete_scan_config: gapic_v1.method.wrap_method(
+ self.delete_scan_config,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.get_scan_config: gapic_v1.method.wrap_method(
+ self.get_scan_config,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.list_scan_configs: gapic_v1.method.wrap_method(
+ self.list_scan_configs,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.update_scan_config: gapic_v1.method.wrap_method(
+ self.update_scan_config,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.start_scan_run: gapic_v1.method.wrap_method(
+ self.start_scan_run,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.get_scan_run: gapic_v1.method.wrap_method(
+ self.get_scan_run,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.list_scan_runs: gapic_v1.method.wrap_method(
+ self.list_scan_runs,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.stop_scan_run: gapic_v1.method.wrap_method(
+ self.stop_scan_run,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.list_crawled_urls: gapic_v1.method.wrap_method(
+ self.list_crawled_urls,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.get_finding: gapic_v1.method.wrap_method(
+ self.get_finding,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.list_findings: gapic_v1.method.wrap_method(
+ self.list_findings,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.list_finding_type_stats: gapic_v1.method.wrap_method(
+ self.list_finding_type_stats,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ }
+
+ def close(self):
+ """Closes resources associated with the transport.
+
+ .. warning::
+ Only call this method if the transport is NOT shared
+ with other clients - this may cause errors in other clients!
+ """
+ raise NotImplementedError()
+
+ @property
+ def create_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.CreateScanConfigRequest],
+ Union[scan_config.ScanConfig, Awaitable[scan_config.ScanConfig]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def delete_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.DeleteScanConfigRequest],
+ Union[empty_pb2.Empty, Awaitable[empty_pb2.Empty]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.GetScanConfigRequest],
+ Union[scan_config.ScanConfig, Awaitable[scan_config.ScanConfig]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_scan_configs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanConfigsRequest],
+ Union[
+ web_security_scanner.ListScanConfigsResponse,
+ Awaitable[web_security_scanner.ListScanConfigsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def update_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.UpdateScanConfigRequest],
+ Union[scan_config.ScanConfig, Awaitable[scan_config.ScanConfig]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def start_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.StartScanRunRequest],
+ Union[scan_run.ScanRun, Awaitable[scan_run.ScanRun]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.GetScanRunRequest],
+ Union[scan_run.ScanRun, Awaitable[scan_run.ScanRun]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_scan_runs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanRunsRequest],
+ Union[
+ web_security_scanner.ListScanRunsResponse,
+ Awaitable[web_security_scanner.ListScanRunsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def stop_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.StopScanRunRequest],
+ Union[scan_run.ScanRun, Awaitable[scan_run.ScanRun]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_crawled_urls(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListCrawledUrlsRequest],
+ Union[
+ web_security_scanner.ListCrawledUrlsResponse,
+ Awaitable[web_security_scanner.ListCrawledUrlsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_finding(
+ self,
+ ) -> Callable[
+ [web_security_scanner.GetFindingRequest],
+ Union[finding.Finding, Awaitable[finding.Finding]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_findings(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingsRequest],
+ Union[
+ web_security_scanner.ListFindingsResponse,
+ Awaitable[web_security_scanner.ListFindingsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_finding_type_stats(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingTypeStatsRequest],
+ Union[
+ web_security_scanner.ListFindingTypeStatsResponse,
+ Awaitable[web_security_scanner.ListFindingTypeStatsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def kind(self) -> str:
+ raise NotImplementedError()
+
+
+__all__ = ("WebSecurityScannerTransport",)
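Most of the wrapped methods above share one retry policy: exponential backoff starting at 0.1 s, multiplied by 1.3 on each attempt, capped at 60 s per sleep, within a 600 s overall deadline. A sketch of the delay schedule that configuration implies (ignoring the randomized jitter that `google.api_core` applies in practice):

```python
from typing import Iterator


def backoff_delays(initial: float = 0.1, multiplier: float = 1.3,
                   maximum: float = 60.0, deadline: float = 600.0) -> Iterator[float]:
    """Yield capped exponential delays until their sum would exceed the deadline."""
    delay, elapsed = initial, 0.0
    while elapsed + delay <= deadline:
        yield delay
        elapsed += delay
        delay = min(delay * multiplier, maximum)


delays = list(backoff_delays())
print(round(delays[0], 2), round(delays[1], 2), round(delays[2], 2))  # -> 0.1 0.13 0.17
```

Only the retryable error types named in the predicate (`DeadlineExceeded`, `ServiceUnavailable`) trigger this schedule; any other exception from the RPC propagates immediately.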
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/transports/grpc.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/transports/grpc.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/transports/grpc.py
@@ -0,0 +1,608 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.protobuf import empty_pb2 # type: ignore
+import grpc # type: ignore
+
+from google.cloud.websecurityscanner_v1.types import (
+ finding,
+ scan_config,
+ scan_run,
+ web_security_scanner,
+)
+
+from .base import DEFAULT_CLIENT_INFO, WebSecurityScannerTransport
+
+
+class WebSecurityScannerGrpcTransport(WebSecurityScannerTransport):
+ """gRPC backend transport for WebSecurityScanner.
+
+ Web Security Scanner Service identifies security
+ vulnerabilities in web applications hosted on Google Cloud. It
+ crawls your application, and attempts to exercise as many user
+ inputs and event handlers as possible.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _stubs: Dict[str, Callable]
+
+ def __init__(
+ self,
+ *,
+ host: str = "websecurityscanner.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[grpc.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ channel (Optional[grpc.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "websecurityscanner.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> grpc.Channel:
+ """Create and return a gRPC channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ grpc.Channel: A gRPC channel object.
+
+ Raises:
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+
+ return grpc_helpers.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ @property
+ def grpc_channel(self) -> grpc.Channel:
+ """Return the channel designed to connect to this service."""
+ return self._grpc_channel
+
+ @property
+ def create_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.CreateScanConfigRequest], scan_config.ScanConfig
+ ]:
+ r"""Return a callable for the create scan config method over gRPC.
+
+ Creates a new ScanConfig.
+
+ Returns:
+ Callable[[~.CreateScanConfigRequest],
+ ~.ScanConfig]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_scan_config" not in self._stubs:
+ self._stubs["create_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/CreateScanConfig",
+ request_serializer=web_security_scanner.CreateScanConfigRequest.serialize,
+ response_deserializer=scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["create_scan_config"]
+
+ @property
+ def delete_scan_config(
+ self,
+ ) -> Callable[[web_security_scanner.DeleteScanConfigRequest], empty_pb2.Empty]:
+ r"""Return a callable for the delete scan config method over gRPC.
+
+ Deletes an existing ScanConfig and its child
+ resources.
+
+ Returns:
+ Callable[[~.DeleteScanConfigRequest],
+ ~.Empty]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_scan_config" not in self._stubs:
+ self._stubs["delete_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/DeleteScanConfig",
+ request_serializer=web_security_scanner.DeleteScanConfigRequest.serialize,
+ response_deserializer=empty_pb2.Empty.FromString,
+ )
+ return self._stubs["delete_scan_config"]
+
+ @property
+ def get_scan_config(
+ self,
+ ) -> Callable[[web_security_scanner.GetScanConfigRequest], scan_config.ScanConfig]:
+ r"""Return a callable for the get scan config method over gRPC.
+
+ Gets a ScanConfig.
+
+ Returns:
+ Callable[[~.GetScanConfigRequest],
+ ~.ScanConfig]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_scan_config" not in self._stubs:
+ self._stubs["get_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/GetScanConfig",
+ request_serializer=web_security_scanner.GetScanConfigRequest.serialize,
+ response_deserializer=scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["get_scan_config"]
+
+ @property
+ def list_scan_configs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanConfigsRequest],
+ web_security_scanner.ListScanConfigsResponse,
+ ]:
+ r"""Return a callable for the list scan configs method over gRPC.
+
+ Lists ScanConfigs under a given project.
+
+ Returns:
+ Callable[[~.ListScanConfigsRequest],
+ ~.ListScanConfigsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_scan_configs" not in self._stubs:
+ self._stubs["list_scan_configs"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/ListScanConfigs",
+ request_serializer=web_security_scanner.ListScanConfigsRequest.serialize,
+ response_deserializer=web_security_scanner.ListScanConfigsResponse.deserialize,
+ )
+ return self._stubs["list_scan_configs"]
+
+ @property
+ def update_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.UpdateScanConfigRequest], scan_config.ScanConfig
+ ]:
+ r"""Return a callable for the update scan config method over gRPC.
+
+        Updates a ScanConfig. This method supports partial
+        update of a ScanConfig.
+
+ Returns:
+ Callable[[~.UpdateScanConfigRequest],
+ ~.ScanConfig]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_scan_config" not in self._stubs:
+ self._stubs["update_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/UpdateScanConfig",
+ request_serializer=web_security_scanner.UpdateScanConfigRequest.serialize,
+ response_deserializer=scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["update_scan_config"]
+
+ @property
+ def start_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.StartScanRunRequest], scan_run.ScanRun]:
+ r"""Return a callable for the start scan run method over gRPC.
+
+        Starts a ScanRun according to the given ScanConfig.
+
+ Returns:
+ Callable[[~.StartScanRunRequest],
+ ~.ScanRun]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "start_scan_run" not in self._stubs:
+ self._stubs["start_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/StartScanRun",
+ request_serializer=web_security_scanner.StartScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["start_scan_run"]
+
+ @property
+ def get_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.GetScanRunRequest], scan_run.ScanRun]:
+ r"""Return a callable for the get scan run method over gRPC.
+
+ Gets a ScanRun.
+
+ Returns:
+ Callable[[~.GetScanRunRequest],
+ ~.ScanRun]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_scan_run" not in self._stubs:
+ self._stubs["get_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/GetScanRun",
+ request_serializer=web_security_scanner.GetScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["get_scan_run"]
+
+ @property
+ def list_scan_runs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanRunsRequest],
+ web_security_scanner.ListScanRunsResponse,
+ ]:
+ r"""Return a callable for the list scan runs method over gRPC.
+
+ Lists ScanRuns under a given ScanConfig, in
+ descending order of ScanRun stop time.
+
+ Returns:
+ Callable[[~.ListScanRunsRequest],
+ ~.ListScanRunsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_scan_runs" not in self._stubs:
+ self._stubs["list_scan_runs"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/ListScanRuns",
+ request_serializer=web_security_scanner.ListScanRunsRequest.serialize,
+ response_deserializer=web_security_scanner.ListScanRunsResponse.deserialize,
+ )
+ return self._stubs["list_scan_runs"]
+
+ @property
+ def stop_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.StopScanRunRequest], scan_run.ScanRun]:
+ r"""Return a callable for the stop scan run method over gRPC.
+
+ Stops a ScanRun. The stopped ScanRun is returned.
+
+ Returns:
+ Callable[[~.StopScanRunRequest],
+ ~.ScanRun]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "stop_scan_run" not in self._stubs:
+ self._stubs["stop_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/StopScanRun",
+ request_serializer=web_security_scanner.StopScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["stop_scan_run"]
+
+ @property
+ def list_crawled_urls(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListCrawledUrlsRequest],
+ web_security_scanner.ListCrawledUrlsResponse,
+ ]:
+ r"""Return a callable for the list crawled urls method over gRPC.
+
+ List CrawledUrls under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListCrawledUrlsRequest],
+ ~.ListCrawledUrlsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_crawled_urls" not in self._stubs:
+ self._stubs["list_crawled_urls"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/ListCrawledUrls",
+ request_serializer=web_security_scanner.ListCrawledUrlsRequest.serialize,
+ response_deserializer=web_security_scanner.ListCrawledUrlsResponse.deserialize,
+ )
+ return self._stubs["list_crawled_urls"]
+
+ @property
+ def get_finding(
+ self,
+ ) -> Callable[[web_security_scanner.GetFindingRequest], finding.Finding]:
+ r"""Return a callable for the get finding method over gRPC.
+
+ Gets a Finding.
+
+ Returns:
+ Callable[[~.GetFindingRequest],
+ ~.Finding]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_finding" not in self._stubs:
+ self._stubs["get_finding"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/GetFinding",
+ request_serializer=web_security_scanner.GetFindingRequest.serialize,
+ response_deserializer=finding.Finding.deserialize,
+ )
+ return self._stubs["get_finding"]
+
+ @property
+ def list_findings(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingsRequest],
+ web_security_scanner.ListFindingsResponse,
+ ]:
+ r"""Return a callable for the list findings method over gRPC.
+
+ List Findings under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListFindingsRequest],
+ ~.ListFindingsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_findings" not in self._stubs:
+ self._stubs["list_findings"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/ListFindings",
+ request_serializer=web_security_scanner.ListFindingsRequest.serialize,
+ response_deserializer=web_security_scanner.ListFindingsResponse.deserialize,
+ )
+ return self._stubs["list_findings"]
+
+ @property
+ def list_finding_type_stats(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingTypeStatsRequest],
+ web_security_scanner.ListFindingTypeStatsResponse,
+ ]:
+ r"""Return a callable for the list finding type stats method over gRPC.
+
+ List all FindingTypeStats under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListFindingTypeStatsRequest],
+ ~.ListFindingTypeStatsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_finding_type_stats" not in self._stubs:
+ self._stubs["list_finding_type_stats"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/ListFindingTypeStats",
+ request_serializer=web_security_scanner.ListFindingTypeStatsRequest.serialize,
+ response_deserializer=web_security_scanner.ListFindingTypeStatsResponse.deserialize,
+ )
+ return self._stubs["list_finding_type_stats"]
+
+ def close(self):
+ self.grpc_channel.close()
+
+ @property
+ def kind(self) -> str:
+ return "grpc"
+
+
+__all__ = ("WebSecurityScannerGrpcTransport",)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/transports/grpc_asyncio.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/transports/grpc_asyncio.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/transports/grpc_asyncio.py
@@ -0,0 +1,619 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers_async
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.protobuf import empty_pb2 # type: ignore
+import grpc # type: ignore
+from grpc.experimental import aio # type: ignore
+
+from google.cloud.websecurityscanner_v1.types import (
+ finding,
+ scan_config,
+ scan_run,
+ web_security_scanner,
+)
+
+from .base import DEFAULT_CLIENT_INFO, WebSecurityScannerTransport
+from .grpc import WebSecurityScannerGrpcTransport
+
+
+class WebSecurityScannerGrpcAsyncIOTransport(WebSecurityScannerTransport):
+ """gRPC AsyncIO backend transport for WebSecurityScanner.
+
+ Web Security Scanner Service identifies security
+ vulnerabilities in web applications hosted on Google Cloud. It
+ crawls your application, and attempts to exercise as many user
+ inputs and event handlers as possible.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _grpc_channel: aio.Channel
+ _stubs: Dict[str, Callable] = {}
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "websecurityscanner.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> aio.Channel:
+ """Create and return a gRPC AsyncIO channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ aio.Channel: A gRPC AsyncIO channel object.
+ """
+
+ return grpc_helpers_async.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ def __init__(
+ self,
+ *,
+ host: str = "websecurityscanner.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[aio.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ channel (Optional[aio.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+            always_use_jwt_access (Optional[bool]): Whether self-signed JWTs should
+                be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def grpc_channel(self) -> aio.Channel:
+ """Create the channel designed to connect to this service.
+
+ This property caches on the instance; repeated calls return
+ the same channel.
+ """
+ # Return the channel from cache.
+ return self._grpc_channel
+
+ @property
+ def create_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.CreateScanConfigRequest],
+ Awaitable[scan_config.ScanConfig],
+ ]:
+ r"""Return a callable for the create scan config method over gRPC.
+
+ Creates a new ScanConfig.
+
+ Returns:
+ Callable[[~.CreateScanConfigRequest],
+ Awaitable[~.ScanConfig]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_scan_config" not in self._stubs:
+ self._stubs["create_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/CreateScanConfig",
+ request_serializer=web_security_scanner.CreateScanConfigRequest.serialize,
+ response_deserializer=scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["create_scan_config"]
+
+ @property
+ def delete_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.DeleteScanConfigRequest], Awaitable[empty_pb2.Empty]
+ ]:
+ r"""Return a callable for the delete scan config method over gRPC.
+
+ Deletes an existing ScanConfig and its child
+ resources.
+
+ Returns:
+ Callable[[~.DeleteScanConfigRequest],
+ Awaitable[~.Empty]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_scan_config" not in self._stubs:
+ self._stubs["delete_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/DeleteScanConfig",
+ request_serializer=web_security_scanner.DeleteScanConfigRequest.serialize,
+ response_deserializer=empty_pb2.Empty.FromString,
+ )
+ return self._stubs["delete_scan_config"]
+
+ @property
+ def get_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.GetScanConfigRequest], Awaitable[scan_config.ScanConfig]
+ ]:
+ r"""Return a callable for the get scan config method over gRPC.
+
+ Gets a ScanConfig.
+
+ Returns:
+ Callable[[~.GetScanConfigRequest],
+ Awaitable[~.ScanConfig]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_scan_config" not in self._stubs:
+ self._stubs["get_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/GetScanConfig",
+ request_serializer=web_security_scanner.GetScanConfigRequest.serialize,
+ response_deserializer=scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["get_scan_config"]
+
+ @property
+ def list_scan_configs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanConfigsRequest],
+ Awaitable[web_security_scanner.ListScanConfigsResponse],
+ ]:
+ r"""Return a callable for the list scan configs method over gRPC.
+
+ Lists ScanConfigs under a given project.
+
+ Returns:
+ Callable[[~.ListScanConfigsRequest],
+ Awaitable[~.ListScanConfigsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_scan_configs" not in self._stubs:
+ self._stubs["list_scan_configs"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/ListScanConfigs",
+ request_serializer=web_security_scanner.ListScanConfigsRequest.serialize,
+ response_deserializer=web_security_scanner.ListScanConfigsResponse.deserialize,
+ )
+ return self._stubs["list_scan_configs"]
+
+ @property
+ def update_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.UpdateScanConfigRequest],
+ Awaitable[scan_config.ScanConfig],
+ ]:
+ r"""Return a callable for the update scan config method over gRPC.
+
+        Updates a ScanConfig. This method supports partial
+        update of a ScanConfig.
+
+ Returns:
+ Callable[[~.UpdateScanConfigRequest],
+ Awaitable[~.ScanConfig]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_scan_config" not in self._stubs:
+ self._stubs["update_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/UpdateScanConfig",
+ request_serializer=web_security_scanner.UpdateScanConfigRequest.serialize,
+ response_deserializer=scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["update_scan_config"]
+
+ @property
+ def start_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.StartScanRunRequest], Awaitable[scan_run.ScanRun]
+ ]:
+ r"""Return a callable for the start scan run method over gRPC.
+
+        Starts a ScanRun according to the given ScanConfig.
+
+ Returns:
+ Callable[[~.StartScanRunRequest],
+ Awaitable[~.ScanRun]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "start_scan_run" not in self._stubs:
+ self._stubs["start_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/StartScanRun",
+ request_serializer=web_security_scanner.StartScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["start_scan_run"]
+
+ @property
+ def get_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.GetScanRunRequest], Awaitable[scan_run.ScanRun]
+ ]:
+ r"""Return a callable for the get scan run method over gRPC.
+
+ Gets a ScanRun.
+
+ Returns:
+ Callable[[~.GetScanRunRequest],
+ Awaitable[~.ScanRun]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_scan_run" not in self._stubs:
+ self._stubs["get_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/GetScanRun",
+ request_serializer=web_security_scanner.GetScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["get_scan_run"]
+
+ @property
+ def list_scan_runs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanRunsRequest],
+ Awaitable[web_security_scanner.ListScanRunsResponse],
+ ]:
+ r"""Return a callable for the list scan runs method over gRPC.
+
+ Lists ScanRuns under a given ScanConfig, in
+ descending order of ScanRun stop time.
+
+ Returns:
+ Callable[[~.ListScanRunsRequest],
+ Awaitable[~.ListScanRunsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_scan_runs" not in self._stubs:
+ self._stubs["list_scan_runs"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/ListScanRuns",
+ request_serializer=web_security_scanner.ListScanRunsRequest.serialize,
+ response_deserializer=web_security_scanner.ListScanRunsResponse.deserialize,
+ )
+ return self._stubs["list_scan_runs"]
+
+ @property
+ def stop_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.StopScanRunRequest], Awaitable[scan_run.ScanRun]
+ ]:
+ r"""Return a callable for the stop scan run method over gRPC.
+
+ Stops a ScanRun. The stopped ScanRun is returned.
+
+ Returns:
+ Callable[[~.StopScanRunRequest],
+ Awaitable[~.ScanRun]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "stop_scan_run" not in self._stubs:
+ self._stubs["stop_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/StopScanRun",
+ request_serializer=web_security_scanner.StopScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["stop_scan_run"]
+
+ @property
+ def list_crawled_urls(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListCrawledUrlsRequest],
+ Awaitable[web_security_scanner.ListCrawledUrlsResponse],
+ ]:
+ r"""Return a callable for the list crawled urls method over gRPC.
+
+ List CrawledUrls under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListCrawledUrlsRequest],
+ Awaitable[~.ListCrawledUrlsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_crawled_urls" not in self._stubs:
+ self._stubs["list_crawled_urls"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/ListCrawledUrls",
+ request_serializer=web_security_scanner.ListCrawledUrlsRequest.serialize,
+ response_deserializer=web_security_scanner.ListCrawledUrlsResponse.deserialize,
+ )
+ return self._stubs["list_crawled_urls"]
+
+ @property
+ def get_finding(
+ self,
+ ) -> Callable[[web_security_scanner.GetFindingRequest], Awaitable[finding.Finding]]:
+ r"""Return a callable for the get finding method over gRPC.
+
+ Gets a Finding.
+
+ Returns:
+ Callable[[~.GetFindingRequest],
+ Awaitable[~.Finding]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_finding" not in self._stubs:
+ self._stubs["get_finding"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/GetFinding",
+ request_serializer=web_security_scanner.GetFindingRequest.serialize,
+ response_deserializer=finding.Finding.deserialize,
+ )
+ return self._stubs["get_finding"]
+
+ @property
+ def list_findings(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingsRequest],
+ Awaitable[web_security_scanner.ListFindingsResponse],
+ ]:
+ r"""Return a callable for the list findings method over gRPC.
+
+ List Findings under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListFindingsRequest],
+ Awaitable[~.ListFindingsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_findings" not in self._stubs:
+ self._stubs["list_findings"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/ListFindings",
+ request_serializer=web_security_scanner.ListFindingsRequest.serialize,
+ response_deserializer=web_security_scanner.ListFindingsResponse.deserialize,
+ )
+ return self._stubs["list_findings"]
+
+ @property
+ def list_finding_type_stats(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingTypeStatsRequest],
+ Awaitable[web_security_scanner.ListFindingTypeStatsResponse],
+ ]:
+ r"""Return a callable for the list finding type stats method over gRPC.
+
+ List all FindingTypeStats under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListFindingTypeStatsRequest],
+ Awaitable[~.ListFindingTypeStatsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_finding_type_stats" not in self._stubs:
+ self._stubs["list_finding_type_stats"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1.WebSecurityScanner/ListFindingTypeStats",
+ request_serializer=web_security_scanner.ListFindingTypeStatsRequest.serialize,
+ response_deserializer=web_security_scanner.ListFindingTypeStatsResponse.deserialize,
+ )
+ return self._stubs["list_finding_type_stats"]
+
+ def close(self):
+ return self.grpc_channel.close()
+
+
+__all__ = ("WebSecurityScannerGrpcAsyncIOTransport",)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/transports/rest.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/transports/rest.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/services/web_security_scanner/transports/rest.py
@@ -0,0 +1,1721 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import dataclasses
+import json # type: ignore
+import re
+from typing import Any, Callable, Dict, List, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, path_template, rest_helpers, rest_streaming
+from google.api_core import exceptions as core_exceptions
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.auth.transport.requests import AuthorizedSession # type: ignore
+from google.protobuf import json_format
+import grpc # type: ignore
+from requests import __version__ as requests_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+
+from google.protobuf import empty_pb2 # type: ignore
+
+from google.cloud.websecurityscanner_v1.types import (
+ finding,
+ scan_config,
+ scan_run,
+ web_security_scanner,
+)
+
+from .base import DEFAULT_CLIENT_INFO as BASE_DEFAULT_CLIENT_INFO
+from .base import WebSecurityScannerTransport
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=BASE_DEFAULT_CLIENT_INFO.gapic_version,
+ grpc_version=None,
+ rest_version=requests_version,
+)
+
+
+class WebSecurityScannerRestInterceptor:
+ """Interceptor for WebSecurityScanner.
+
+ Interceptors are used to manipulate requests, request metadata, and responses
+ in arbitrary ways.
+ Example use cases include:
+ * Logging
+ * Verifying requests according to service or custom semantics
+ * Stripping extraneous information from responses
+
+ These use cases and more can be enabled by injecting an
+ instance of a custom subclass when constructing the WebSecurityScannerRestTransport.
+
+ .. code-block:: python
+ class MyCustomWebSecurityScannerInterceptor(WebSecurityScannerRestInterceptor):
+ def pre_create_scan_config(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_create_scan_config(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_delete_scan_config(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def pre_get_finding(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_finding(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_scan_config(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_scan_config(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_scan_run(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_scan_run(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_crawled_urls(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_crawled_urls(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_findings(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_findings(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_finding_type_stats(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_finding_type_stats(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_scan_configs(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_scan_configs(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_scan_runs(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_scan_runs(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_start_scan_run(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_start_scan_run(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_stop_scan_run(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_stop_scan_run(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_update_scan_config(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_update_scan_config(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ transport = WebSecurityScannerRestTransport(interceptor=MyCustomWebSecurityScannerInterceptor())
+ client = WebSecurityScannerClient(transport=transport)
+
+
+ """
+
+ def pre_create_scan_config(
+ self,
+ request: web_security_scanner.CreateScanConfigRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.CreateScanConfigRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for create_scan_config
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_create_scan_config(
+ self, response: scan_config.ScanConfig
+ ) -> scan_config.ScanConfig:
+ """Post-rpc interceptor for create_scan_config
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_delete_scan_config(
+ self,
+ request: web_security_scanner.DeleteScanConfigRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.DeleteScanConfigRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for delete_scan_config
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def pre_get_finding(
+ self,
+ request: web_security_scanner.GetFindingRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.GetFindingRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_finding
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_get_finding(self, response: finding.Finding) -> finding.Finding:
+ """Post-rpc interceptor for get_finding
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_scan_config(
+ self,
+ request: web_security_scanner.GetScanConfigRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.GetScanConfigRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_scan_config
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_get_scan_config(
+ self, response: scan_config.ScanConfig
+ ) -> scan_config.ScanConfig:
+ """Post-rpc interceptor for get_scan_config
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_scan_run(
+ self,
+ request: web_security_scanner.GetScanRunRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.GetScanRunRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_scan_run
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_get_scan_run(self, response: scan_run.ScanRun) -> scan_run.ScanRun:
+ """Post-rpc interceptor for get_scan_run
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_crawled_urls(
+ self,
+ request: web_security_scanner.ListCrawledUrlsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.ListCrawledUrlsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_crawled_urls
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_list_crawled_urls(
+ self, response: web_security_scanner.ListCrawledUrlsResponse
+ ) -> web_security_scanner.ListCrawledUrlsResponse:
+ """Post-rpc interceptor for list_crawled_urls
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_findings(
+ self,
+ request: web_security_scanner.ListFindingsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.ListFindingsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_findings
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_list_findings(
+ self, response: web_security_scanner.ListFindingsResponse
+ ) -> web_security_scanner.ListFindingsResponse:
+ """Post-rpc interceptor for list_findings
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_finding_type_stats(
+ self,
+ request: web_security_scanner.ListFindingTypeStatsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[
+ web_security_scanner.ListFindingTypeStatsRequest, Sequence[Tuple[str, str]]
+ ]:
+ """Pre-rpc interceptor for list_finding_type_stats
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_list_finding_type_stats(
+ self, response: web_security_scanner.ListFindingTypeStatsResponse
+ ) -> web_security_scanner.ListFindingTypeStatsResponse:
+ """Post-rpc interceptor for list_finding_type_stats
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_scan_configs(
+ self,
+ request: web_security_scanner.ListScanConfigsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.ListScanConfigsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_scan_configs
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_list_scan_configs(
+ self, response: web_security_scanner.ListScanConfigsResponse
+ ) -> web_security_scanner.ListScanConfigsResponse:
+ """Post-rpc interceptor for list_scan_configs
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_scan_runs(
+ self,
+ request: web_security_scanner.ListScanRunsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.ListScanRunsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_scan_runs
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_list_scan_runs(
+ self, response: web_security_scanner.ListScanRunsResponse
+ ) -> web_security_scanner.ListScanRunsResponse:
+ """Post-rpc interceptor for list_scan_runs
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_start_scan_run(
+ self,
+ request: web_security_scanner.StartScanRunRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.StartScanRunRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for start_scan_run
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_start_scan_run(self, response: scan_run.ScanRun) -> scan_run.ScanRun:
+ """Post-rpc interceptor for start_scan_run
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_stop_scan_run(
+ self,
+ request: web_security_scanner.StopScanRunRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.StopScanRunRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for stop_scan_run
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_stop_scan_run(self, response: scan_run.ScanRun) -> scan_run.ScanRun:
+ """Post-rpc interceptor for stop_scan_run
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_update_scan_config(
+ self,
+ request: web_security_scanner.UpdateScanConfigRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.UpdateScanConfigRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for update_scan_config
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_update_scan_config(
+ self, response: scan_config.ScanConfig
+ ) -> scan_config.ScanConfig:
+ """Post-rpc interceptor for update_scan_config
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+
+@dataclasses.dataclass
+class WebSecurityScannerRestStub:
+ _session: AuthorizedSession
+ _host: str
+ _interceptor: WebSecurityScannerRestInterceptor
+
+
+class WebSecurityScannerRestTransport(WebSecurityScannerTransport):
+ """REST backend transport for WebSecurityScanner.
+
+ Web Security Scanner Service identifies security
+ vulnerabilities in web applications hosted on Google Cloud. It
+ crawls your application, and attempts to exercise as many user
+ inputs and event handlers as possible.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends JSON representations of protocol buffers over HTTP/1.1
+
+ """
+
+ def __init__(
+ self,
+ *,
+ host: str = "websecurityscanner.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ url_scheme: str = "https",
+ interceptor: Optional[WebSecurityScannerRestInterceptor] = None,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Callable[[], Tuple[bytes, bytes]]): Client
+ certificate to configure mutual TLS HTTP channel. It is ignored
+ if ``channel`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you are developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ url_scheme: the protocol scheme for the API endpoint. Normally
+ "https", but for testing or local servers,
+ "http" can be specified.
+ """
+ # Run the base constructor
+ # TODO(yon-mg): resolve other ctor params i.e. scopes, quota, etc.
+ # TODO: When custom host (api_endpoint) is set, `scopes` must *also* be set on the
+ # credentials object
+ maybe_url_match = re.match("^(?P<scheme>http(?:s)?://)?(?P<host>.*)$", host)
+ if maybe_url_match is None:
+ raise ValueError(
+ f"Unexpected hostname structure: {host}"
+ ) # pragma: NO COVER
+
+ url_match_items = maybe_url_match.groupdict()
+
+ host = f"{url_scheme}://{host}" if not url_match_items["scheme"] else host
+
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+ self._session = AuthorizedSession(
+ self._credentials, default_host=self.DEFAULT_HOST
+ )
+ if client_cert_source_for_mtls:
+ self._session.configure_mtls_channel(client_cert_source_for_mtls)
+ self._interceptor = interceptor or WebSecurityScannerRestInterceptor()
+ self._prep_wrapped_messages(client_info)
+
+ class _CreateScanConfig(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("CreateScanConfig")
+
+ def __call__(
+ self,
+ request: web_security_scanner.CreateScanConfigRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_config.ScanConfig:
+ r"""Call the create scan config method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.CreateScanConfigRequest):
+ The request object. Request for the ``CreateScanConfig`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.scan_config.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1/{parent=projects/*}/scanConfigs",
+ "body": "scan_config",
+ },
+ ]
+ request, metadata = self._interceptor.pre_create_scan_config(
+ request, metadata
+ )
+ pb_request = web_security_scanner.CreateScanConfigRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = scan_config.ScanConfig()
+ pb_resp = scan_config.ScanConfig.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_create_scan_config(resp)
+ return resp
+
+ class _DeleteScanConfig(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("DeleteScanConfig")
+
+ def __call__(
+ self,
+ request: web_security_scanner.DeleteScanConfigRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ):
+ r"""Call the delete scan config method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.DeleteScanConfigRequest):
+ The request object. Request for the ``DeleteScanConfig`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "delete",
+ "uri": "/v1/{name=projects/*/scanConfigs/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_delete_scan_config(
+ request, metadata
+ )
+ pb_request = web_security_scanner.DeleteScanConfigRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ class _GetFinding(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("GetFinding")
+
+ def __call__(
+ self,
+ request: web_security_scanner.GetFindingRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> finding.Finding:
+ r"""Call the get finding method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.GetFindingRequest):
+ The request object. Request for the ``GetFinding`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.finding.Finding:
+ A Finding resource represents a
+ vulnerability instance identified during
+ a ScanRun.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1/{name=projects/*/scanConfigs/*/scanRuns/*/findings/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_finding(request, metadata)
+ pb_request = web_security_scanner.GetFindingRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = finding.Finding()
+ pb_resp = finding.Finding.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_finding(resp)
+ return resp
+
+ class _GetScanConfig(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("GetScanConfig")
+
+ def __call__(
+ self,
+ request: web_security_scanner.GetScanConfigRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_config.ScanConfig:
+ r"""Call the get scan config method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.GetScanConfigRequest):
+ The request object. Request for the ``GetScanConfig`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.scan_config.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1/{name=projects/*/scanConfigs/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_scan_config(request, metadata)
+ pb_request = web_security_scanner.GetScanConfigRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = scan_config.ScanConfig()
+ pb_resp = scan_config.ScanConfig.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_scan_config(resp)
+ return resp
+
+ class _GetScanRun(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("GetScanRun")
+
+ def __call__(
+ self,
+ request: web_security_scanner.GetScanRunRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Call the get scan run method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.GetScanRunRequest):
+ The request object. Request for the ``GetScanRun`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.scan_run.ScanRun:
+                A ScanRun is an output-only resource
+ representing an actual run of the scan.
+ Next id: 12
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1/{name=projects/*/scanConfigs/*/scanRuns/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_scan_run(request, metadata)
+ pb_request = web_security_scanner.GetScanRunRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = scan_run.ScanRun()
+ pb_resp = scan_run.ScanRun.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_scan_run(resp)
+ return resp
+
+ class _ListCrawledUrls(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("ListCrawledUrls")
+
+ def __call__(
+ self,
+ request: web_security_scanner.ListCrawledUrlsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListCrawledUrlsResponse:
+ r"""Call the list crawled urls method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.ListCrawledUrlsRequest):
+ The request object. Request for the ``ListCrawledUrls`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.web_security_scanner.ListCrawledUrlsResponse:
+ Response for the ``ListCrawledUrls`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1/{parent=projects/*/scanConfigs/*/scanRuns/*}/crawledUrls",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_crawled_urls(
+ request, metadata
+ )
+ pb_request = web_security_scanner.ListCrawledUrlsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = web_security_scanner.ListCrawledUrlsResponse()
+ pb_resp = web_security_scanner.ListCrawledUrlsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_crawled_urls(resp)
+ return resp
+
+ class _ListFindings(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("ListFindings")
+
+ def __call__(
+ self,
+ request: web_security_scanner.ListFindingsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListFindingsResponse:
+ r"""Call the list findings method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.ListFindingsRequest):
+ The request object. Request for the ``ListFindings`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.web_security_scanner.ListFindingsResponse:
+ Response for the ``ListFindings`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1/{parent=projects/*/scanConfigs/*/scanRuns/*}/findings",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_findings(request, metadata)
+ pb_request = web_security_scanner.ListFindingsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = web_security_scanner.ListFindingsResponse()
+ pb_resp = web_security_scanner.ListFindingsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_findings(resp)
+ return resp
+
+ class _ListFindingTypeStats(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("ListFindingTypeStats")
+
+ def __call__(
+ self,
+ request: web_security_scanner.ListFindingTypeStatsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListFindingTypeStatsResponse:
+ r"""Call the list finding type stats method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.ListFindingTypeStatsRequest):
+ The request object. Request for the ``ListFindingTypeStats`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.web_security_scanner.ListFindingTypeStatsResponse:
+ Response for the ``ListFindingTypeStats`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1/{parent=projects/*/scanConfigs/*/scanRuns/*}/findingTypeStats",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_finding_type_stats(
+ request, metadata
+ )
+ pb_request = web_security_scanner.ListFindingTypeStatsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = web_security_scanner.ListFindingTypeStatsResponse()
+ pb_resp = web_security_scanner.ListFindingTypeStatsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_finding_type_stats(resp)
+ return resp
+
+ class _ListScanConfigs(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("ListScanConfigs")
+
+ def __call__(
+ self,
+ request: web_security_scanner.ListScanConfigsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListScanConfigsResponse:
+ r"""Call the list scan configs method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.ListScanConfigsRequest):
+ The request object. Request for the ``ListScanConfigs`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.web_security_scanner.ListScanConfigsResponse:
+ Response for the ``ListScanConfigs`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1/{parent=projects/*}/scanConfigs",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_scan_configs(
+ request, metadata
+ )
+ pb_request = web_security_scanner.ListScanConfigsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = web_security_scanner.ListScanConfigsResponse()
+ pb_resp = web_security_scanner.ListScanConfigsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_scan_configs(resp)
+ return resp
+
+ class _ListScanRuns(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("ListScanRuns")
+
+ def __call__(
+ self,
+ request: web_security_scanner.ListScanRunsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListScanRunsResponse:
+ r"""Call the list scan runs method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.ListScanRunsRequest):
+ The request object. Request for the ``ListScanRuns`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.web_security_scanner.ListScanRunsResponse:
+ Response for the ``ListScanRuns`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1/{parent=projects/*/scanConfigs/*}/scanRuns",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_scan_runs(request, metadata)
+ pb_request = web_security_scanner.ListScanRunsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = web_security_scanner.ListScanRunsResponse()
+ pb_resp = web_security_scanner.ListScanRunsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_scan_runs(resp)
+ return resp
+
+ class _StartScanRun(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("StartScanRun")
+
+ def __call__(
+ self,
+ request: web_security_scanner.StartScanRunRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Call the start scan run method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.StartScanRunRequest):
+ The request object. Request for the ``StartScanRun`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.scan_run.ScanRun:
+                    A ScanRun is an output-only resource
+ representing an actual run of the scan.
+ Next id: 12
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1/{name=projects/*/scanConfigs/*}:start",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_start_scan_run(request, metadata)
+ pb_request = web_security_scanner.StartScanRunRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = scan_run.ScanRun()
+ pb_resp = scan_run.ScanRun.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_start_scan_run(resp)
+ return resp
+
+ class _StopScanRun(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("StopScanRun")
+
+ def __call__(
+ self,
+ request: web_security_scanner.StopScanRunRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Call the stop scan run method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.StopScanRunRequest):
+ The request object. Request for the ``StopScanRun`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.scan_run.ScanRun:
+                    A ScanRun is an output-only resource
+ representing an actual run of the scan.
+ Next id: 12
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1/{name=projects/*/scanConfigs/*/scanRuns/*}:stop",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_stop_scan_run(request, metadata)
+ pb_request = web_security_scanner.StopScanRunRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = scan_run.ScanRun()
+ pb_resp = scan_run.ScanRun.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_stop_scan_run(resp)
+ return resp
+
+ class _UpdateScanConfig(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("UpdateScanConfig")
+
+ def __call__(
+ self,
+ request: web_security_scanner.UpdateScanConfigRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_config.ScanConfig:
+ r"""Call the update scan config method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.UpdateScanConfigRequest):
+ The request object. Request for the ``UpdateScanConfigRequest`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.scan_config.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "patch",
+ "uri": "/v1/{scan_config.name=projects/*/scanConfigs/*}",
+ "body": "scan_config",
+ },
+ ]
+ request, metadata = self._interceptor.pre_update_scan_config(
+ request, metadata
+ )
+ pb_request = web_security_scanner.UpdateScanConfigRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = scan_config.ScanConfig()
+ pb_resp = scan_config.ScanConfig.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_update_scan_config(resp)
+ return resp
+
+ @property
+ def create_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.CreateScanConfigRequest], scan_config.ScanConfig
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._CreateScanConfig(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def delete_scan_config(
+ self,
+ ) -> Callable[[web_security_scanner.DeleteScanConfigRequest], empty_pb2.Empty]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._DeleteScanConfig(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_finding(
+ self,
+ ) -> Callable[[web_security_scanner.GetFindingRequest], finding.Finding]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetFinding(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_scan_config(
+ self,
+ ) -> Callable[[web_security_scanner.GetScanConfigRequest], scan_config.ScanConfig]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetScanConfig(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.GetScanRunRequest], scan_run.ScanRun]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetScanRun(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_crawled_urls(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListCrawledUrlsRequest],
+ web_security_scanner.ListCrawledUrlsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListCrawledUrls(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_findings(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingsRequest],
+ web_security_scanner.ListFindingsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListFindings(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_finding_type_stats(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingTypeStatsRequest],
+ web_security_scanner.ListFindingTypeStatsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListFindingTypeStats(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_scan_configs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanConfigsRequest],
+ web_security_scanner.ListScanConfigsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListScanConfigs(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_scan_runs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanRunsRequest],
+ web_security_scanner.ListScanRunsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListScanRuns(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def start_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.StartScanRunRequest], scan_run.ScanRun]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._StartScanRun(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def stop_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.StopScanRunRequest], scan_run.ScanRun]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._StopScanRun(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def update_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.UpdateScanConfigRequest], scan_config.ScanConfig
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._UpdateScanConfig(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def kind(self) -> str:
+ return "rest"
+
+ def close(self):
+ self._session.close()
+
+
+__all__ = ("WebSecurityScannerRestTransport",)
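
Review note: every stub in this transport follows the same shape — transcode the request against its `http_options`, JSON-encode the body and query params, then dispatch through the session. As a library-free sketch of the URI-transcoding step (the real logic lives in `google.api_core.path_template.transcode`; the regex handling below is a deliberate simplification that ignores `**` edge cases and dotted field paths like `scan_config.name`):

```python
import re

def transcode_uri(template: str, **fields) -> str:
    """Expand a gRPC-transcoding URI template such as
    '/v1/{name=projects/*/scanConfigs/*}' by substituting the named
    field and validating it against the '*'-pattern (simplified)."""
    def repl(match):
        name, pattern = match.group(1), match.group(2) or "*"
        value = fields[name]
        # Each '*' matches one path segment; '**' matches the remainder.
        regex = pattern.replace("**", ".+").replace("*", "[^/]+")
        if not re.fullmatch(regex, value):
            raise ValueError(f"{value!r} does not match pattern {pattern!r}")
        return value
    return re.sub(r"\{([\w.]+)(?:=([^}]+))?\}", repl, template)

# Mirrors the _GetScanConfig http_options above.
uri = transcode_uri(
    "/v1/{name=projects/*/scanConfigs/*}",
    name="projects/p1/scanConfigs/c1",
)
```

With a matching field value this yields the concrete request path; a value that does not fit the declared pattern is rejected before any request is sent, which is the same contract `path_template.transcode` enforces.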
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/__init__.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/__init__.py
@@ -0,0 +1,90 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .crawled_url import CrawledUrl
+from .finding import Finding
+from .finding_addon import (
+ Form,
+ OutdatedLibrary,
+ ViolatingResource,
+ VulnerableHeaders,
+ VulnerableParameters,
+ Xss,
+ Xxe,
+)
+from .finding_type_stats import FindingTypeStats
+from .scan_config import ScanConfig
+from .scan_config_error import ScanConfigError
+from .scan_run import ScanRun
+from .scan_run_error_trace import ScanRunErrorTrace
+from .scan_run_log import ScanRunLog
+from .scan_run_warning_trace import ScanRunWarningTrace
+from .web_security_scanner import (
+ CreateScanConfigRequest,
+ DeleteScanConfigRequest,
+ GetFindingRequest,
+ GetScanConfigRequest,
+ GetScanRunRequest,
+ ListCrawledUrlsRequest,
+ ListCrawledUrlsResponse,
+ ListFindingsRequest,
+ ListFindingsResponse,
+ ListFindingTypeStatsRequest,
+ ListFindingTypeStatsResponse,
+ ListScanConfigsRequest,
+ ListScanConfigsResponse,
+ ListScanRunsRequest,
+ ListScanRunsResponse,
+ StartScanRunRequest,
+ StopScanRunRequest,
+ UpdateScanConfigRequest,
+)
+
+__all__ = (
+ "CrawledUrl",
+ "Finding",
+ "Form",
+ "OutdatedLibrary",
+ "ViolatingResource",
+ "VulnerableHeaders",
+ "VulnerableParameters",
+ "Xss",
+ "Xxe",
+ "FindingTypeStats",
+ "ScanConfig",
+ "ScanConfigError",
+ "ScanRun",
+ "ScanRunErrorTrace",
+ "ScanRunLog",
+ "ScanRunWarningTrace",
+ "CreateScanConfigRequest",
+ "DeleteScanConfigRequest",
+ "GetFindingRequest",
+ "GetScanConfigRequest",
+ "GetScanRunRequest",
+ "ListCrawledUrlsRequest",
+ "ListCrawledUrlsResponse",
+ "ListFindingsRequest",
+ "ListFindingsResponse",
+ "ListFindingTypeStatsRequest",
+ "ListFindingTypeStatsResponse",
+ "ListScanConfigsRequest",
+ "ListScanConfigsResponse",
+ "ListScanRunsRequest",
+ "ListScanRunsResponse",
+ "StartScanRunRequest",
+ "StopScanRunRequest",
+ "UpdateScanConfigRequest",
+)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/crawled_url.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/crawled_url.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/crawled_url.py
@@ -0,0 +1,61 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1",
+ manifest={
+ "CrawledUrl",
+ },
+)
+
+
+class CrawledUrl(proto.Message):
+ r"""A CrawledUrl resource represents a URL that was crawled
+ during a ScanRun. Web Security Scanner Service crawls the web
+ applications, following all links within the scope of sites, to
+ find the URLs to test against.
+
+ Attributes:
+ http_method (str):
+ Output only. The http method of the request
+ that was used to visit the URL, in uppercase.
+ url (str):
+ Output only. The URL that was crawled.
+ body (str):
+ Output only. The body of the request that was
+ used to visit the URL.
+ """
+
+ http_method: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ url: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ body: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
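
Review note: a plain-Python stand-in for the message above, for illustration only — the generated `CrawledUrl` is a proto-plus class with protobuf serialization, but its three output-only string fields likewise default to empty strings:

```python
from dataclasses import dataclass

@dataclass
class CrawledUrlSketch:
    """Hypothetical analogue of CrawledUrl; mirrors its three string fields."""
    http_method: str = ""
    url: str = ""
    body: str = ""

crawled = CrawledUrlSketch(http_method="GET", url="https://example.com/login")
```
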
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/finding.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/finding.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/finding.py
@@ -0,0 +1,208 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+from google.cloud.websecurityscanner_v1.types import finding_addon
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1",
+ manifest={
+ "Finding",
+ },
+)
+
+
+class Finding(proto.Message):
+ r"""A Finding resource represents a vulnerability instance
+ identified during a ScanRun.
+
+ Attributes:
+ name (str):
+ Output only. The resource name of the
+ Finding. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanruns/{scanRunId}/findings/{findingId}'.
+ The finding IDs are generated by the system.
+ finding_type (str):
+ Output only. The type of the Finding.
+ Detailed and up-to-date information on findings
+ can be found here:
+ https://cloud.google.com/security-command-center/docs/how-to-remediate-web-security-scanner-findings
+ severity (google.cloud.websecurityscanner_v1.types.Finding.Severity):
+ Output only. The severity level of the
+ reported vulnerability.
+ http_method (str):
+ Output only. The http method of the request
+ that triggered the vulnerability, in uppercase.
+ fuzzed_url (str):
+ Output only. The URL produced by the
+ server-side fuzzer and used in the request that
+ triggered the vulnerability.
+ body (str):
+ Output only. The body of the request that
+ triggered the vulnerability.
+ description (str):
+ Output only. The description of the
+ vulnerability.
+ reproduction_url (str):
+ Output only. The URL containing
+ human-readable payload that user can leverage to
+ reproduce the vulnerability.
+ frame_url (str):
+ Output only. If the vulnerability was
+ originated from nested IFrame, the immediate
+ parent IFrame is reported.
+ final_url (str):
+ Output only. The URL where the browser lands
+ when the vulnerability is detected.
+ tracking_id (str):
+ Output only. The tracking ID uniquely
+ identifies a vulnerability instance across
+ multiple ScanRuns.
+ form (google.cloud.websecurityscanner_v1.types.Form):
+ Output only. An addon containing information
+ reported for a vulnerability with an HTML form,
+ if any.
+ outdated_library (google.cloud.websecurityscanner_v1.types.OutdatedLibrary):
+ Output only. An addon containing information
+ about outdated libraries.
+ violating_resource (google.cloud.websecurityscanner_v1.types.ViolatingResource):
+ Output only. An addon containing detailed
+ information regarding any resource causing the
+ vulnerability such as JavaScript sources, image,
+ audio files, etc.
+ vulnerable_headers (google.cloud.websecurityscanner_v1.types.VulnerableHeaders):
+ Output only. An addon containing information
+ about vulnerable or missing HTTP headers.
+ vulnerable_parameters (google.cloud.websecurityscanner_v1.types.VulnerableParameters):
+ Output only. An addon containing information
+ about request parameters which were found to be
+ vulnerable.
+ xss (google.cloud.websecurityscanner_v1.types.Xss):
+ Output only. An addon containing information
+ reported for an XSS, if any.
+ xxe (google.cloud.websecurityscanner_v1.types.Xxe):
+ Output only. An addon containing information
+ reported for an XXE, if any.
+ """
+
+ class Severity(proto.Enum):
+ r"""The severity level of a vulnerability.
+
+ Values:
+ SEVERITY_UNSPECIFIED (0):
+ No severity specified. The default value.
+ CRITICAL (1):
+ Critical severity.
+ HIGH (2):
+ High severity.
+ MEDIUM (3):
+ Medium severity.
+ LOW (4):
+ Low severity.
+ """
+ SEVERITY_UNSPECIFIED = 0
+ CRITICAL = 1
+ HIGH = 2
+ MEDIUM = 3
+ LOW = 4
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ finding_type: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ severity: Severity = proto.Field(
+ proto.ENUM,
+ number=17,
+ enum=Severity,
+ )
+ http_method: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ fuzzed_url: str = proto.Field(
+ proto.STRING,
+ number=4,
+ )
+ body: str = proto.Field(
+ proto.STRING,
+ number=5,
+ )
+ description: str = proto.Field(
+ proto.STRING,
+ number=6,
+ )
+ reproduction_url: str = proto.Field(
+ proto.STRING,
+ number=7,
+ )
+ frame_url: str = proto.Field(
+ proto.STRING,
+ number=8,
+ )
+ final_url: str = proto.Field(
+ proto.STRING,
+ number=9,
+ )
+ tracking_id: str = proto.Field(
+ proto.STRING,
+ number=10,
+ )
+ form: finding_addon.Form = proto.Field(
+ proto.MESSAGE,
+ number=16,
+ message=finding_addon.Form,
+ )
+ outdated_library: finding_addon.OutdatedLibrary = proto.Field(
+ proto.MESSAGE,
+ number=11,
+ message=finding_addon.OutdatedLibrary,
+ )
+ violating_resource: finding_addon.ViolatingResource = proto.Field(
+ proto.MESSAGE,
+ number=12,
+ message=finding_addon.ViolatingResource,
+ )
+ vulnerable_headers: finding_addon.VulnerableHeaders = proto.Field(
+ proto.MESSAGE,
+ number=15,
+ message=finding_addon.VulnerableHeaders,
+ )
+ vulnerable_parameters: finding_addon.VulnerableParameters = proto.Field(
+ proto.MESSAGE,
+ number=13,
+ message=finding_addon.VulnerableParameters,
+ )
+ xss: finding_addon.Xss = proto.Field(
+ proto.MESSAGE,
+ number=14,
+ message=finding_addon.Xss,
+ )
+ xxe: finding_addon.Xxe = proto.Field(
+ proto.MESSAGE,
+ number=18,
+ message=finding_addon.Xxe,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
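The `Finding.Severity` values above are worth noting: `CRITICAL` is 1 and `LOW` is 4, so a *smaller* number (other than 0) means a *more* severe finding, and naive `>` comparisons would rank severities backwards. A plain-Python sketch using `IntEnum` (the generated class is actually a `proto.Enum`, but names and values match) shows one safe way to triage:

```python
from enum import IntEnum

# Plain-Python sketch of Finding.Severity as defined above. Note the
# inverted numbering: CRITICAL=1 is the most severe, LOW=4 the least,
# and 0 means "unspecified" -- so compare by membership, not by "<".
class Severity(IntEnum):
    SEVERITY_UNSPECIFIED = 0
    CRITICAL = 1
    HIGH = 2
    MEDIUM = 3
    LOW = 4

def is_actionable(severity: Severity) -> bool:
    """Treat CRITICAL and HIGH findings as requiring immediate action."""
    return severity in (Severity.CRITICAL, Severity.HIGH)

print(is_actionable(Severity.CRITICAL))  # True
print(is_actionable(Severity.LOW))       # False
```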
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/finding_addon.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/finding_addon.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/finding_addon.py
@@ -0,0 +1,304 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1",
+ manifest={
+ "Form",
+ "OutdatedLibrary",
+ "ViolatingResource",
+ "VulnerableParameters",
+ "VulnerableHeaders",
+ "Xss",
+ "Xxe",
+ },
+)
+
+
+class Form(proto.Message):
+    r"""! Information about a vulnerability with an HTML form.
+
+ Attributes:
+ action_uri (str):
+ ! The URI where to send the form when it's
+ submitted.
+ fields (MutableSequence[str]):
+ ! The names of form fields related to the
+ vulnerability.
+ """
+
+ action_uri: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ fields: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=2,
+ )
+
+
+class OutdatedLibrary(proto.Message):
+ r"""Information reported for an outdated library.
+
+ Attributes:
+ library_name (str):
+ The name of the outdated library.
+ version (str):
+ The version number.
+ learn_more_urls (MutableSequence[str]):
+ URLs to learn more information about the
+ vulnerabilities in the library.
+ """
+
+ library_name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ version: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ learn_more_urls: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=3,
+ )
+
+
+class ViolatingResource(proto.Message):
+ r"""Information regarding any resource causing the vulnerability
+ such as JavaScript sources, image, audio files, etc.
+
+ Attributes:
+ content_type (str):
+ The MIME type of this resource.
+ resource_url (str):
+ URL of this violating resource.
+ """
+
+ content_type: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ resource_url: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class VulnerableParameters(proto.Message):
+ r"""Information about vulnerable request parameters.
+
+ Attributes:
+ parameter_names (MutableSequence[str]):
+ The vulnerable parameter names.
+ """
+
+ parameter_names: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=1,
+ )
+
+
+class VulnerableHeaders(proto.Message):
+ r"""Information about vulnerable or missing HTTP Headers.
+
+ Attributes:
+ headers (MutableSequence[google.cloud.websecurityscanner_v1.types.VulnerableHeaders.Header]):
+ List of vulnerable headers.
+ missing_headers (MutableSequence[google.cloud.websecurityscanner_v1.types.VulnerableHeaders.Header]):
+ List of missing headers.
+ """
+
+ class Header(proto.Message):
+        r"""Describes an HTTP Header.
+
+ Attributes:
+ name (str):
+ Header name.
+ value (str):
+ Header value.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ value: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+ headers: MutableSequence[Header] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=Header,
+ )
+ missing_headers: MutableSequence[Header] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=2,
+ message=Header,
+ )
+
+
+class Xss(proto.Message):
+ r"""Information reported for an XSS.
+
+ Attributes:
+ stack_traces (MutableSequence[str]):
+ Stack traces leading to the point where the
+ XSS occurred.
+ error_message (str):
+ An error message generated by a javascript
+ breakage.
+ attack_vector (google.cloud.websecurityscanner_v1.types.Xss.AttackVector):
+ The attack vector of the payload triggering
+ this XSS.
+ stored_xss_seeding_url (str):
+ The reproduction url for the seeding POST
+ request of a Stored XSS.
+ """
+
+ class AttackVector(proto.Enum):
+ r"""Types of XSS attack vector.
+
+ Values:
+ ATTACK_VECTOR_UNSPECIFIED (0):
+ Unknown attack vector.
+ LOCAL_STORAGE (1):
+ The attack comes from fuzzing the browser's
+ localStorage.
+ SESSION_STORAGE (2):
+ The attack comes from fuzzing the browser's
+ sessionStorage.
+ WINDOW_NAME (3):
+ The attack comes from fuzzing the window's
+ name property.
+ REFERRER (4):
+ The attack comes from fuzzing the referrer
+ property.
+ FORM_INPUT (5):
+ The attack comes from fuzzing an input
+ element.
+ COOKIE (6):
+ The attack comes from fuzzing the browser's
+ cookies.
+ POST_MESSAGE (7):
+ The attack comes from hijacking the post
+ messaging mechanism.
+ GET_PARAMETERS (8):
+ The attack comes from fuzzing parameters in
+ the url.
+ URL_FRAGMENT (9):
+ The attack comes from fuzzing the fragment in
+ the url.
+ HTML_COMMENT (10):
+ The attack comes from fuzzing the HTML
+ comments.
+ POST_PARAMETERS (11):
+ The attack comes from fuzzing the POST
+ parameters.
+ PROTOCOL (12):
+ The attack comes from fuzzing the protocol.
+ STORED_XSS (13):
+ The attack comes from the server side and is
+ stored.
+ SAME_ORIGIN (14):
+ The attack is a Same-Origin Method Execution
+ attack via a GET parameter.
+ USER_CONTROLLABLE_URL (15):
+                The attack payload is received from a
+                third-party host via a URL that is
+                user-controllable.
+ """
+ ATTACK_VECTOR_UNSPECIFIED = 0
+ LOCAL_STORAGE = 1
+ SESSION_STORAGE = 2
+ WINDOW_NAME = 3
+ REFERRER = 4
+ FORM_INPUT = 5
+ COOKIE = 6
+ POST_MESSAGE = 7
+ GET_PARAMETERS = 8
+ URL_FRAGMENT = 9
+ HTML_COMMENT = 10
+ POST_PARAMETERS = 11
+ PROTOCOL = 12
+ STORED_XSS = 13
+ SAME_ORIGIN = 14
+ USER_CONTROLLABLE_URL = 15
+
+ stack_traces: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=1,
+ )
+ error_message: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ attack_vector: AttackVector = proto.Field(
+ proto.ENUM,
+ number=3,
+ enum=AttackVector,
+ )
+ stored_xss_seeding_url: str = proto.Field(
+ proto.STRING,
+ number=4,
+ )
+
+
+class Xxe(proto.Message):
+ r"""Information reported for an XXE.
+
+ Attributes:
+ payload_value (str):
+ The XML string that triggered the XXE
+ vulnerability. Non-payload values might be
+ redacted.
+ payload_location (google.cloud.websecurityscanner_v1.types.Xxe.Location):
+ Location within the request where the payload
+ was placed.
+ """
+
+ class Location(proto.Enum):
+ r"""Locations within a request where XML was substituted.
+
+ Values:
+ LOCATION_UNSPECIFIED (0):
+ Unknown Location.
+ COMPLETE_REQUEST_BODY (1):
+ The XML payload replaced the complete request
+ body.
+ """
+ LOCATION_UNSPECIFIED = 0
+ COMPLETE_REQUEST_BODY = 1
+
+ payload_value: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ payload_location: Location = proto.Field(
+ proto.ENUM,
+ number=2,
+ enum=Location,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
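The `VulnerableHeaders` message above nests a `Header` message inside two repeated fields. A dataclass sketch (the generated class uses `proto.RepeatedField`, but the shape of the data is the same) illustrates how a consumer would walk both lists:

```python
from dataclasses import dataclass, field
from typing import List

# Dataclass sketch of VulnerableHeaders/Header as defined above;
# "headers" holds headers flagged as vulnerable, "missing_headers"
# holds security headers the scanner expected but did not find.
@dataclass
class Header:
    name: str
    value: str

@dataclass
class VulnerableHeaders:
    headers: List[Header] = field(default_factory=list)
    missing_headers: List[Header] = field(default_factory=list)

vh = VulnerableHeaders(
    headers=[Header(name="X-Powered-By", value="PHP/5.6")],
    missing_headers=[Header(name="Content-Security-Policy", value="")],
)
print(len(vh.headers), len(vh.missing_headers))  # 1 1
```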
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/finding_type_stats.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/finding_type_stats.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/finding_type_stats.py
@@ -0,0 +1,53 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1",
+ manifest={
+ "FindingTypeStats",
+ },
+)
+
+
+class FindingTypeStats(proto.Message):
+ r"""A FindingTypeStats resource represents stats regarding a
+ specific FindingType of Findings under a given ScanRun.
+
+ Attributes:
+ finding_type (str):
+ Output only. The finding type associated with
+ the stats.
+ finding_count (int):
+ Output only. The count of findings belonging
+ to this finding type.
+ """
+
+ finding_type: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ finding_count: int = proto.Field(
+ proto.INT32,
+ number=2,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_config.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_config.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_config.py
@@ -0,0 +1,360 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import timestamp_pb2 # type: ignore
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1",
+ manifest={
+ "ScanConfig",
+ },
+)
+
+
+class ScanConfig(proto.Message):
+ r"""A ScanConfig resource contains the configurations to launch a
+ scan.
+
+ Attributes:
+ name (str):
+ The resource name of the ScanConfig. The name
+ follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+ The ScanConfig IDs are generated by the system.
+ display_name (str):
+ Required. The user provided display name of
+ the ScanConfig.
+ max_qps (int):
+                The maximum QPS during scanning. A valid value ranges from 5
+                to 20 inclusively. If the field is unspecified or its value
+                is set to 0, the server will default to 15. Other values
+                outside of the [5, 20] range will be rejected with an
+                INVALID_ARGUMENT error.
+ starting_urls (MutableSequence[str]):
+ Required. The starting URLs from which the
+ scanner finds site pages.
+ authentication (google.cloud.websecurityscanner_v1.types.ScanConfig.Authentication):
+ The authentication configuration. If
+ specified, service will use the authentication
+ configuration during scanning.
+ user_agent (google.cloud.websecurityscanner_v1.types.ScanConfig.UserAgent):
+ The user agent used during scanning.
+ blacklist_patterns (MutableSequence[str]):
+ The excluded URL patterns as described in
+ https://cloud.google.com/security-command-center/docs/how-to-use-web-security-scanner#excluding_urls
+ schedule (google.cloud.websecurityscanner_v1.types.ScanConfig.Schedule):
+ The schedule of the ScanConfig.
+ export_to_security_command_center (google.cloud.websecurityscanner_v1.types.ScanConfig.ExportToSecurityCommandCenter):
+ Controls export of scan configurations and
+ results to Security Command Center.
+ risk_level (google.cloud.websecurityscanner_v1.types.ScanConfig.RiskLevel):
+ The risk level selected for the scan
+ managed_scan (bool):
+ Whether the scan config is managed by Web
+ Security Scanner, output only.
+ static_ip_scan (bool):
+ Whether the scan configuration has enabled
+ static IP address scan feature. If enabled, the
+ scanner will access applications from static IP
+ addresses.
+ ignore_http_status_errors (bool):
+ Whether to keep scanning even if most
+ requests return HTTP error codes.
+ """
+
+ class UserAgent(proto.Enum):
+ r"""Type of user agents used for scanning.
+
+ Values:
+ USER_AGENT_UNSPECIFIED (0):
+ The user agent is unknown. Service will default to
+ CHROME_LINUX.
+ CHROME_LINUX (1):
+ Chrome on Linux. This is the service default
+ if unspecified.
+ CHROME_ANDROID (2):
+ Chrome on Android.
+ SAFARI_IPHONE (3):
+            Safari on iPhone.
+ """
+ USER_AGENT_UNSPECIFIED = 0
+ CHROME_LINUX = 1
+ CHROME_ANDROID = 2
+ SAFARI_IPHONE = 3
+
+ class RiskLevel(proto.Enum):
+ r"""Scan risk levels supported by Web Security Scanner. LOW
+ impact scanning will minimize requests with the potential to
+ modify data. To achieve the maximum scan coverage, NORMAL risk
+ level is recommended.
+
+ Values:
+ RISK_LEVEL_UNSPECIFIED (0):
+ Use default, which is NORMAL.
+ NORMAL (1):
+ Normal scanning (Recommended)
+ LOW (2):
+ Lower impact scanning
+ """
+ RISK_LEVEL_UNSPECIFIED = 0
+ NORMAL = 1
+ LOW = 2
+
+ class ExportToSecurityCommandCenter(proto.Enum):
+ r"""Controls export of scan configurations and results to
+ Security Command Center.
+
+ Values:
+ EXPORT_TO_SECURITY_COMMAND_CENTER_UNSPECIFIED (0):
+ Use default, which is ENABLED.
+ ENABLED (1):
+ Export results of this scan to Security
+ Command Center.
+ DISABLED (2):
+ Do not export results of this scan to
+ Security Command Center.
+ """
+ EXPORT_TO_SECURITY_COMMAND_CENTER_UNSPECIFIED = 0
+ ENABLED = 1
+ DISABLED = 2
+
+ class Authentication(proto.Message):
+ r"""Scan authentication configuration.
+
+ This message has `oneof`_ fields (mutually exclusive fields).
+ For each oneof, at most one member field can be set at the same time.
+ Setting any member of the oneof automatically clears all other
+ members.
+
+ .. _oneof: https://proto-plus-python.readthedocs.io/en/stable/fields.html#oneofs-mutually-exclusive-fields
+
+ Attributes:
+ google_account (google.cloud.websecurityscanner_v1.types.ScanConfig.Authentication.GoogleAccount):
+ Authentication using a Google account.
+
+ This field is a member of `oneof`_ ``authentication``.
+ custom_account (google.cloud.websecurityscanner_v1.types.ScanConfig.Authentication.CustomAccount):
+ Authentication using a custom account.
+
+ This field is a member of `oneof`_ ``authentication``.
+ iap_credential (google.cloud.websecurityscanner_v1.types.ScanConfig.Authentication.IapCredential):
+ Authentication using Identity-Aware-Proxy
+ (IAP).
+
+ This field is a member of `oneof`_ ``authentication``.
+ """
+
+ class GoogleAccount(proto.Message):
+ r"""Describes authentication configuration that uses a Google
+ account.
+
+ Attributes:
+ username (str):
+ Required. The user name of the Google
+ account.
+ password (str):
+ Required. Input only. The password of the
+ Google account. The credential is stored
+ encrypted and not returned in any response nor
+ included in audit logs.
+ """
+
+ username: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ password: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+ class CustomAccount(proto.Message):
+ r"""Describes authentication configuration that uses a custom
+ account.
+
+ Attributes:
+ username (str):
+ Required. The user name of the custom
+ account.
+ password (str):
+ Required. Input only. The password of the
+ custom account. The credential is stored
+ encrypted and not returned in any response nor
+ included in audit logs.
+ login_url (str):
+ Required. The login form URL of the website.
+ """
+
+ username: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ password: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ login_url: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+ class IapCredential(proto.Message):
+ r"""Describes authentication configuration for
+ Identity-Aware-Proxy (IAP).
+
+
+ .. _oneof: https://proto-plus-python.readthedocs.io/en/stable/fields.html#oneofs-mutually-exclusive-fields
+
+ Attributes:
+ iap_test_service_account_info (google.cloud.websecurityscanner_v1.types.ScanConfig.Authentication.IapCredential.IapTestServiceAccountInfo):
+ Authentication configuration when
+ Web-Security-Scanner service account is added in
+ Identity-Aware-Proxy (IAP) access policies.
+
+ This field is a member of `oneof`_ ``iap_credentials``.
+ """
+
+ class IapTestServiceAccountInfo(proto.Message):
+ r"""Describes authentication configuration when
+ Web-Security-Scanner service account is added in
+ Identity-Aware-Proxy (IAP) access policies.
+
+ Attributes:
+ target_audience_client_id (str):
+ Required. Describes OAuth2 client id of
+ resources protected by Identity-Aware-Proxy
+ (IAP).
+ """
+
+ target_audience_client_id: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+ iap_test_service_account_info: "ScanConfig.Authentication.IapCredential.IapTestServiceAccountInfo" = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ oneof="iap_credentials",
+ message="ScanConfig.Authentication.IapCredential.IapTestServiceAccountInfo",
+ )
+
+ google_account: "ScanConfig.Authentication.GoogleAccount" = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ oneof="authentication",
+ message="ScanConfig.Authentication.GoogleAccount",
+ )
+ custom_account: "ScanConfig.Authentication.CustomAccount" = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ oneof="authentication",
+ message="ScanConfig.Authentication.CustomAccount",
+ )
+ iap_credential: "ScanConfig.Authentication.IapCredential" = proto.Field(
+ proto.MESSAGE,
+ number=4,
+ oneof="authentication",
+ message="ScanConfig.Authentication.IapCredential",
+ )
+
+ class Schedule(proto.Message):
+ r"""Scan schedule configuration.
+
+ Attributes:
+ schedule_time (google.protobuf.timestamp_pb2.Timestamp):
+ A timestamp indicates when the next run will
+ be scheduled. The value is refreshed by the
+ server after each run. If unspecified, it will
+ default to current server time, which means the
+ scan will be scheduled to start immediately.
+ interval_duration_days (int):
+ Required. The duration of time between
+ executions in days.
+ """
+
+ schedule_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message=timestamp_pb2.Timestamp,
+ )
+ interval_duration_days: int = proto.Field(
+ proto.INT32,
+ number=2,
+ )
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ display_name: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ max_qps: int = proto.Field(
+ proto.INT32,
+ number=3,
+ )
+ starting_urls: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=4,
+ )
+ authentication: Authentication = proto.Field(
+ proto.MESSAGE,
+ number=5,
+ message=Authentication,
+ )
+ user_agent: UserAgent = proto.Field(
+ proto.ENUM,
+ number=6,
+ enum=UserAgent,
+ )
+ blacklist_patterns: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=7,
+ )
+ schedule: Schedule = proto.Field(
+ proto.MESSAGE,
+ number=8,
+ message=Schedule,
+ )
+ export_to_security_command_center: ExportToSecurityCommandCenter = proto.Field(
+ proto.ENUM,
+ number=10,
+ enum=ExportToSecurityCommandCenter,
+ )
+ risk_level: RiskLevel = proto.Field(
+ proto.ENUM,
+ number=12,
+ enum=RiskLevel,
+ )
+ managed_scan: bool = proto.Field(
+ proto.BOOL,
+ number=13,
+ )
+ static_ip_scan: bool = proto.Field(
+ proto.BOOL,
+ number=14,
+ )
+ ignore_http_status_errors: bool = proto.Field(
+ proto.BOOL,
+ number=15,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
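The `authentication` oneof in `ScanConfig.Authentication` above means at most one of `google_account`, `custom_account`, or `iap_credential` can be set at a time: setting any member clears the others. The real class delegates this to the protobuf runtime; the minimal sketch below mimics the behaviour with plain Python so the semantics are easy to see:

```python
# Minimal sketch of proto-plus oneof semantics for the ``authentication``
# oneof defined above: assigning any oneof member first clears every
# member, then sets the new one.
class Authentication:
    _ONEOF = ("google_account", "custom_account", "iap_credential")

    def __init__(self):
        for name in self._ONEOF:
            super().__setattr__(name, None)

    def __setattr__(self, name, value):
        if name in self._ONEOF:
            for other in self._ONEOF:
                super().__setattr__(other, None)
        super().__setattr__(name, value)

auth = Authentication()
auth.google_account = {"username": "scanner@example.com"}
auth.custom_account = {"username": "test", "login_url": "https://example.com/login"}
print(auth.google_account)  # None -- cleared when custom_account was set
```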
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_config_error.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_config_error.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_config_error.py
@@ -0,0 +1,241 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1",
+ manifest={
+ "ScanConfigError",
+ },
+)
+
+
+class ScanConfigError(proto.Message):
+ r"""Defines a custom error message used by CreateScanConfig and
+ UpdateScanConfig APIs when scan configuration validation fails.
+ It is also reported as part of a ScanRunErrorTrace message if
+ scan validation fails due to a scan configuration error.
+
+ Attributes:
+ code (google.cloud.websecurityscanner_v1.types.ScanConfigError.Code):
+ Output only. Indicates the reason code for a
+ configuration failure.
+ field_name (str):
+ Output only. Indicates the full name of the ScanConfig field
+ that triggers this error, for example "scan_config.max_qps".
+ This field is provided for troubleshooting purposes only and
+ its actual value can change in the future.
+ """
+
+ class Code(proto.Enum):
+ r"""Output only.
+ Defines an error reason code.
+ Next id: 44
+
+ Values:
+ CODE_UNSPECIFIED (0):
+ There is no error.
+ OK (0):
+ There is no error.
+ INTERNAL_ERROR (1):
+ Indicates an internal server error.
+ Please DO NOT USE THIS ERROR CODE unless the
+ root cause is truly unknown.
+ APPENGINE_API_BACKEND_ERROR (2):
+ One of the seed URLs is an App Engine URL but
+ we cannot validate the scan settings due to an
+ App Engine API backend error.
+ APPENGINE_API_NOT_ACCESSIBLE (3):
+ One of the seed URLs is an App Engine URL but
+ we cannot access the App Engine API to validate
+ scan settings.
+ APPENGINE_DEFAULT_HOST_MISSING (4):
+ One of the seed URLs is an App Engine URL but
+ the Default Host of the App Engine is not set.
+ CANNOT_USE_GOOGLE_COM_ACCOUNT (6):
+ Google corporate accounts can not be used for
+ scanning.
+ CANNOT_USE_OWNER_ACCOUNT (7):
+ The account of the scan creator can not be
+ used for scanning.
+ COMPUTE_API_BACKEND_ERROR (8):
+ This scan targets Compute Engine, but we
+ cannot validate scan settings due to a Compute
+ Engine API backend error.
+ COMPUTE_API_NOT_ACCESSIBLE (9):
+ This scan targets Compute Engine, but we
+ cannot access the Compute Engine API to validate
+ the scan settings.
+ CUSTOM_LOGIN_URL_DOES_NOT_BELONG_TO_CURRENT_PROJECT (10):
+ The Custom Login URL does not belong to the
+ current project.
+ CUSTOM_LOGIN_URL_MALFORMED (11):
+ The Custom Login URL is malformed (can not be
+ parsed).
+ CUSTOM_LOGIN_URL_MAPPED_TO_NON_ROUTABLE_ADDRESS (12):
+ The Custom Login URL is mapped to a
+ non-routable IP address in DNS.
+ CUSTOM_LOGIN_URL_MAPPED_TO_UNRESERVED_ADDRESS (13):
+ The Custom Login URL is mapped to an IP
+ address which is not reserved for the current
+ project.
+ CUSTOM_LOGIN_URL_HAS_NON_ROUTABLE_IP_ADDRESS (14):
+ The Custom Login URL has a non-routable IP
+ address.
+ CUSTOM_LOGIN_URL_HAS_UNRESERVED_IP_ADDRESS (15):
+ The Custom Login URL has an IP address which
+ is not reserved for the current project.
+ DUPLICATE_SCAN_NAME (16):
+ Another scan with the same name
+ (case-sensitive) already exists.
+ INVALID_FIELD_VALUE (18):
+ A field is set to an invalid value.
+ FAILED_TO_AUTHENTICATE_TO_TARGET (19):
+ There was an error trying to authenticate to
+ the scan target.
+ FINDING_TYPE_UNSPECIFIED (20):
+ Finding type value is not specified in the
+ list findings request.
+ FORBIDDEN_TO_SCAN_COMPUTE (21):
+ Scan targets Compute Engine, yet current
+ project was not whitelisted for Google Compute
+ Engine Scanning Alpha access.
+            FORBIDDEN_UPDATE_TO_MANAGED_SCAN (43):
+                User tries to update a managed scan.
+ MALFORMED_FILTER (22):
+ The supplied filter is malformed. For
+ example, it can not be parsed, does not have a
+ filter type in expression, or the same filter
+ type appears more than once.
+ MALFORMED_RESOURCE_NAME (23):
+ The supplied resource name is malformed (can
+ not be parsed).
+ PROJECT_INACTIVE (24):
+ The current project is not in an active
+ state.
+ REQUIRED_FIELD (25):
+ A required field is not set.
+ RESOURCE_NAME_INCONSISTENT (26):
+ Project id, scanconfig id, scanrun id, or
+ finding id are not consistent with each other in
+ resource name.
+ SCAN_ALREADY_RUNNING (27):
+ The scan being requested to start is already
+ running.
+ SCAN_NOT_RUNNING (28):
+ The scan that was requested to be stopped is
+ not running.
+ SEED_URL_DOES_NOT_BELONG_TO_CURRENT_PROJECT (29):
+ One of the seed URLs does not belong to the
+ current project.
+ SEED_URL_MALFORMED (30):
+ One of the seed URLs is malformed (can not be
+ parsed).
+ SEED_URL_MAPPED_TO_NON_ROUTABLE_ADDRESS (31):
+ One of the seed URLs is mapped to a
+ non-routable IP address in DNS.
+ SEED_URL_MAPPED_TO_UNRESERVED_ADDRESS (32):
+ One of the seed URLs is mapped to an IP
+ address which is not reserved for the current
+ project.
+ SEED_URL_HAS_NON_ROUTABLE_IP_ADDRESS (33):
+                One of the seed URLs has a non-routable
+                IP address.
+ SEED_URL_HAS_UNRESERVED_IP_ADDRESS (35):
+ One of the seed URLs has an IP address that
+ is not reserved for the current project.
+ SERVICE_ACCOUNT_NOT_CONFIGURED (36):
+ The Web Security Scanner service account is
+ not configured under the project.
+ TOO_MANY_SCANS (37):
+ A project has reached the maximum number of
+ scans.
+ UNABLE_TO_RESOLVE_PROJECT_INFO (38):
+ Resolving the details of the current project
+ fails.
+ UNSUPPORTED_BLACKLIST_PATTERN_FORMAT (39):
+ One or more blacklist patterns were in the
+ wrong format.
+ UNSUPPORTED_FILTER (40):
+ The supplied filter is not supported.
+ UNSUPPORTED_FINDING_TYPE (41):
+ The supplied finding type is not supported.
+ For example, we do not provide findings of the
+ given finding type.
+ UNSUPPORTED_URL_SCHEME (42):
+ The URL scheme of one or more of the supplied
+ URLs is not supported.
+ """
+ _pb_options = {"allow_alias": True}
+ CODE_UNSPECIFIED = 0
+ OK = 0
+ INTERNAL_ERROR = 1
+ APPENGINE_API_BACKEND_ERROR = 2
+ APPENGINE_API_NOT_ACCESSIBLE = 3
+ APPENGINE_DEFAULT_HOST_MISSING = 4
+ CANNOT_USE_GOOGLE_COM_ACCOUNT = 6
+ CANNOT_USE_OWNER_ACCOUNT = 7
+ COMPUTE_API_BACKEND_ERROR = 8
+ COMPUTE_API_NOT_ACCESSIBLE = 9
+ CUSTOM_LOGIN_URL_DOES_NOT_BELONG_TO_CURRENT_PROJECT = 10
+ CUSTOM_LOGIN_URL_MALFORMED = 11
+ CUSTOM_LOGIN_URL_MAPPED_TO_NON_ROUTABLE_ADDRESS = 12
+ CUSTOM_LOGIN_URL_MAPPED_TO_UNRESERVED_ADDRESS = 13
+ CUSTOM_LOGIN_URL_HAS_NON_ROUTABLE_IP_ADDRESS = 14
+ CUSTOM_LOGIN_URL_HAS_UNRESERVED_IP_ADDRESS = 15
+ DUPLICATE_SCAN_NAME = 16
+ INVALID_FIELD_VALUE = 18
+ FAILED_TO_AUTHENTICATE_TO_TARGET = 19
+ FINDING_TYPE_UNSPECIFIED = 20
+ FORBIDDEN_TO_SCAN_COMPUTE = 21
+ FORBIDDEN_UPDATE_TO_MANAGED_SCAN = 43
+ MALFORMED_FILTER = 22
+ MALFORMED_RESOURCE_NAME = 23
+ PROJECT_INACTIVE = 24
+ REQUIRED_FIELD = 25
+ RESOURCE_NAME_INCONSISTENT = 26
+ SCAN_ALREADY_RUNNING = 27
+ SCAN_NOT_RUNNING = 28
+ SEED_URL_DOES_NOT_BELONG_TO_CURRENT_PROJECT = 29
+ SEED_URL_MALFORMED = 30
+ SEED_URL_MAPPED_TO_NON_ROUTABLE_ADDRESS = 31
+ SEED_URL_MAPPED_TO_UNRESERVED_ADDRESS = 32
+ SEED_URL_HAS_NON_ROUTABLE_IP_ADDRESS = 33
+ SEED_URL_HAS_UNRESERVED_IP_ADDRESS = 35
+ SERVICE_ACCOUNT_NOT_CONFIGURED = 36
+ TOO_MANY_SCANS = 37
+ UNABLE_TO_RESOLVE_PROJECT_INFO = 38
+ UNSUPPORTED_BLACKLIST_PATTERN_FORMAT = 39
+ UNSUPPORTED_FILTER = 40
+ UNSUPPORTED_FINDING_TYPE = 41
+ UNSUPPORTED_URL_SCHEME = 42
+
+ code: Code = proto.Field(
+ proto.ENUM,
+ number=1,
+ enum=Code,
+ )
+ field_name: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
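The `_pb_options = {"allow_alias": True}` line above exists because `CODE_UNSPECIFIED` and `OK` share the value 0. Python's `IntEnum` models the same aliasing, where the second name becomes an alias of the first and value lookups resolve to the canonical member:

```python
from enum import IntEnum

# Sketch of the aliasing in ScanConfigError.Code above: OK and
# CODE_UNSPECIFIED are the same value, so OK is an alias, and looking
# up Code(0) yields the first-declared (canonical) name.
class Code(IntEnum):
    CODE_UNSPECIFIED = 0
    OK = 0  # alias of CODE_UNSPECIFIED
    INTERNAL_ERROR = 1

print(Code.OK is Code.CODE_UNSPECIFIED)  # True
print(Code(0).name)                      # CODE_UNSPECIFIED
```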
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_run.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_run.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_run.py
@@ -0,0 +1,185 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import timestamp_pb2 # type: ignore
+import proto # type: ignore
+
+from google.cloud.websecurityscanner_v1.types import (
+ scan_run_error_trace,
+ scan_run_warning_trace,
+)
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1",
+ manifest={
+ "ScanRun",
+ },
+)
+
+
+class ScanRun(proto.Message):
+    r"""A ScanRun is an output-only resource representing an actual
+    run of the scan. Next id: 12
+
+ Attributes:
+ name (str):
+ Output only. The resource name of the
+ ScanRun. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ The ScanRun IDs are generated by the system.
+ execution_state (google.cloud.websecurityscanner_v1.types.ScanRun.ExecutionState):
+ Output only. The execution state of the
+ ScanRun.
+ result_state (google.cloud.websecurityscanner_v1.types.ScanRun.ResultState):
+ Output only. The result state of the ScanRun.
+ This field is only available after the execution
+ state reaches "FINISHED".
+ start_time (google.protobuf.timestamp_pb2.Timestamp):
+ Output only. The time at which the ScanRun
+ started.
+ end_time (google.protobuf.timestamp_pb2.Timestamp):
+            Output only. The time at which the ScanRun
+            reached termination state - that is, the ScanRun
+            is either finished or stopped by the user.
+ urls_crawled_count (int):
+ Output only. The number of URLs crawled
+ during this ScanRun. If the scan is in progress,
+ the value represents the number of URLs crawled
+ up to now.
+ urls_tested_count (int):
+ Output only. The number of URLs tested during
+ this ScanRun. If the scan is in progress, the
+ value represents the number of URLs tested up to
+            now. The number of URLs tested is usually larger
+            than the number of URLs crawled because typically a
+ crawled URL is tested with multiple test
+ payloads.
+ has_vulnerabilities (bool):
+ Output only. Whether the scan run has found
+ any vulnerabilities.
+ progress_percent (int):
+ Output only. The percentage of total
+ completion ranging from 0 to 100. If the scan is
+ in queue, the value is 0. If the scan is
+ running, the value ranges from 0 to 100. If the
+ scan is finished, the value is 100.
+ error_trace (google.cloud.websecurityscanner_v1.types.ScanRunErrorTrace):
+ Output only. If result_state is an ERROR, this field
+ provides the primary reason for scan's termination and more
+ details, if such are available.
+ warning_traces (MutableSequence[google.cloud.websecurityscanner_v1.types.ScanRunWarningTrace]):
+ Output only. A list of warnings, if such are
+ encountered during this scan run.
+ """
+
+ class ExecutionState(proto.Enum):
+ r"""Types of ScanRun execution state.
+
+ Values:
+ EXECUTION_STATE_UNSPECIFIED (0):
+ Represents an invalid state caused by
+ internal server error. This value should never
+ be returned.
+ QUEUED (1):
+ The scan is waiting in the queue.
+ SCANNING (2):
+ The scan is in progress.
+ FINISHED (3):
+ The scan is either finished or stopped by
+ user.
+ """
+ EXECUTION_STATE_UNSPECIFIED = 0
+ QUEUED = 1
+ SCANNING = 2
+ FINISHED = 3
+
+ class ResultState(proto.Enum):
+ r"""Types of ScanRun result state.
+
+ Values:
+ RESULT_STATE_UNSPECIFIED (0):
+ Default value. This value is returned when
+ the ScanRun is not yet finished.
+ SUCCESS (1):
+ The scan finished without errors.
+ ERROR (2):
+ The scan finished with errors.
+ KILLED (3):
+                The scan was terminated by the user.
+ """
+ RESULT_STATE_UNSPECIFIED = 0
+ SUCCESS = 1
+ ERROR = 2
+ KILLED = 3
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ execution_state: ExecutionState = proto.Field(
+ proto.ENUM,
+ number=2,
+ enum=ExecutionState,
+ )
+ result_state: ResultState = proto.Field(
+ proto.ENUM,
+ number=3,
+ enum=ResultState,
+ )
+ start_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=4,
+ message=timestamp_pb2.Timestamp,
+ )
+ end_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=5,
+ message=timestamp_pb2.Timestamp,
+ )
+ urls_crawled_count: int = proto.Field(
+ proto.INT64,
+ number=6,
+ )
+ urls_tested_count: int = proto.Field(
+ proto.INT64,
+ number=7,
+ )
+ has_vulnerabilities: bool = proto.Field(
+ proto.BOOL,
+ number=8,
+ )
+ progress_percent: int = proto.Field(
+ proto.INT32,
+ number=9,
+ )
+ error_trace: scan_run_error_trace.ScanRunErrorTrace = proto.Field(
+ proto.MESSAGE,
+ number=10,
+ message=scan_run_error_trace.ScanRunErrorTrace,
+ )
+ warning_traces: MutableSequence[
+ scan_run_warning_trace.ScanRunWarningTrace
+ ] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=11,
+ message=scan_run_warning_trace.ScanRunWarningTrace,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
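Note for reviewers: the `ScanRun` docstring states that `result_state` is only available once `execution_state` reaches "FINISHED". A stdlib-only sketch of that invariant, with a hypothetical helper that is not part of the generated client:

```python
from enum import IntEnum
from typing import Optional


class ExecutionState(IntEnum):
    """Mirror of ScanRun.ExecutionState values."""

    EXECUTION_STATE_UNSPECIFIED = 0
    QUEUED = 1
    SCANNING = 2
    FINISHED = 3


class ResultState(IntEnum):
    """Mirror of ScanRun.ResultState values."""

    RESULT_STATE_UNSPECIFIED = 0
    SUCCESS = 1
    ERROR = 2
    KILLED = 3


def effective_result(
    execution_state: ExecutionState, result_state: ResultState
) -> Optional[ResultState]:
    """Return the result state only for finished runs, else None."""
    if execution_state is ExecutionState.FINISHED:
        return result_state
    return None
```

Callers can then treat `None` as "run still in progress" instead of misreading the default `RESULT_STATE_UNSPECIFIED`.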
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_run_error_trace.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_run_error_trace.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_run_error_trace.py
@@ -0,0 +1,109 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+from google.cloud.websecurityscanner_v1.types import (
+ scan_config_error as gcw_scan_config_error,
+)
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1",
+ manifest={
+ "ScanRunErrorTrace",
+ },
+)
+
+
+class ScanRunErrorTrace(proto.Message):
+ r"""Output only.
+ Defines an error trace message for a ScanRun.
+
+ Attributes:
+ code (google.cloud.websecurityscanner_v1.types.ScanRunErrorTrace.Code):
+ Output only. Indicates the error reason code.
+ scan_config_error (google.cloud.websecurityscanner_v1.types.ScanConfigError):
+ Output only. If the scan encounters SCAN_CONFIG_ISSUE error,
+ this field has the error message encountered during scan
+ configuration validation that is performed before each scan
+ run.
+ most_common_http_error_code (int):
+ Output only. If the scan encounters TOO_MANY_HTTP_ERRORS,
+ this field indicates the most common HTTP error code, if
+ such is available. For example, if this code is 404, the
+ scan has encountered too many NOT_FOUND responses.
+ """
+
+ class Code(proto.Enum):
+ r"""Output only.
+ Defines an error reason code.
+ Next id: 8
+
+ Values:
+ CODE_UNSPECIFIED (0):
+ Default value is never used.
+ INTERNAL_ERROR (1):
+ Indicates that the scan run failed due to an
+ internal server error.
+ SCAN_CONFIG_ISSUE (2):
+ Indicates a scan configuration error, usually due to
+ outdated ScanConfig settings, such as starting_urls or the
+ DNS configuration.
+ AUTHENTICATION_CONFIG_ISSUE (3):
+ Indicates an authentication error, usually
+ due to outdated ScanConfig authentication
+ settings.
+ TIMED_OUT_WHILE_SCANNING (4):
+ Indicates a scan operation timeout, usually
+ caused by a very large site.
+ TOO_MANY_REDIRECTS (5):
+ Indicates that a scan encountered excessive
+ redirects, either to authentication or some
+ other page outside of the scan scope.
+ TOO_MANY_HTTP_ERRORS (6):
+ Indicates that a scan encountered numerous errors from the
+ web site pages. When available, most_common_http_error_code
+ field indicates the most common HTTP error code encountered
+ during the scan.
+ """
+ CODE_UNSPECIFIED = 0
+ INTERNAL_ERROR = 1
+ SCAN_CONFIG_ISSUE = 2
+ AUTHENTICATION_CONFIG_ISSUE = 3
+ TIMED_OUT_WHILE_SCANNING = 4
+ TOO_MANY_REDIRECTS = 5
+ TOO_MANY_HTTP_ERRORS = 6
+
+ code: Code = proto.Field(
+ proto.ENUM,
+ number=1,
+ enum=Code,
+ )
+ scan_config_error: gcw_scan_config_error.ScanConfigError = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=gcw_scan_config_error.ScanConfigError,
+ )
+ most_common_http_error_code: int = proto.Field(
+ proto.INT32,
+ number=3,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_run_log.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_run_log.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_run_log.py
@@ -0,0 +1,96 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+from google.cloud.websecurityscanner_v1.types import scan_run, scan_run_error_trace
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1",
+ manifest={
+ "ScanRunLog",
+ },
+)
+
+
+class ScanRunLog(proto.Message):
+ r"""A ScanRunLog is an output-only proto used for Stackdriver
+ customer logging. It is used for logs covering the start and end
+ of scan pipelines. Other than an added summary, this is a subset
+ of the ScanRun. Representation in logs is either a proto Struct,
+ or converted to JSON. Next id: 9
+
+ Attributes:
+ summary (str):
+ Human friendly message about the event.
+ name (str):
+ The resource name of the ScanRun being
+ logged.
+ execution_state (google.cloud.websecurityscanner_v1.types.ScanRun.ExecutionState):
+ The execution state of the ScanRun.
+ result_state (google.cloud.websecurityscanner_v1.types.ScanRun.ResultState):
+ The result state of the ScanRun.
+ urls_crawled_count (int):
+
+ urls_tested_count (int):
+
+ has_findings (bool):
+
+ error_trace (google.cloud.websecurityscanner_v1.types.ScanRunErrorTrace):
+
+ """
+
+ summary: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ name: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ execution_state: scan_run.ScanRun.ExecutionState = proto.Field(
+ proto.ENUM,
+ number=3,
+ enum=scan_run.ScanRun.ExecutionState,
+ )
+ result_state: scan_run.ScanRun.ResultState = proto.Field(
+ proto.ENUM,
+ number=4,
+ enum=scan_run.ScanRun.ResultState,
+ )
+ urls_crawled_count: int = proto.Field(
+ proto.INT64,
+ number=5,
+ )
+ urls_tested_count: int = proto.Field(
+ proto.INT64,
+ number=6,
+ )
+ has_findings: bool = proto.Field(
+ proto.BOOL,
+ number=7,
+ )
+ error_trace: scan_run_error_trace.ScanRunErrorTrace = proto.Field(
+ proto.MESSAGE,
+ number=8,
+ message=scan_run_error_trace.ScanRunErrorTrace,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
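Note for reviewers: per the `ScanRunLog` docstring, log entries are rendered either as a proto Struct or converted to JSON. A hedged stdlib sketch of that JSON shape (field names follow the message above; the camelCase keys and the helper itself are assumptions, not the library's serializer):

```python
import json


def scan_run_log_to_json(
    summary: str,
    name: str,
    execution_state: int,
    result_state: int,
    urls_crawled_count: int,
    urls_tested_count: int,
    has_findings: bool,
) -> str:
    """Render a ScanRunLog-like record as a JSON string."""
    record = {
        "summary": summary,
        "name": name,
        "executionState": execution_state,
        "resultState": result_state,
        "urlsCrawledCount": urls_crawled_count,
        "urlsTestedCount": urls_tested_count,
        "hasFindings": has_findings,
    }
    return json.dumps(record, sort_keys=True)
```

Sorting keys keeps log lines byte-stable across runs, which simplifies log-based alerting.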
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_run_warning_trace.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_run_warning_trace.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/scan_run_warning_trace.py
@@ -0,0 +1,82 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1",
+ manifest={
+ "ScanRunWarningTrace",
+ },
+)
+
+
+class ScanRunWarningTrace(proto.Message):
+ r"""Output only.
+ Defines a warning trace message for ScanRun. Warning traces
+ provide customers with useful information that helps make the
+ scanning process more effective.
+
+ Attributes:
+ code (google.cloud.websecurityscanner_v1.types.ScanRunWarningTrace.Code):
+ Output only. Indicates the warning code.
+ """
+
+ class Code(proto.Enum):
+ r"""Output only.
+ Defines a warning message code.
+ Next id: 6
+
+ Values:
+ CODE_UNSPECIFIED (0):
+ Default value is never used.
+ INSUFFICIENT_CRAWL_RESULTS (1):
+ Indicates that a scan discovered an
+ unexpectedly low number of URLs. This is
+ sometimes caused by complex navigation features
+ or by using a single URL for numerous pages.
+ TOO_MANY_CRAWL_RESULTS (2):
+ Indicates that a scan discovered too many
+ URLs to test, or excessive redundant URLs.
+ TOO_MANY_FUZZ_TASKS (3):
+ Indicates that too many tests have been
+ generated for the scan. Customer should try
+ reducing the number of starting URLs, increasing
+ the QPS rate, or narrowing down the scope of the
+ scan using the excluded patterns.
+ BLOCKED_BY_IAP (4):
+ Indicates that a scan is blocked by IAP.
+ NO_STARTING_URL_FOUND_FOR_MANAGED_SCAN (5):
+                Indicates that no seeds are found for a scan.
+ """
+ CODE_UNSPECIFIED = 0
+ INSUFFICIENT_CRAWL_RESULTS = 1
+ TOO_MANY_CRAWL_RESULTS = 2
+ TOO_MANY_FUZZ_TASKS = 3
+ BLOCKED_BY_IAP = 4
+ NO_STARTING_URL_FOUND_FOR_MANAGED_SCAN = 5
+
+ code: Code = proto.Field(
+ proto.ENUM,
+ number=1,
+ enum=Code,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/web_security_scanner.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/web_security_scanner.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1/types/web_security_scanner.py
@@ -0,0 +1,486 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import field_mask_pb2 # type: ignore
+import proto # type: ignore
+
+from google.cloud.websecurityscanner_v1.types import (
+ finding_type_stats as gcw_finding_type_stats,
+)
+from google.cloud.websecurityscanner_v1.types import scan_config as gcw_scan_config
+from google.cloud.websecurityscanner_v1.types import crawled_url, finding
+from google.cloud.websecurityscanner_v1.types import scan_run
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1",
+ manifest={
+ "CreateScanConfigRequest",
+ "DeleteScanConfigRequest",
+ "GetScanConfigRequest",
+ "ListScanConfigsRequest",
+ "UpdateScanConfigRequest",
+ "ListScanConfigsResponse",
+ "StartScanRunRequest",
+ "GetScanRunRequest",
+ "ListScanRunsRequest",
+ "ListScanRunsResponse",
+ "StopScanRunRequest",
+ "ListCrawledUrlsRequest",
+ "ListCrawledUrlsResponse",
+ "GetFindingRequest",
+ "ListFindingsRequest",
+ "ListFindingsResponse",
+ "ListFindingTypeStatsRequest",
+ "ListFindingTypeStatsResponse",
+ },
+)
+
+
+class CreateScanConfigRequest(proto.Message):
+ r"""Request for the ``CreateScanConfig`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name where the
+ scan is created, which should be a project
+ resource name in the format
+ 'projects/{projectId}'.
+ scan_config (google.cloud.websecurityscanner_v1.types.ScanConfig):
+ Required. The ScanConfig to be created.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ scan_config: gcw_scan_config.ScanConfig = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=gcw_scan_config.ScanConfig,
+ )
+
+
+class DeleteScanConfigRequest(proto.Message):
+ r"""Request for the ``DeleteScanConfig`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the ScanConfig
+ to be deleted. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class GetScanConfigRequest(proto.Message):
+ r"""Request for the ``GetScanConfig`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the ScanConfig
+ to be returned. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListScanConfigsRequest(proto.Message):
+ r"""Request for the ``ListScanConfigs`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name, which
+ should be a project resource name in the format
+ 'projects/{projectId}'.
+ page_token (str):
+ A token identifying a page of results to be returned. This
+ should be a ``next_page_token`` value returned from a
+ previous List request. If unspecified, the first page of
+ results is returned.
+ page_size (int):
+ The maximum number of ScanConfigs to return,
+ can be limited by server. If not specified or
+ not positive, the implementation will select a
+ reasonable value.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=3,
+ )
+
+
+class UpdateScanConfigRequest(proto.Message):
+ r"""Request for the ``UpdateScanConfigRequest`` method.
+
+ Attributes:
+ scan_config (google.cloud.websecurityscanner_v1.types.ScanConfig):
+ Required. The ScanConfig to be updated. The
+ name field must be set to identify the resource
+ to be updated. The values of fields not covered
+ by the mask will be ignored.
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Required. The update mask applies to the resource. For the
+ ``FieldMask`` definition, see
+ https://developers.google.com/protocol-buffers/docs/reference/google.protobuf#fieldmask
+ """
+
+ scan_config: gcw_scan_config.ScanConfig = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=gcw_scan_config.ScanConfig,
+ )
+ update_mask: field_mask_pb2.FieldMask = proto.Field(
+ proto.MESSAGE,
+ number=3,
+ message=field_mask_pb2.FieldMask,
+ )
+
+
+class ListScanConfigsResponse(proto.Message):
+ r"""Response for the ``ListScanConfigs`` method.
+
+ Attributes:
+ scan_configs (MutableSequence[google.cloud.websecurityscanner_v1.types.ScanConfig]):
+ The list of ScanConfigs returned.
+ next_page_token (str):
+ Token to retrieve the next page of results,
+ or empty if there are no more results in the
+ list.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ scan_configs: MutableSequence[gcw_scan_config.ScanConfig] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=gcw_scan_config.ScanConfig,
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class StartScanRunRequest(proto.Message):
+ r"""Request for the ``StartScanRun`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the ScanConfig
+ to be used. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class GetScanRunRequest(proto.Message):
+ r"""Request for the ``GetScanRun`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the ScanRun to
+ be returned. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListScanRunsRequest(proto.Message):
+ r"""Request for the ``ListScanRuns`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name, which
+ should be a scan resource name in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+ page_token (str):
+ A token identifying a page of results to be returned. This
+ should be a ``next_page_token`` value returned from a
+ previous List request. If unspecified, the first page of
+ results is returned.
+ page_size (int):
+ The maximum number of ScanRuns to return, can
+ be limited by server. If not specified or not
+ positive, the implementation will select a
+ reasonable value.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=3,
+ )
+
+
+class ListScanRunsResponse(proto.Message):
+ r"""Response for the ``ListScanRuns`` method.
+
+ Attributes:
+ scan_runs (MutableSequence[google.cloud.websecurityscanner_v1.types.ScanRun]):
+ The list of ScanRuns returned.
+ next_page_token (str):
+ Token to retrieve the next page of results,
+ or empty if there are no more results in the
+ list.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ scan_runs: MutableSequence[scan_run.ScanRun] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=scan_run.ScanRun,
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class StopScanRunRequest(proto.Message):
+ r"""Request for the ``StopScanRun`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the ScanRun to
+ be stopped. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListCrawledUrlsRequest(proto.Message):
+ r"""Request for the ``ListCrawledUrls`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name, which
+ should be a scan run resource name in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ page_token (str):
+ A token identifying a page of results to be returned. This
+ should be a ``next_page_token`` value returned from a
+ previous List request. If unspecified, the first page of
+ results is returned.
+ page_size (int):
+ The maximum number of CrawledUrls to return,
+ can be limited by server. If not specified or
+ not positive, the implementation will select a
+ reasonable value.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=3,
+ )
+
+
+class ListCrawledUrlsResponse(proto.Message):
+ r"""Response for the ``ListCrawledUrls`` method.
+
+ Attributes:
+ crawled_urls (MutableSequence[google.cloud.websecurityscanner_v1.types.CrawledUrl]):
+ The list of CrawledUrls returned.
+ next_page_token (str):
+ Token to retrieve the next page of results,
+ or empty if there are no more results in the
+ list.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ crawled_urls: MutableSequence[crawled_url.CrawledUrl] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=crawled_url.CrawledUrl,
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class GetFindingRequest(proto.Message):
+ r"""Request for the ``GetFinding`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the Finding to
+ be returned. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}/findings/{findingId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListFindingsRequest(proto.Message):
+ r"""Request for the ``ListFindings`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name, which
+ should be a scan run resource name in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ filter (str):
+            The filter expression. The expression must be in the
+            format: ``<field> <operator> <value>``. Supported field: 'finding_type'. Supported operator: '='.
+ page_token (str):
+ A token identifying a page of results to be returned. This
+ should be a ``next_page_token`` value returned from a
+ previous List request. If unspecified, the first page of
+ results is returned.
+ page_size (int):
+ The maximum number of Findings to return, can
+ be limited by server. If not specified or not
+ positive, the implementation will select a
+ reasonable value.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ filter: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=4,
+ )
+
+
+class ListFindingsResponse(proto.Message):
+ r"""Response for the ``ListFindings`` method.
+
+ Attributes:
+ findings (MutableSequence[google.cloud.websecurityscanner_v1.types.Finding]):
+ The list of Findings returned.
+ next_page_token (str):
+ Token to retrieve the next page of results,
+ or empty if there are no more results in the
+ list.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ findings: MutableSequence[finding.Finding] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=finding.Finding,
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class ListFindingTypeStatsRequest(proto.Message):
+ r"""Request for the ``ListFindingTypeStats`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name, which
+ should be a scan run resource name in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListFindingTypeStatsResponse(proto.Message):
+ r"""Response for the ``ListFindingTypeStats`` method.
+
+ Attributes:
+ finding_type_stats (MutableSequence[google.cloud.websecurityscanner_v1.types.FindingTypeStats]):
+ The list of FindingTypeStats returned.
+ """
+
+ finding_type_stats: MutableSequence[
+ gcw_finding_type_stats.FindingTypeStats
+ ] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=gcw_finding_type_stats.FindingTypeStats,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/__init__.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/__init__.py
@@ -0,0 +1,89 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from google.cloud.websecurityscanner_v1alpha import gapic_version as package_version
+
+__version__ = package_version.__version__
+
+
+from .services.web_security_scanner import (
+ WebSecurityScannerAsyncClient,
+ WebSecurityScannerClient,
+)
+from .types.crawled_url import CrawledUrl
+from .types.finding import Finding
+from .types.finding_addon import (
+ OutdatedLibrary,
+ ViolatingResource,
+ VulnerableHeaders,
+ VulnerableParameters,
+ Xss,
+)
+from .types.finding_type_stats import FindingTypeStats
+from .types.scan_config import ScanConfig
+from .types.scan_run import ScanRun
+from .types.web_security_scanner import (
+ CreateScanConfigRequest,
+ DeleteScanConfigRequest,
+ GetFindingRequest,
+ GetScanConfigRequest,
+ GetScanRunRequest,
+ ListCrawledUrlsRequest,
+ ListCrawledUrlsResponse,
+ ListFindingsRequest,
+ ListFindingsResponse,
+ ListFindingTypeStatsRequest,
+ ListFindingTypeStatsResponse,
+ ListScanConfigsRequest,
+ ListScanConfigsResponse,
+ ListScanRunsRequest,
+ ListScanRunsResponse,
+ StartScanRunRequest,
+ StopScanRunRequest,
+ UpdateScanConfigRequest,
+)
+
+__all__ = (
+ "WebSecurityScannerAsyncClient",
+ "CrawledUrl",
+ "CreateScanConfigRequest",
+ "DeleteScanConfigRequest",
+ "Finding",
+ "FindingTypeStats",
+ "GetFindingRequest",
+ "GetScanConfigRequest",
+ "GetScanRunRequest",
+ "ListCrawledUrlsRequest",
+ "ListCrawledUrlsResponse",
+ "ListFindingTypeStatsRequest",
+ "ListFindingTypeStatsResponse",
+ "ListFindingsRequest",
+ "ListFindingsResponse",
+ "ListScanConfigsRequest",
+ "ListScanConfigsResponse",
+ "ListScanRunsRequest",
+ "ListScanRunsResponse",
+ "OutdatedLibrary",
+ "ScanConfig",
+ "ScanRun",
+ "StartScanRunRequest",
+ "StopScanRunRequest",
+ "UpdateScanConfigRequest",
+ "ViolatingResource",
+ "VulnerableHeaders",
+ "VulnerableParameters",
+ "WebSecurityScannerClient",
+ "Xss",
+)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/gapic_version.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/gapic_version.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/gapic_version.py
@@ -0,0 +1,16 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+__version__ = "1.12.1" # {x-release-please-version}
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/__init__.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/__init__.py
@@ -0,0 +1,15 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/__init__.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/__init__.py
@@ -0,0 +1,22 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .async_client import WebSecurityScannerAsyncClient
+from .client import WebSecurityScannerClient
+
+__all__ = (
+ "WebSecurityScannerClient",
+ "WebSecurityScannerAsyncClient",
+)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/async_client.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/async_client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/async_client.py
@@ -0,0 +1,1779 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import functools
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+)
+
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.api_core.client_options import ClientOptions
+from google.auth import credentials as ga_credentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.websecurityscanner_v1alpha import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.websecurityscanner_v1alpha.services.web_security_scanner import pagers
+from google.cloud.websecurityscanner_v1alpha.types import scan_run, web_security_scanner
+from google.cloud.websecurityscanner_v1alpha.types import (
+ crawled_url,
+ finding,
+ finding_addon,
+ finding_type_stats,
+)
+from google.cloud.websecurityscanner_v1alpha.types import scan_config as gcw_scan_config
+from google.cloud.websecurityscanner_v1alpha.types import scan_config
+
+from .client import WebSecurityScannerClient
+from .transports.base import DEFAULT_CLIENT_INFO, WebSecurityScannerTransport
+from .transports.grpc_asyncio import WebSecurityScannerGrpcAsyncIOTransport
+
+
+class WebSecurityScannerAsyncClient:
+ """Cloud Web Security Scanner Service identifies security
+ vulnerabilities in web applications hosted on Google Cloud
+ Platform. It crawls your application, and attempts to exercise
+ as many user inputs and event handlers as possible.
+ """
+
+ _client: WebSecurityScannerClient
+
+ DEFAULT_ENDPOINT = WebSecurityScannerClient.DEFAULT_ENDPOINT
+ DEFAULT_MTLS_ENDPOINT = WebSecurityScannerClient.DEFAULT_MTLS_ENDPOINT
+
+ finding_path = staticmethod(WebSecurityScannerClient.finding_path)
+ parse_finding_path = staticmethod(WebSecurityScannerClient.parse_finding_path)
+ scan_config_path = staticmethod(WebSecurityScannerClient.scan_config_path)
+ parse_scan_config_path = staticmethod(
+ WebSecurityScannerClient.parse_scan_config_path
+ )
+ scan_run_path = staticmethod(WebSecurityScannerClient.scan_run_path)
+ parse_scan_run_path = staticmethod(WebSecurityScannerClient.parse_scan_run_path)
+ common_billing_account_path = staticmethod(
+ WebSecurityScannerClient.common_billing_account_path
+ )
+ parse_common_billing_account_path = staticmethod(
+ WebSecurityScannerClient.parse_common_billing_account_path
+ )
+ common_folder_path = staticmethod(WebSecurityScannerClient.common_folder_path)
+ parse_common_folder_path = staticmethod(
+ WebSecurityScannerClient.parse_common_folder_path
+ )
+ common_organization_path = staticmethod(
+ WebSecurityScannerClient.common_organization_path
+ )
+ parse_common_organization_path = staticmethod(
+ WebSecurityScannerClient.parse_common_organization_path
+ )
+ common_project_path = staticmethod(WebSecurityScannerClient.common_project_path)
+ parse_common_project_path = staticmethod(
+ WebSecurityScannerClient.parse_common_project_path
+ )
+ common_location_path = staticmethod(WebSecurityScannerClient.common_location_path)
+ parse_common_location_path = staticmethod(
+ WebSecurityScannerClient.parse_common_location_path
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ WebSecurityScannerAsyncClient: The constructed client.
+ """
+ return WebSecurityScannerClient.from_service_account_info.__func__(WebSecurityScannerAsyncClient, info, *args, **kwargs) # type: ignore
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ WebSecurityScannerAsyncClient: The constructed client.
+ """
+ return WebSecurityScannerClient.from_service_account_file.__func__(WebSecurityScannerAsyncClient, filename, *args, **kwargs) # type: ignore
+
+ from_service_account_json = from_service_account_file
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ return WebSecurityScannerClient.get_mtls_endpoint_and_cert_source(client_options) # type: ignore
+
+ @property
+ def transport(self) -> WebSecurityScannerTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ WebSecurityScannerTransport: The transport used by the client instance.
+ """
+ return self._client.transport
+
+ get_transport_class = functools.partial(
+ type(WebSecurityScannerClient).get_transport_class,
+ type(WebSecurityScannerClient),
+ )
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Union[str, WebSecurityScannerTransport] = "grpc_asyncio",
+ client_options: Optional[ClientOptions] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the web security scanner client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, ~.WebSecurityScannerTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (ClientOptions): Custom options for the client. It
+ won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+
+ Raises:
+ google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ self._client = WebSecurityScannerClient(
+ credentials=credentials,
+ transport=transport,
+ client_options=client_options,
+ client_info=client_info,
+ )
+
+ async def create_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.CreateScanConfigRequest, dict]
+ ] = None,
+ *,
+ parent: Optional[str] = None,
+ scan_config: Optional[gcw_scan_config.ScanConfig] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> gcw_scan_config.ScanConfig:
+ r"""Creates a new ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ async def sample_create_scan_config():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ scan_config = websecurityscanner_v1alpha.ScanConfig()
+ scan_config.display_name = "display_name_value"
+ scan_config.starting_urls = ['starting_urls_value1', 'starting_urls_value2']
+
+ request = websecurityscanner_v1alpha.CreateScanConfigRequest(
+ parent="parent_value",
+ scan_config=scan_config,
+ )
+
+ # Make the request
+ response = await client.create_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1alpha.types.CreateScanConfigRequest, dict]]):
+ The request object. Request for the ``CreateScanConfig`` method.
+ parent (:class:`str`):
+ Required. The parent resource name
+ where the scan is created, which should
+ be a project resource name in the format
+ 'projects/{projectId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ scan_config (:class:`google.cloud.websecurityscanner_v1alpha.types.ScanConfig`):
+ Required. The ScanConfig to be
+ created.
+
+ This corresponds to the ``scan_config`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.types.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan. next
+ id: 12
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent, scan_config])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.CreateScanConfigRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+ if scan_config is not None:
+ request.scan_config = scan_config
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.create_scan_config,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def delete_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.DeleteScanConfigRequest, dict]
+ ] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> None:
+ r"""Deletes an existing ScanConfig and its child
+ resources.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ async def sample_delete_scan_config():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.DeleteScanConfigRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ await client.delete_scan_config(request=request)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1alpha.types.DeleteScanConfigRequest, dict]]):
+ The request object. Request for the ``DeleteScanConfig`` method.
+ name (:class:`str`):
+ Required. The resource name of the
+ ScanConfig to be deleted. The name
+ follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.DeleteScanConfigRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.delete_scan_config,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ async def get_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.GetScanConfigRequest, dict]
+ ] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_config.ScanConfig:
+ r"""Gets a ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ async def sample_get_scan_config():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.GetScanConfigRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1alpha.types.GetScanConfigRequest, dict]]):
+ The request object. Request for the ``GetScanConfig`` method.
+ name (:class:`str`):
+ Required. The resource name of the
+ ScanConfig to be returned. The name
+ follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.types.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan. next
+ id: 12
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.GetScanConfigRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_scan_config,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_scan_configs(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListScanConfigsRequest, dict]
+ ] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListScanConfigsAsyncPager:
+ r"""Lists ScanConfigs under a given project.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ async def sample_list_scan_configs():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.ListScanConfigsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_scan_configs(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1alpha.types.ListScanConfigsRequest, dict]]):
+ The request object. Request for the ``ListScanConfigs`` method.
+ parent (:class:`str`):
+ Required. The parent resource name,
+ which should be a project resource name
+ in the format 'projects/{projectId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.services.web_security_scanner.pagers.ListScanConfigsAsyncPager:
+ Response for the ListScanConfigs method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.ListScanConfigsRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_scan_configs,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListScanConfigsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def update_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.UpdateScanConfigRequest, dict]
+ ] = None,
+ *,
+ scan_config: Optional[gcw_scan_config.ScanConfig] = None,
+ update_mask: Optional[field_mask_pb2.FieldMask] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> gcw_scan_config.ScanConfig:
+        r"""Updates a ScanConfig. This method supports partial
+        update of a ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ async def sample_update_scan_config():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ scan_config = websecurityscanner_v1alpha.ScanConfig()
+ scan_config.display_name = "display_name_value"
+ scan_config.starting_urls = ['starting_urls_value1', 'starting_urls_value2']
+
+ request = websecurityscanner_v1alpha.UpdateScanConfigRequest(
+ scan_config=scan_config,
+ )
+
+ # Make the request
+ response = await client.update_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1alpha.types.UpdateScanConfigRequest, dict]]):
+ The request object. Request for the ``UpdateScanConfigRequest`` method.
+ scan_config (:class:`google.cloud.websecurityscanner_v1alpha.types.ScanConfig`):
+ Required. The ScanConfig to be
+ updated. The name field must be set to
+ identify the resource to be updated. The
+ values of fields not covered by the mask
+ will be ignored.
+
+ This corresponds to the ``scan_config`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`):
+ Required. The update mask applies to the resource. For
+ the ``FieldMask`` definition, see
+ https://developers.google.com/protocol-buffers/docs/reference/google.protobuf#fieldmask
+
+ This corresponds to the ``update_mask`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.types.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan. next
+ id: 12
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([scan_config, update_mask])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.UpdateScanConfigRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if scan_config is not None:
+ request.scan_config = scan_config
+ if update_mask is not None:
+ request.update_mask = update_mask
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.update_scan_config,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("scan_config.name", request.scan_config.name),)
+ ),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def start_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.StartScanRunRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Start a ScanRun according to the given ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ async def sample_start_scan_run():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.StartScanRunRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.start_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1alpha.types.StartScanRunRequest, dict]]):
+ The request object. Request for the ``StartScanRun`` method.
+ name (:class:`str`):
+ Required. The resource name of the
+ ScanConfig to be used. The name follows
+ the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.types.ScanRun:
+                A ScanRun is an output-only resource
+ representing an actual run of the scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.StartScanRunRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.start_scan_run,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.GetScanRunRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Gets a ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ async def sample_get_scan_run():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.GetScanRunRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1alpha.types.GetScanRunRequest, dict]]):
+ The request object. Request for the ``GetScanRun`` method.
+ name (:class:`str`):
+ Required. The resource name of the
+ ScanRun to be returned. The name follows
+ the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.types.ScanRun:
+                A ScanRun is an output-only resource
+ representing an actual run of the scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.GetScanRunRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_scan_run,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_scan_runs(
+ self,
+ request: Optional[Union[web_security_scanner.ListScanRunsRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListScanRunsAsyncPager:
+ r"""Lists ScanRuns under a given ScanConfig, in
+ descending order of ScanRun stop time.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ async def sample_list_scan_runs():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.ListScanRunsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_scan_runs(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1alpha.types.ListScanRunsRequest, dict]]):
+ The request object. Request for the ``ListScanRuns`` method.
+ parent (:class:`str`):
+ Required. The parent resource name,
+ which should be a scan resource name in
+ the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.services.web_security_scanner.pagers.ListScanRunsAsyncPager:
+ Response for the ListScanRuns method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.ListScanRunsRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_scan_runs,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListScanRunsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def stop_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.StopScanRunRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Stops a ScanRun. The stopped ScanRun is returned.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ async def sample_stop_scan_run():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.StopScanRunRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.stop_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1alpha.types.StopScanRunRequest, dict]]):
+ The request object. Request for the ``StopScanRun`` method.
+ name (:class:`str`):
+ Required. The resource name of the
+ ScanRun to be stopped. The name follows
+ the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.types.ScanRun:
+                A ScanRun is an output-only resource
+ representing an actual run of the scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.StopScanRunRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.stop_scan_run,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_crawled_urls(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListCrawledUrlsRequest, dict]
+ ] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListCrawledUrlsAsyncPager:
+ r"""List CrawledUrls under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ async def sample_list_crawled_urls():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.ListCrawledUrlsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_crawled_urls(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1alpha.types.ListCrawledUrlsRequest, dict]]):
+ The request object. Request for the ``ListCrawledUrls`` method.
+ parent (:class:`str`):
+ Required. The parent resource name,
+ which should be a scan run resource name
+ in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.services.web_security_scanner.pagers.ListCrawledUrlsAsyncPager:
+ Response for the ListCrawledUrls method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.ListCrawledUrlsRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_crawled_urls,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListCrawledUrlsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_finding(
+ self,
+ request: Optional[Union[web_security_scanner.GetFindingRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> finding.Finding:
+ r"""Gets a Finding.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ async def sample_get_finding():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.GetFindingRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_finding(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1alpha.types.GetFindingRequest, dict]]):
+ The request object. Request for the ``GetFinding`` method.
+ name (:class:`str`):
+ Required. The resource name of the
+ Finding to be returned. The name follows
+ the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}/findings/{findingId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.types.Finding:
+ A Finding resource represents a
+ vulnerability instance identified during
+ a ScanRun.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.GetFindingRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_finding,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_findings(
+ self,
+ request: Optional[Union[web_security_scanner.ListFindingsRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ filter: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListFindingsAsyncPager:
+ r"""List Findings under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ async def sample_list_findings():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.ListFindingsRequest(
+ parent="parent_value",
+ filter="filter_value",
+ )
+
+ # Make the request
+ page_result = client.list_findings(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1alpha.types.ListFindingsRequest, dict]]):
+ The request object. Request for the ``ListFindings`` method.
+ parent (:class:`str`):
+ Required. The parent resource name,
+ which should be a scan run resource name
+ in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ filter (:class:`str`):
+                Required. The filter expression. The expression must be
+                in the format: ``<field> <operator> <value>``. Supported
+                field: 'finding_type'. Supported operator: '='.
+
+ This corresponds to the ``filter`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.services.web_security_scanner.pagers.ListFindingsAsyncPager:
+ Response for the ListFindings method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent, filter])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.ListFindingsRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+ if filter is not None:
+ request.filter = filter
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_findings,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListFindingsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_finding_type_stats(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListFindingTypeStatsRequest, dict]
+ ] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListFindingTypeStatsResponse:
+ r"""List all FindingTypeStats under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ async def sample_list_finding_type_stats():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.ListFindingTypeStatsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ response = await client.list_finding_type_stats(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1alpha.types.ListFindingTypeStatsRequest, dict]]):
+ The request object. Request for the ``ListFindingTypeStats`` method.
+ parent (:class:`str`):
+ Required. The parent resource name,
+ which should be a scan run resource name
+ in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.types.ListFindingTypeStatsResponse:
+ Response for the ListFindingTypeStats method.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.ListFindingTypeStatsRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_finding_type_stats,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def __aenter__(self):
+ return self
+
+ async def __aexit__(self, exc_type, exc, tb):
+ await self.transport.close()
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("WebSecurityScannerAsyncClient",)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/client.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/client.py
@@ -0,0 +1,1963 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import os
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+ cast,
+)
+
+from google.api_core import client_options as client_options_lib
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.exceptions import MutualTLSChannelError # type: ignore
+from google.auth.transport import mtls # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.websecurityscanner_v1alpha import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.websecurityscanner_v1alpha.services.web_security_scanner import pagers
+from google.cloud.websecurityscanner_v1alpha.types import scan_run, web_security_scanner
+from google.cloud.websecurityscanner_v1alpha.types import (
+ crawled_url,
+ finding,
+ finding_addon,
+ finding_type_stats,
+)
+from google.cloud.websecurityscanner_v1alpha.types import scan_config as gcw_scan_config
+from google.cloud.websecurityscanner_v1alpha.types import scan_config
+
+from .transports.base import DEFAULT_CLIENT_INFO, WebSecurityScannerTransport
+from .transports.grpc import WebSecurityScannerGrpcTransport
+from .transports.grpc_asyncio import WebSecurityScannerGrpcAsyncIOTransport
+from .transports.rest import WebSecurityScannerRestTransport
+
+
+class WebSecurityScannerClientMeta(type):
+ """Metaclass for the WebSecurityScanner client.
+
+ This provides class-level methods for building and retrieving
+ support objects (e.g. transport) without polluting the client instance
+ objects.
+ """
+
+ _transport_registry = (
+ OrderedDict()
+ ) # type: Dict[str, Type[WebSecurityScannerTransport]]
+ _transport_registry["grpc"] = WebSecurityScannerGrpcTransport
+ _transport_registry["grpc_asyncio"] = WebSecurityScannerGrpcAsyncIOTransport
+ _transport_registry["rest"] = WebSecurityScannerRestTransport
+
+ def get_transport_class(
+ cls,
+ label: Optional[str] = None,
+ ) -> Type[WebSecurityScannerTransport]:
+ """Returns an appropriate transport class.
+
+ Args:
+ label: The name of the desired transport. If none is
+ provided, then the first transport in the registry is used.
+
+ Returns:
+ The transport class to use.
+ """
+ # If a specific transport is requested, return that one.
+ if label:
+ return cls._transport_registry[label]
+
+ # No transport is requested; return the default (that is, the first one
+ # in the dictionary).
+ return next(iter(cls._transport_registry.values()))
+
+
+class WebSecurityScannerClient(metaclass=WebSecurityScannerClientMeta):
+ """Cloud Web Security Scanner Service identifies security
+ vulnerabilities in web applications hosted on Google Cloud
+ Platform. It crawls your application, and attempts to exercise
+ as many user inputs and event handlers as possible.
+ """
+
+ @staticmethod
+ def _get_default_mtls_endpoint(api_endpoint):
+ """Converts api endpoint to mTLS endpoint.
+
+ Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to
+ "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively.
+ Args:
+ api_endpoint (Optional[str]): the api endpoint to convert.
+ Returns:
+ str: converted mTLS api endpoint.
+ """
+ if not api_endpoint:
+ return api_endpoint
+
+ mtls_endpoint_re = re.compile(
+ r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?"
+ )
+
+ m = mtls_endpoint_re.match(api_endpoint)
+ name, mtls, sandbox, googledomain = m.groups()
+ if mtls or not googledomain:
+ return api_endpoint
+
+ if sandbox:
+ return api_endpoint.replace(
+ "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
+ )
+
+ return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")
+
+ DEFAULT_ENDPOINT = "websecurityscanner.googleapis.com"
+ DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__( # type: ignore
+ DEFAULT_ENDPOINT
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ WebSecurityScannerClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_info(info)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ WebSecurityScannerClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_file(filename)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ from_service_account_json = from_service_account_file
+
+ @property
+ def transport(self) -> WebSecurityScannerTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ WebSecurityScannerTransport: The transport used by the client
+ instance.
+ """
+ return self._transport
+
+ @staticmethod
+ def finding_path(
+ project: str,
+ scan_config: str,
+ scan_run: str,
+ finding: str,
+ ) -> str:
+ """Returns a fully-qualified finding string."""
+ return "projects/{project}/scanConfigs/{scan_config}/scanRuns/{scan_run}/findings/{finding}".format(
+ project=project,
+ scan_config=scan_config,
+ scan_run=scan_run,
+ finding=finding,
+ )
+
+ @staticmethod
+ def parse_finding_path(path: str) -> Dict[str, str]:
+ """Parses a finding path into its component segments."""
+ m = re.match(
+ r"^projects/(?P<project>.+?)/scanConfigs/(?P<scan_config>.+?)/scanRuns/(?P<scan_run>.+?)/findings/(?P<finding>.+?)$",
+ path,
+ )
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def scan_config_path(
+ project: str,
+ scan_config: str,
+ ) -> str:
+ """Returns a fully-qualified scan_config string."""
+ return "projects/{project}/scanConfigs/{scan_config}".format(
+ project=project,
+ scan_config=scan_config,
+ )
+
+ @staticmethod
+ def parse_scan_config_path(path: str) -> Dict[str, str]:
+ """Parses a scan_config path into its component segments."""
+ m = re.match(
+ r"^projects/(?P<project>.+?)/scanConfigs/(?P<scan_config>.+?)$", path
+ )
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def scan_run_path(
+ project: str,
+ scan_config: str,
+ scan_run: str,
+ ) -> str:
+ """Returns a fully-qualified scan_run string."""
+ return (
+ "projects/{project}/scanConfigs/{scan_config}/scanRuns/{scan_run}".format(
+ project=project,
+ scan_config=scan_config,
+ scan_run=scan_run,
+ )
+ )
+
+ @staticmethod
+ def parse_scan_run_path(path: str) -> Dict[str, str]:
+ """Parses a scan_run path into its component segments."""
+ m = re.match(
+ r"^projects/(?P<project>.+?)/scanConfigs/(?P<scan_config>.+?)/scanRuns/(?P<scan_run>.+?)$",
+ path,
+ )
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_billing_account_path(
+ billing_account: str,
+ ) -> str:
+ """Returns a fully-qualified billing_account string."""
+ return "billingAccounts/{billing_account}".format(
+ billing_account=billing_account,
+ )
+
+ @staticmethod
+ def parse_common_billing_account_path(path: str) -> Dict[str, str]:
+ """Parse a billing_account path into its component segments."""
+ m = re.match(r"^billingAccounts/(?P<billing_account>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_folder_path(
+ folder: str,
+ ) -> str:
+ """Returns a fully-qualified folder string."""
+ return "folders/{folder}".format(
+ folder=folder,
+ )
+
+ @staticmethod
+ def parse_common_folder_path(path: str) -> Dict[str, str]:
+ """Parse a folder path into its component segments."""
+ m = re.match(r"^folders/(?P<folder>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_organization_path(
+ organization: str,
+ ) -> str:
+ """Returns a fully-qualified organization string."""
+ return "organizations/{organization}".format(
+ organization=organization,
+ )
+
+ @staticmethod
+ def parse_common_organization_path(path: str) -> Dict[str, str]:
+ """Parse a organization path into its component segments."""
+ m = re.match(r"^organizations/(?P<organization>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_project_path(
+ project: str,
+ ) -> str:
+ """Returns a fully-qualified project string."""
+ return "projects/{project}".format(
+ project=project,
+ )
+
+ @staticmethod
+ def parse_common_project_path(path: str) -> Dict[str, str]:
+ """Parse a project path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_location_path(
+ project: str,
+ location: str,
+ ) -> str:
+ """Returns a fully-qualified location string."""
+ return "projects/{project}/locations/{location}".format(
+ project=project,
+ location=location,
+ )
+
+ @staticmethod
+ def parse_common_location_path(path: str) -> Dict[str, str]:
+ """Parse a location path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[client_options_lib.ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ use_client_cert = os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false")
+ use_mtls_endpoint = os.getenv("GOOGLE_API_USE_MTLS_ENDPOINT", "auto")
+ if use_client_cert not in ("true", "false"):
+ raise ValueError(
+ "Environment variable `GOOGLE_API_USE_CLIENT_CERTIFICATE` must be either `true` or `false`"
+ )
+ if use_mtls_endpoint not in ("auto", "never", "always"):
+ raise MutualTLSChannelError(
+ "Environment variable `GOOGLE_API_USE_MTLS_ENDPOINT` must be `never`, `auto` or `always`"
+ )
+
+ # Figure out the client cert source to use.
+ client_cert_source = None
+ if use_client_cert == "true":
+ if client_options.client_cert_source:
+ client_cert_source = client_options.client_cert_source
+ elif mtls.has_default_client_cert_source():
+ client_cert_source = mtls.default_client_cert_source()
+
+ # Figure out which api endpoint to use.
+ if client_options.api_endpoint is not None:
+ api_endpoint = client_options.api_endpoint
+ elif use_mtls_endpoint == "always" or (
+ use_mtls_endpoint == "auto" and client_cert_source
+ ):
+ api_endpoint = cls.DEFAULT_MTLS_ENDPOINT
+ else:
+ api_endpoint = cls.DEFAULT_ENDPOINT
+
+ return api_endpoint, client_cert_source
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Optional[Union[str, WebSecurityScannerTransport]] = None,
+ client_options: Optional[Union[client_options_lib.ClientOptions, dict]] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the web security scanner client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, WebSecurityScannerTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (Optional[Union[google.api_core.client_options.ClientOptions, dict]]): Custom options for the
+ client. It won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ if isinstance(client_options, dict):
+ client_options = client_options_lib.from_dict(client_options)
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ client_options = cast(client_options_lib.ClientOptions, client_options)
+
+ api_endpoint, client_cert_source_func = self.get_mtls_endpoint_and_cert_source(
+ client_options
+ )
+
+ api_key_value = getattr(client_options, "api_key", None)
+ if api_key_value and credentials:
+ raise ValueError(
+ "client_options.api_key and credentials are mutually exclusive"
+ )
+
+ # Save or instantiate the transport.
+ # Ordinarily, we provide the transport, but allowing a custom transport
+ # instance provides an extensibility point for unusual situations.
+ if isinstance(transport, WebSecurityScannerTransport):
+ # transport is a WebSecurityScannerTransport instance.
+ if credentials or client_options.credentials_file or api_key_value:
+ raise ValueError(
+ "When providing a transport instance, "
+ "provide its credentials directly."
+ )
+ if client_options.scopes:
+ raise ValueError(
+ "When providing a transport instance, provide its scopes "
+ "directly."
+ )
+ self._transport = transport
+ else:
+ import google.auth._default # type: ignore
+
+ if api_key_value and hasattr(
+ google.auth._default, "get_api_key_credentials"
+ ):
+ credentials = google.auth._default.get_api_key_credentials(
+ api_key_value
+ )
+
+ Transport = type(self).get_transport_class(transport)
+ self._transport = Transport(
+ credentials=credentials,
+ credentials_file=client_options.credentials_file,
+ host=api_endpoint,
+ scopes=client_options.scopes,
+ client_cert_source_for_mtls=client_cert_source_func,
+ quota_project_id=client_options.quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=True,
+ api_audience=client_options.api_audience,
+ )
+
+ def create_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.CreateScanConfigRequest, dict]
+ ] = None,
+ *,
+ parent: Optional[str] = None,
+ scan_config: Optional[gcw_scan_config.ScanConfig] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> gcw_scan_config.ScanConfig:
+ r"""Creates a new ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ def sample_create_scan_config():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ scan_config = websecurityscanner_v1alpha.ScanConfig()
+ scan_config.display_name = "display_name_value"
+ scan_config.starting_urls = ['starting_urls_value1', 'starting_urls_value2']
+
+ request = websecurityscanner_v1alpha.CreateScanConfigRequest(
+ parent="parent_value",
+ scan_config=scan_config,
+ )
+
+ # Make the request
+ response = client.create_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1alpha.types.CreateScanConfigRequest, dict]):
+ The request object. Request for the ``CreateScanConfig`` method.
+ parent (str):
+ Required. The parent resource name
+ where the scan is created, which should
+ be a project resource name in the format
+ 'projects/{projectId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ scan_config (google.cloud.websecurityscanner_v1alpha.types.ScanConfig):
+ Required. The ScanConfig to be
+ created.
+
+ This corresponds to the ``scan_config`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.types.ScanConfig:
+                A ScanConfig resource contains the
+                configurations to launch a scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent, scan_config])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.CreateScanConfigRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.CreateScanConfigRequest):
+ request = web_security_scanner.CreateScanConfigRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+ if scan_config is not None:
+ request.scan_config = scan_config
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.create_scan_config]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def delete_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.DeleteScanConfigRequest, dict]
+ ] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> None:
+ r"""Deletes an existing ScanConfig and its child
+ resources.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ def sample_delete_scan_config():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.DeleteScanConfigRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ client.delete_scan_config(request=request)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1alpha.types.DeleteScanConfigRequest, dict]):
+ The request object. Request for the ``DeleteScanConfig`` method.
+ name (str):
+ Required. The resource name of the
+ ScanConfig to be deleted. The name
+ follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.DeleteScanConfigRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.DeleteScanConfigRequest):
+ request = web_security_scanner.DeleteScanConfigRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.delete_scan_config]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ def get_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.GetScanConfigRequest, dict]
+ ] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_config.ScanConfig:
+ r"""Gets a ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ def sample_get_scan_config():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.GetScanConfigRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1alpha.types.GetScanConfigRequest, dict]):
+ The request object. Request for the ``GetScanConfig`` method.
+ name (str):
+ Required. The resource name of the
+ ScanConfig to be returned. The name
+ follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.types.ScanConfig:
+                A ScanConfig resource contains the
+                configurations to launch a scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.GetScanConfigRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.GetScanConfigRequest):
+ request = web_security_scanner.GetScanConfigRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_scan_config]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_scan_configs(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListScanConfigsRequest, dict]
+ ] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListScanConfigsPager:
+ r"""Lists ScanConfigs under a given project.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ def sample_list_scan_configs():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.ListScanConfigsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_scan_configs(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1alpha.types.ListScanConfigsRequest, dict]):
+ The request object. Request for the ``ListScanConfigs`` method.
+ parent (str):
+ Required. The parent resource name,
+ which should be a project resource name
+ in the format 'projects/{projectId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.services.web_security_scanner.pagers.ListScanConfigsPager:
+ Response for the ListScanConfigs method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.ListScanConfigsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.ListScanConfigsRequest):
+ request = web_security_scanner.ListScanConfigsRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_scan_configs]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListScanConfigsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def update_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.UpdateScanConfigRequest, dict]
+ ] = None,
+ *,
+ scan_config: Optional[gcw_scan_config.ScanConfig] = None,
+ update_mask: Optional[field_mask_pb2.FieldMask] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> gcw_scan_config.ScanConfig:
+ r"""Updates a ScanConfig. This method support partial
+ update of a ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ def sample_update_scan_config():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ scan_config = websecurityscanner_v1alpha.ScanConfig()
+ scan_config.display_name = "display_name_value"
+ scan_config.starting_urls = ['starting_urls_value1', 'starting_urls_value2']
+
+ request = websecurityscanner_v1alpha.UpdateScanConfigRequest(
+ scan_config=scan_config,
+ )
+
+ # Make the request
+ response = client.update_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1alpha.types.UpdateScanConfigRequest, dict]):
+ The request object. Request for the ``UpdateScanConfigRequest`` method.
+ scan_config (google.cloud.websecurityscanner_v1alpha.types.ScanConfig):
+ Required. The ScanConfig to be
+ updated. The name field must be set to
+ identify the resource to be updated. The
+ values of fields not covered by the mask
+ will be ignored.
+
+ This corresponds to the ``scan_config`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Required. The update mask applies to the resource. For
+ the ``FieldMask`` definition, see
+ https://developers.google.com/protocol-buffers/docs/reference/google.protobuf#fieldmask
+
+ This corresponds to the ``update_mask`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.types.ScanConfig:
+                A ScanConfig resource contains the
+                configurations to launch a scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([scan_config, update_mask])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.UpdateScanConfigRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.UpdateScanConfigRequest):
+ request = web_security_scanner.UpdateScanConfigRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if scan_config is not None:
+ request.scan_config = scan_config
+ if update_mask is not None:
+ request.update_mask = update_mask
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.update_scan_config]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("scan_config.name", request.scan_config.name),)
+ ),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def start_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.StartScanRunRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Start a ScanRun according to the given ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ def sample_start_scan_run():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.StartScanRunRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.start_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1alpha.types.StartScanRunRequest, dict]):
+ The request object. Request for the ``StartScanRun`` method.
+ name (str):
+ Required. The resource name of the
+ ScanConfig to be used. The name follows
+ the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.types.ScanRun:
+                A ScanRun is an output-only resource
+ representing an actual run of the scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.StartScanRunRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.StartScanRunRequest):
+ request = web_security_scanner.StartScanRunRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.start_scan_run]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.GetScanRunRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Gets a ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ def sample_get_scan_run():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.GetScanRunRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1alpha.types.GetScanRunRequest, dict]):
+ The request object. Request for the ``GetScanRun`` method.
+ name (str):
+ Required. The resource name of the
+ ScanRun to be returned. The name follows
+ the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.types.ScanRun:
+                A ScanRun is an output-only resource
+ representing an actual run of the scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.GetScanRunRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.GetScanRunRequest):
+ request = web_security_scanner.GetScanRunRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_scan_run]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_scan_runs(
+ self,
+ request: Optional[Union[web_security_scanner.ListScanRunsRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListScanRunsPager:
+ r"""Lists ScanRuns under a given ScanConfig, in
+ descending order of ScanRun stop time.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ def sample_list_scan_runs():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.ListScanRunsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_scan_runs(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1alpha.types.ListScanRunsRequest, dict]):
+ The request object. Request for the ``ListScanRuns`` method.
+ parent (str):
+ Required. The parent resource name,
+ which should be a scan resource name in
+ the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.services.web_security_scanner.pagers.ListScanRunsPager:
+ Response for the ListScanRuns method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.ListScanRunsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.ListScanRunsRequest):
+ request = web_security_scanner.ListScanRunsRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_scan_runs]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListScanRunsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def stop_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.StopScanRunRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Stops a ScanRun. The stopped ScanRun is returned.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ def sample_stop_scan_run():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.StopScanRunRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.stop_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1alpha.types.StopScanRunRequest, dict]):
+ The request object. Request for the ``StopScanRun`` method.
+ name (str):
+ Required. The resource name of the
+ ScanRun to be stopped. The name follows
+ the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.types.ScanRun:
+                A ScanRun is an output-only resource
+ representing an actual run of the scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.StopScanRunRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.StopScanRunRequest):
+ request = web_security_scanner.StopScanRunRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.stop_scan_run]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_crawled_urls(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListCrawledUrlsRequest, dict]
+ ] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListCrawledUrlsPager:
+ r"""List CrawledUrls under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ def sample_list_crawled_urls():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.ListCrawledUrlsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_crawled_urls(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1alpha.types.ListCrawledUrlsRequest, dict]):
+ The request object. Request for the ``ListCrawledUrls`` method.
+ parent (str):
+ Required. The parent resource name,
+ which should be a scan run resource name
+ in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.services.web_security_scanner.pagers.ListCrawledUrlsPager:
+ Response for the ListCrawledUrls method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.ListCrawledUrlsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.ListCrawledUrlsRequest):
+ request = web_security_scanner.ListCrawledUrlsRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_crawled_urls]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListCrawledUrlsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_finding(
+ self,
+ request: Optional[Union[web_security_scanner.GetFindingRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> finding.Finding:
+ r"""Gets a Finding.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ def sample_get_finding():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.GetFindingRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_finding(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1alpha.types.GetFindingRequest, dict]):
+ The request object. Request for the ``GetFinding`` method.
+ name (str):
+ Required. The resource name of the
+ Finding to be returned. The name follows
+ the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}/findings/{findingId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.types.Finding:
+ A Finding resource represents a
+ vulnerability instance identified during
+ a ScanRun.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.GetFindingRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.GetFindingRequest):
+ request = web_security_scanner.GetFindingRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_finding]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_findings(
+ self,
+ request: Optional[Union[web_security_scanner.ListFindingsRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ filter: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListFindingsPager:
+ r"""List Findings under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ def sample_list_findings():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.ListFindingsRequest(
+ parent="parent_value",
+ filter="filter_value",
+ )
+
+ # Make the request
+ page_result = client.list_findings(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1alpha.types.ListFindingsRequest, dict]):
+ The request object. Request for the ``ListFindings`` method.
+ parent (str):
+ Required. The parent resource name,
+ which should be a scan run resource name
+ in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ filter (str):
+ Required. The filter expression. The expression must be
+ in the format: . Supported field: 'finding_type'.
+ Supported operator: '='.
+
+ This corresponds to the ``filter`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.services.web_security_scanner.pagers.ListFindingsPager:
+ Response for the ListFindings method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent, filter])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.ListFindingsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.ListFindingsRequest):
+ request = web_security_scanner.ListFindingsRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+ if filter is not None:
+ request.filter = filter
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_findings]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListFindingsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_finding_type_stats(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListFindingTypeStatsRequest, dict]
+ ] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListFindingTypeStatsResponse:
+ r"""List all FindingTypeStats under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1alpha
+
+ def sample_list_finding_type_stats():
+ # Create a client
+ client = websecurityscanner_v1alpha.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1alpha.ListFindingTypeStatsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ response = client.list_finding_type_stats(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1alpha.types.ListFindingTypeStatsRequest, dict]):
+ The request object. Request for the ``ListFindingTypeStats`` method.
+ parent (str):
+ Required. The parent resource name,
+ which should be a scan run resource name
+ in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1alpha.types.ListFindingTypeStatsResponse:
+ Response for the ListFindingTypeStats method.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.ListFindingTypeStatsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.ListFindingTypeStatsRequest):
+ request = web_security_scanner.ListFindingTypeStatsRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_finding_type_stats]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def __enter__(self) -> "WebSecurityScannerClient":
+ return self
+
+ def __exit__(self, type, value, traceback):
+ """Releases underlying transport's resources.
+
+ .. warning::
+ ONLY use as a context manager if the transport is NOT shared
+ with other clients! Exiting the with block will CLOSE the transport
+ and may cause errors in other clients!
+ """
+ self.transport.close()
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("WebSecurityScannerClient",)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/pagers.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/pagers.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/pagers.py
@@ -0,0 +1,549 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import (
+ Any,
+ AsyncIterator,
+ Awaitable,
+ Callable,
+ Iterator,
+ Optional,
+ Sequence,
+ Tuple,
+)
+
+from google.cloud.websecurityscanner_v1alpha.types import (
+ crawled_url,
+ finding,
+ scan_config,
+ scan_run,
+ web_security_scanner,
+)
+
+
+class ListScanConfigsPager:
+ """A pager for iterating through ``list_scan_configs`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1alpha.types.ListScanConfigsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``scan_configs`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListScanConfigs`` requests and continue to iterate
+ through the ``scan_configs`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1alpha.types.ListScanConfigsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., web_security_scanner.ListScanConfigsResponse],
+ request: web_security_scanner.ListScanConfigsRequest,
+ response: web_security_scanner.ListScanConfigsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1alpha.types.ListScanConfigsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1alpha.types.ListScanConfigsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListScanConfigsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[web_security_scanner.ListScanConfigsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[scan_config.ScanConfig]:
+ for page in self.pages:
+ yield from page.scan_configs
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListScanConfigsAsyncPager:
+ """A pager for iterating through ``list_scan_configs`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1alpha.types.ListScanConfigsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``scan_configs`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListScanConfigs`` requests and continue to iterate
+ through the ``scan_configs`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1alpha.types.ListScanConfigsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[web_security_scanner.ListScanConfigsResponse]],
+ request: web_security_scanner.ListScanConfigsRequest,
+ response: web_security_scanner.ListScanConfigsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1alpha.types.ListScanConfigsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1alpha.types.ListScanConfigsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListScanConfigsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(
+ self,
+ ) -> AsyncIterator[web_security_scanner.ListScanConfigsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[scan_config.ScanConfig]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.scan_configs:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListScanRunsPager:
+ """A pager for iterating through ``list_scan_runs`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1alpha.types.ListScanRunsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``scan_runs`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListScanRuns`` requests and continue to iterate
+ through the ``scan_runs`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1alpha.types.ListScanRunsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., web_security_scanner.ListScanRunsResponse],
+ request: web_security_scanner.ListScanRunsRequest,
+ response: web_security_scanner.ListScanRunsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1alpha.types.ListScanRunsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1alpha.types.ListScanRunsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListScanRunsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[web_security_scanner.ListScanRunsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[scan_run.ScanRun]:
+ for page in self.pages:
+ yield from page.scan_runs
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListScanRunsAsyncPager:
+ """A pager for iterating through ``list_scan_runs`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1alpha.types.ListScanRunsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``scan_runs`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListScanRuns`` requests and continue to iterate
+ through the ``scan_runs`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1alpha.types.ListScanRunsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[web_security_scanner.ListScanRunsResponse]],
+ request: web_security_scanner.ListScanRunsRequest,
+ response: web_security_scanner.ListScanRunsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1alpha.types.ListScanRunsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1alpha.types.ListScanRunsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListScanRunsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[web_security_scanner.ListScanRunsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[scan_run.ScanRun]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.scan_runs:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListCrawledUrlsPager:
+ """A pager for iterating through ``list_crawled_urls`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1alpha.types.ListCrawledUrlsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``crawled_urls`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListCrawledUrls`` requests and continue to iterate
+ through the ``crawled_urls`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1alpha.types.ListCrawledUrlsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., web_security_scanner.ListCrawledUrlsResponse],
+ request: web_security_scanner.ListCrawledUrlsRequest,
+ response: web_security_scanner.ListCrawledUrlsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1alpha.types.ListCrawledUrlsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1alpha.types.ListCrawledUrlsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListCrawledUrlsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[web_security_scanner.ListCrawledUrlsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[crawled_url.CrawledUrl]:
+ for page in self.pages:
+ yield from page.crawled_urls
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListCrawledUrlsAsyncPager:
+ """A pager for iterating through ``list_crawled_urls`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1alpha.types.ListCrawledUrlsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``crawled_urls`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListCrawledUrls`` requests and continue to iterate
+ through the ``crawled_urls`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1alpha.types.ListCrawledUrlsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[web_security_scanner.ListCrawledUrlsResponse]],
+ request: web_security_scanner.ListCrawledUrlsRequest,
+ response: web_security_scanner.ListCrawledUrlsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1alpha.types.ListCrawledUrlsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1alpha.types.ListCrawledUrlsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListCrawledUrlsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(
+ self,
+ ) -> AsyncIterator[web_security_scanner.ListCrawledUrlsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[crawled_url.CrawledUrl]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.crawled_urls:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListFindingsPager:
+ """A pager for iterating through ``list_findings`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1alpha.types.ListFindingsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``findings`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListFindings`` requests and continue to iterate
+ through the ``findings`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1alpha.types.ListFindingsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., web_security_scanner.ListFindingsResponse],
+ request: web_security_scanner.ListFindingsRequest,
+ response: web_security_scanner.ListFindingsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1alpha.types.ListFindingsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1alpha.types.ListFindingsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListFindingsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[web_security_scanner.ListFindingsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[finding.Finding]:
+ for page in self.pages:
+ yield from page.findings
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListFindingsAsyncPager:
+ """A pager for iterating through ``list_findings`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1alpha.types.ListFindingsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``findings`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListFindings`` requests and continue to iterate
+ through the ``findings`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1alpha.types.ListFindingsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[web_security_scanner.ListFindingsResponse]],
+ request: web_security_scanner.ListFindingsRequest,
+ response: web_security_scanner.ListFindingsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiates the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1alpha.types.ListFindingsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1alpha.types.ListFindingsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListFindingsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[web_security_scanner.ListFindingsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[finding.Finding]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.findings:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/transports/__init__.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/transports/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/transports/__init__.py
@@ -0,0 +1,38 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+from typing import Dict, Type
+
+from .base import WebSecurityScannerTransport
+from .grpc import WebSecurityScannerGrpcTransport
+from .grpc_asyncio import WebSecurityScannerGrpcAsyncIOTransport
+from .rest import WebSecurityScannerRestInterceptor, WebSecurityScannerRestTransport
+
+# Compile a registry of transports.
+_transport_registry = (
+ OrderedDict()
+) # type: Dict[str, Type[WebSecurityScannerTransport]]
+_transport_registry["grpc"] = WebSecurityScannerGrpcTransport
+_transport_registry["grpc_asyncio"] = WebSecurityScannerGrpcAsyncIOTransport
+_transport_registry["rest"] = WebSecurityScannerRestTransport
+
+__all__ = (
+ "WebSecurityScannerTransport",
+ "WebSecurityScannerGrpcTransport",
+ "WebSecurityScannerGrpcAsyncIOTransport",
+ "WebSecurityScannerRestTransport",
+ "WebSecurityScannerRestInterceptor",
+)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/transports/base.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/transports/base.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/transports/base.py
@@ -0,0 +1,432 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import abc
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Union
+
+import google.api_core
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+from google.protobuf import empty_pb2 # type: ignore
+
+from google.cloud.websecurityscanner_v1alpha import gapic_version as package_version
+from google.cloud.websecurityscanner_v1alpha.types import scan_run, web_security_scanner
+from google.cloud.websecurityscanner_v1alpha.types import scan_config as gcw_scan_config
+from google.cloud.websecurityscanner_v1alpha.types import finding
+from google.cloud.websecurityscanner_v1alpha.types import scan_config
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+class WebSecurityScannerTransport(abc.ABC):
+ """Abstract transport class for WebSecurityScanner."""
+
+ AUTH_SCOPES = ("https://www.googleapis.com/auth/cloud-platform",)
+
+ DEFAULT_HOST: str = "websecurityscanner.googleapis.com"
+
+ def __init__(
+ self,
+ *,
+ host: str = DEFAULT_HOST,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ **kwargs,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+ scopes (Optional[Sequence[str]]): A list of scopes.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ """
+
+ scopes_kwargs = {"scopes": scopes, "default_scopes": self.AUTH_SCOPES}
+
+ # Save the scopes.
+ self._scopes = scopes
+
+ # If no credentials are provided, then determine the appropriate
+ # defaults.
+ if credentials and credentials_file:
+ raise core_exceptions.DuplicateCredentialArgs(
+ "'credentials_file' and 'credentials' are mutually exclusive"
+ )
+
+ if credentials_file is not None:
+ credentials, _ = google.auth.load_credentials_from_file(
+ credentials_file, **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ elif credentials is None:
+ credentials, _ = google.auth.default(
+ **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ # Don't apply audience if the credentials file passed from user.
+ if hasattr(credentials, "with_gdch_audience"):
+ credentials = credentials.with_gdch_audience(
+ api_audience if api_audience else host
+ )
+
+ # If the credentials are service account credentials, then always try to use self signed JWT.
+ if (
+ always_use_jwt_access
+ and isinstance(credentials, service_account.Credentials)
+ and hasattr(service_account.Credentials, "with_always_use_jwt_access")
+ ):
+ credentials = credentials.with_always_use_jwt_access(True)
+
+ # Save the credentials.
+ self._credentials = credentials
+
+ # Save the hostname. Default to port 443 (HTTPS) if none is specified.
+ if ":" not in host:
+ host += ":443"
+ self._host = host
+
+ def _prep_wrapped_messages(self, client_info):
+ # Precompute the wrapped methods.
+ self._wrapped_methods = {
+ self.create_scan_config: gapic_v1.method.wrap_method(
+ self.create_scan_config,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.delete_scan_config: gapic_v1.method.wrap_method(
+ self.delete_scan_config,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.get_scan_config: gapic_v1.method.wrap_method(
+ self.get_scan_config,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.list_scan_configs: gapic_v1.method.wrap_method(
+ self.list_scan_configs,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.update_scan_config: gapic_v1.method.wrap_method(
+ self.update_scan_config,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.start_scan_run: gapic_v1.method.wrap_method(
+ self.start_scan_run,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.get_scan_run: gapic_v1.method.wrap_method(
+ self.get_scan_run,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.list_scan_runs: gapic_v1.method.wrap_method(
+ self.list_scan_runs,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.stop_scan_run: gapic_v1.method.wrap_method(
+ self.stop_scan_run,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.list_crawled_urls: gapic_v1.method.wrap_method(
+ self.list_crawled_urls,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.get_finding: gapic_v1.method.wrap_method(
+ self.get_finding,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.list_findings: gapic_v1.method.wrap_method(
+ self.list_findings,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.list_finding_type_stats: gapic_v1.method.wrap_method(
+ self.list_finding_type_stats,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ }
+
+ def close(self):
+ """Closes resources associated with the transport.
+
+ .. warning::
+ Only call this method if the transport is NOT shared
+ with other clients - this may cause errors in other clients!
+ """
+ raise NotImplementedError()
+
+ @property
+ def create_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.CreateScanConfigRequest],
+ Union[gcw_scan_config.ScanConfig, Awaitable[gcw_scan_config.ScanConfig]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def delete_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.DeleteScanConfigRequest],
+ Union[empty_pb2.Empty, Awaitable[empty_pb2.Empty]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.GetScanConfigRequest],
+ Union[scan_config.ScanConfig, Awaitable[scan_config.ScanConfig]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_scan_configs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanConfigsRequest],
+ Union[
+ web_security_scanner.ListScanConfigsResponse,
+ Awaitable[web_security_scanner.ListScanConfigsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def update_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.UpdateScanConfigRequest],
+ Union[gcw_scan_config.ScanConfig, Awaitable[gcw_scan_config.ScanConfig]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def start_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.StartScanRunRequest],
+ Union[scan_run.ScanRun, Awaitable[scan_run.ScanRun]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.GetScanRunRequest],
+ Union[scan_run.ScanRun, Awaitable[scan_run.ScanRun]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_scan_runs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanRunsRequest],
+ Union[
+ web_security_scanner.ListScanRunsResponse,
+ Awaitable[web_security_scanner.ListScanRunsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def stop_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.StopScanRunRequest],
+ Union[scan_run.ScanRun, Awaitable[scan_run.ScanRun]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_crawled_urls(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListCrawledUrlsRequest],
+ Union[
+ web_security_scanner.ListCrawledUrlsResponse,
+ Awaitable[web_security_scanner.ListCrawledUrlsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_finding(
+ self,
+ ) -> Callable[
+ [web_security_scanner.GetFindingRequest],
+ Union[finding.Finding, Awaitable[finding.Finding]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_findings(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingsRequest],
+ Union[
+ web_security_scanner.ListFindingsResponse,
+ Awaitable[web_security_scanner.ListFindingsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_finding_type_stats(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingTypeStatsRequest],
+ Union[
+ web_security_scanner.ListFindingTypeStatsResponse,
+ Awaitable[web_security_scanner.ListFindingTypeStatsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def kind(self) -> str:
+ raise NotImplementedError()
+
+
+__all__ = ("WebSecurityScannerTransport",)
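The wrapped methods at the top of this file configure each RPC with a `google.api_core` `Retry` using `initial=0.1`, `maximum=60.0`, and `multiplier=1.3`: delays between attempts grow geometrically and are capped at the maximum. A minimal sketch of that delay schedule (a simplified model; the real `Retry` also applies random jitter and honors the overall deadline):

```python
from typing import Iterator


def retry_delays(
    initial: float, maximum: float, multiplier: float, attempts: int
) -> Iterator[float]:
    """Yield the capped geometric backoff delays between retry attempts.

    Simplified model of google.api_core.retry.Retry: the real
    implementation additionally randomizes each delay (jitter) and
    stops retrying once the configured deadline has elapsed.
    """
    delay = initial
    for _ in range(attempts):
        yield min(delay, maximum)
        delay *= multiplier


# With the parameters used above, the first few delays are roughly
# 0.1, 0.13, 0.169, ... seconds, never exceeding 60 seconds.
delays = list(retry_delays(0.1, 60.0, 1.3, 5))
```

This shows why transient `DeadlineExceeded` or `ServiceUnavailable` errors are retried cheaply at first, with pressure on the server easing as delays grow.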
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/transports/grpc.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/transports/grpc.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/transports/grpc.py
@@ -0,0 +1,606 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.protobuf import empty_pb2 # type: ignore
+import grpc # type: ignore
+
+from google.cloud.websecurityscanner_v1alpha.types import scan_run, web_security_scanner
+from google.cloud.websecurityscanner_v1alpha.types import scan_config as gcw_scan_config
+from google.cloud.websecurityscanner_v1alpha.types import finding
+from google.cloud.websecurityscanner_v1alpha.types import scan_config
+
+from .base import DEFAULT_CLIENT_INFO, WebSecurityScannerTransport
+
+
+class WebSecurityScannerGrpcTransport(WebSecurityScannerTransport):
+ """gRPC backend transport for WebSecurityScanner.
+
+ Cloud Web Security Scanner Service identifies security
+ vulnerabilities in web applications hosted on Google Cloud
+ Platform. It crawls your application, and attempts to exercise
+ as many user inputs and event handlers as possible.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _stubs: Dict[str, Callable]
+
+ def __init__(
+ self,
+ *,
+ host: str = "websecurityscanner.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[grpc.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+ scopes (Optional[Sequence[str]]): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ channel (Optional[grpc.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "websecurityscanner.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> grpc.Channel:
+ """Create and return a gRPC channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+ scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ grpc.Channel: A gRPC channel object.
+
+ Raises:
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+
+ return grpc_helpers.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ @property
+ def grpc_channel(self) -> grpc.Channel:
+ """Return the channel designed to connect to this service."""
+ return self._grpc_channel
+
+ @property
+ def create_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.CreateScanConfigRequest], gcw_scan_config.ScanConfig
+ ]:
+ r"""Return a callable for the create scan config method over gRPC.
+
+ Creates a new ScanConfig.
+
+ Returns:
+ Callable[[~.CreateScanConfigRequest],
+ ~.ScanConfig]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_scan_config" not in self._stubs:
+ self._stubs["create_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/CreateScanConfig",
+ request_serializer=web_security_scanner.CreateScanConfigRequest.serialize,
+ response_deserializer=gcw_scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["create_scan_config"]
+
+ @property
+ def delete_scan_config(
+ self,
+ ) -> Callable[[web_security_scanner.DeleteScanConfigRequest], empty_pb2.Empty]:
+ r"""Return a callable for the delete scan config method over gRPC.
+
+ Deletes an existing ScanConfig and its child
+ resources.
+
+ Returns:
+ Callable[[~.DeleteScanConfigRequest],
+ ~.Empty]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_scan_config" not in self._stubs:
+ self._stubs["delete_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/DeleteScanConfig",
+ request_serializer=web_security_scanner.DeleteScanConfigRequest.serialize,
+ response_deserializer=empty_pb2.Empty.FromString,
+ )
+ return self._stubs["delete_scan_config"]
+
+ @property
+ def get_scan_config(
+ self,
+ ) -> Callable[[web_security_scanner.GetScanConfigRequest], scan_config.ScanConfig]:
+ r"""Return a callable for the get scan config method over gRPC.
+
+ Gets a ScanConfig.
+
+ Returns:
+ Callable[[~.GetScanConfigRequest],
+ ~.ScanConfig]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_scan_config" not in self._stubs:
+ self._stubs["get_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/GetScanConfig",
+ request_serializer=web_security_scanner.GetScanConfigRequest.serialize,
+ response_deserializer=scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["get_scan_config"]
+
+ @property
+ def list_scan_configs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanConfigsRequest],
+ web_security_scanner.ListScanConfigsResponse,
+ ]:
+ r"""Return a callable for the list scan configs method over gRPC.
+
+ Lists ScanConfigs under a given project.
+
+ Returns:
+ Callable[[~.ListScanConfigsRequest],
+ ~.ListScanConfigsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_scan_configs" not in self._stubs:
+ self._stubs["list_scan_configs"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/ListScanConfigs",
+ request_serializer=web_security_scanner.ListScanConfigsRequest.serialize,
+ response_deserializer=web_security_scanner.ListScanConfigsResponse.deserialize,
+ )
+ return self._stubs["list_scan_configs"]
+
+ @property
+ def update_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.UpdateScanConfigRequest], gcw_scan_config.ScanConfig
+ ]:
+ r"""Return a callable for the update scan config method over gRPC.
+
+ Updates a ScanConfig. This method support partial
+ update of a ScanConfig.
+
+ Returns:
+ Callable[[~.UpdateScanConfigRequest],
+ ~.ScanConfig]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_scan_config" not in self._stubs:
+ self._stubs["update_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/UpdateScanConfig",
+ request_serializer=web_security_scanner.UpdateScanConfigRequest.serialize,
+ response_deserializer=gcw_scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["update_scan_config"]
+
+ @property
+ def start_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.StartScanRunRequest], scan_run.ScanRun]:
+ r"""Return a callable for the start scan run method over gRPC.
+
+ Start a ScanRun according to the given ScanConfig.
+
+ Returns:
+ Callable[[~.StartScanRunRequest],
+ ~.ScanRun]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "start_scan_run" not in self._stubs:
+ self._stubs["start_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/StartScanRun",
+ request_serializer=web_security_scanner.StartScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["start_scan_run"]
+
+ @property
+ def get_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.GetScanRunRequest], scan_run.ScanRun]:
+ r"""Return a callable for the get scan run method over gRPC.
+
+ Gets a ScanRun.
+
+ Returns:
+ Callable[[~.GetScanRunRequest],
+ ~.ScanRun]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_scan_run" not in self._stubs:
+ self._stubs["get_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/GetScanRun",
+ request_serializer=web_security_scanner.GetScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["get_scan_run"]
+
+ @property
+ def list_scan_runs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanRunsRequest],
+ web_security_scanner.ListScanRunsResponse,
+ ]:
+ r"""Return a callable for the list scan runs method over gRPC.
+
+ Lists ScanRuns under a given ScanConfig, in
+ descending order of ScanRun stop time.
+
+ Returns:
+ Callable[[~.ListScanRunsRequest],
+ ~.ListScanRunsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_scan_runs" not in self._stubs:
+ self._stubs["list_scan_runs"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/ListScanRuns",
+ request_serializer=web_security_scanner.ListScanRunsRequest.serialize,
+ response_deserializer=web_security_scanner.ListScanRunsResponse.deserialize,
+ )
+ return self._stubs["list_scan_runs"]
+
+ @property
+ def stop_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.StopScanRunRequest], scan_run.ScanRun]:
+ r"""Return a callable for the stop scan run method over gRPC.
+
+ Stops a ScanRun. The stopped ScanRun is returned.
+
+ Returns:
+ Callable[[~.StopScanRunRequest],
+ ~.ScanRun]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "stop_scan_run" not in self._stubs:
+ self._stubs["stop_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/StopScanRun",
+ request_serializer=web_security_scanner.StopScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["stop_scan_run"]
+
+ @property
+ def list_crawled_urls(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListCrawledUrlsRequest],
+ web_security_scanner.ListCrawledUrlsResponse,
+ ]:
+ r"""Return a callable for the list crawled urls method over gRPC.
+
+ List CrawledUrls under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListCrawledUrlsRequest],
+ ~.ListCrawledUrlsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_crawled_urls" not in self._stubs:
+ self._stubs["list_crawled_urls"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/ListCrawledUrls",
+ request_serializer=web_security_scanner.ListCrawledUrlsRequest.serialize,
+ response_deserializer=web_security_scanner.ListCrawledUrlsResponse.deserialize,
+ )
+ return self._stubs["list_crawled_urls"]
+
+ @property
+ def get_finding(
+ self,
+ ) -> Callable[[web_security_scanner.GetFindingRequest], finding.Finding]:
+ r"""Return a callable for the get finding method over gRPC.
+
+ Gets a Finding.
+
+ Returns:
+ Callable[[~.GetFindingRequest],
+ ~.Finding]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_finding" not in self._stubs:
+ self._stubs["get_finding"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/GetFinding",
+ request_serializer=web_security_scanner.GetFindingRequest.serialize,
+ response_deserializer=finding.Finding.deserialize,
+ )
+ return self._stubs["get_finding"]
+
+ @property
+ def list_findings(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingsRequest],
+ web_security_scanner.ListFindingsResponse,
+ ]:
+ r"""Return a callable for the list findings method over gRPC.
+
+ List Findings under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListFindingsRequest],
+ ~.ListFindingsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_findings" not in self._stubs:
+ self._stubs["list_findings"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/ListFindings",
+ request_serializer=web_security_scanner.ListFindingsRequest.serialize,
+ response_deserializer=web_security_scanner.ListFindingsResponse.deserialize,
+ )
+ return self._stubs["list_findings"]
+
+ @property
+ def list_finding_type_stats(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingTypeStatsRequest],
+ web_security_scanner.ListFindingTypeStatsResponse,
+ ]:
+ r"""Return a callable for the list finding type stats method over gRPC.
+
+ List all FindingTypeStats under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListFindingTypeStatsRequest],
+ ~.ListFindingTypeStatsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_finding_type_stats" not in self._stubs:
+ self._stubs["list_finding_type_stats"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/ListFindingTypeStats",
+ request_serializer=web_security_scanner.ListFindingTypeStatsRequest.serialize,
+ response_deserializer=web_security_scanner.ListFindingTypeStatsResponse.deserialize,
+ )
+ return self._stubs["list_finding_type_stats"]
+
+ def close(self):
+ self.grpc_channel.close()
+
+ @property
+ def kind(self) -> str:
+ return "grpc"
+
+
+__all__ = ("WebSecurityScannerGrpcTransport",)
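Each property in the gRPC transport above lazily creates its stub on first access and memoizes it in `self._stubs`, while the abstract base class raises `NotImplementedError` for the same properties. A toy sketch of that pattern, using a hypothetical in-memory transport rather than the real generated surface:

```python
from typing import Callable, Dict


class BaseTransport:
    """Abstract transport: each RPC is exposed as a property."""

    @property
    def get_scan_config(self) -> Callable[[str], str]:
        raise NotImplementedError()


class InMemoryTransport(BaseTransport):
    """Toy concrete transport mirroring the lazy ``_stubs`` cache
    used by the real gRPC transport: the callable is built once on
    first access and reused afterwards."""

    def __init__(self) -> None:
        self._stubs: Dict[str, Callable] = {}

    @property
    def get_scan_config(self) -> Callable[[str], str]:
        if "get_scan_config" not in self._stubs:
            # Stand-in for grpc_channel.unary_unary(...).
            self._stubs["get_scan_config"] = lambda name: f"config:{name}"
        return self._stubs["get_scan_config"]
```

The memoization matters because creating a gRPC stub is not free; callers can access the property repeatedly without paying that cost more than once per transport instance.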
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/transports/grpc_asyncio.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/transports/grpc_asyncio.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/transports/grpc_asyncio.py
@@ -0,0 +1,617 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers_async
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.protobuf import empty_pb2 # type: ignore
+import grpc # type: ignore
+from grpc.experimental import aio # type: ignore
+
+from google.cloud.websecurityscanner_v1alpha.types import scan_run, web_security_scanner
+from google.cloud.websecurityscanner_v1alpha.types import scan_config as gcw_scan_config
+from google.cloud.websecurityscanner_v1alpha.types import finding
+from google.cloud.websecurityscanner_v1alpha.types import scan_config
+
+from .base import DEFAULT_CLIENT_INFO, WebSecurityScannerTransport
+from .grpc import WebSecurityScannerGrpcTransport
+
+
+class WebSecurityScannerGrpcAsyncIOTransport(WebSecurityScannerTransport):
+ """gRPC AsyncIO backend transport for WebSecurityScanner.
+
+ Cloud Web Security Scanner Service identifies security
+ vulnerabilities in web applications hosted on Google Cloud
+ Platform. It crawls your application, and attempts to exercise
+ as many user inputs and event handlers as possible.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _grpc_channel: aio.Channel
+ _stubs: Dict[str, Callable] = {}
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "websecurityscanner.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> aio.Channel:
+ """Create and return a gRPC AsyncIO channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+ scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ aio.Channel: A gRPC AsyncIO channel object.
+ """
+
+ return grpc_helpers_async.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ def __init__(
+ self,
+ *,
+ host: str = "websecurityscanner.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[aio.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+ scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ channel (Optional[aio.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def grpc_channel(self) -> aio.Channel:
+ """Create the channel designed to connect to this service.
+
+ This property caches on the instance; repeated calls return
+ the same channel.
+ """
+ # Return the channel from cache.
+ return self._grpc_channel
+
+ @property
+ def create_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.CreateScanConfigRequest],
+ Awaitable[gcw_scan_config.ScanConfig],
+ ]:
+ r"""Return a callable for the create scan config method over gRPC.
+
+ Creates a new ScanConfig.
+
+ Returns:
+ Callable[[~.CreateScanConfigRequest],
+ Awaitable[~.ScanConfig]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_scan_config" not in self._stubs:
+ self._stubs["create_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/CreateScanConfig",
+ request_serializer=web_security_scanner.CreateScanConfigRequest.serialize,
+ response_deserializer=gcw_scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["create_scan_config"]
+
+ @property
+ def delete_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.DeleteScanConfigRequest], Awaitable[empty_pb2.Empty]
+ ]:
+ r"""Return a callable for the delete scan config method over gRPC.
+
+ Deletes an existing ScanConfig and its child
+ resources.
+
+ Returns:
+ Callable[[~.DeleteScanConfigRequest],
+ Awaitable[~.Empty]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_scan_config" not in self._stubs:
+ self._stubs["delete_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/DeleteScanConfig",
+ request_serializer=web_security_scanner.DeleteScanConfigRequest.serialize,
+ response_deserializer=empty_pb2.Empty.FromString,
+ )
+ return self._stubs["delete_scan_config"]
+
+ @property
+ def get_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.GetScanConfigRequest], Awaitable[scan_config.ScanConfig]
+ ]:
+ r"""Return a callable for the get scan config method over gRPC.
+
+ Gets a ScanConfig.
+
+ Returns:
+ Callable[[~.GetScanConfigRequest],
+ Awaitable[~.ScanConfig]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_scan_config" not in self._stubs:
+ self._stubs["get_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/GetScanConfig",
+ request_serializer=web_security_scanner.GetScanConfigRequest.serialize,
+ response_deserializer=scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["get_scan_config"]
+
+ @property
+ def list_scan_configs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanConfigsRequest],
+ Awaitable[web_security_scanner.ListScanConfigsResponse],
+ ]:
+ r"""Return a callable for the list scan configs method over gRPC.
+
+ Lists ScanConfigs under a given project.
+
+ Returns:
+ Callable[[~.ListScanConfigsRequest],
+ Awaitable[~.ListScanConfigsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_scan_configs" not in self._stubs:
+ self._stubs["list_scan_configs"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/ListScanConfigs",
+ request_serializer=web_security_scanner.ListScanConfigsRequest.serialize,
+ response_deserializer=web_security_scanner.ListScanConfigsResponse.deserialize,
+ )
+ return self._stubs["list_scan_configs"]
+
+ @property
+ def update_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.UpdateScanConfigRequest],
+ Awaitable[gcw_scan_config.ScanConfig],
+ ]:
+ r"""Return a callable for the update scan config method over gRPC.
+
+        Updates a ScanConfig. This method supports partial
+        update of a ScanConfig.
+
+ Returns:
+ Callable[[~.UpdateScanConfigRequest],
+ Awaitable[~.ScanConfig]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_scan_config" not in self._stubs:
+ self._stubs["update_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/UpdateScanConfig",
+ request_serializer=web_security_scanner.UpdateScanConfigRequest.serialize,
+ response_deserializer=gcw_scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["update_scan_config"]
+
+ @property
+ def start_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.StartScanRunRequest], Awaitable[scan_run.ScanRun]
+ ]:
+ r"""Return a callable for the start scan run method over gRPC.
+
+ Start a ScanRun according to the given ScanConfig.
+
+ Returns:
+ Callable[[~.StartScanRunRequest],
+ Awaitable[~.ScanRun]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "start_scan_run" not in self._stubs:
+ self._stubs["start_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/StartScanRun",
+ request_serializer=web_security_scanner.StartScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["start_scan_run"]
+
+ @property
+ def get_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.GetScanRunRequest], Awaitable[scan_run.ScanRun]
+ ]:
+ r"""Return a callable for the get scan run method over gRPC.
+
+ Gets a ScanRun.
+
+ Returns:
+ Callable[[~.GetScanRunRequest],
+ Awaitable[~.ScanRun]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_scan_run" not in self._stubs:
+ self._stubs["get_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/GetScanRun",
+ request_serializer=web_security_scanner.GetScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["get_scan_run"]
+
+ @property
+ def list_scan_runs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanRunsRequest],
+ Awaitable[web_security_scanner.ListScanRunsResponse],
+ ]:
+ r"""Return a callable for the list scan runs method over gRPC.
+
+ Lists ScanRuns under a given ScanConfig, in
+ descending order of ScanRun stop time.
+
+ Returns:
+ Callable[[~.ListScanRunsRequest],
+ Awaitable[~.ListScanRunsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_scan_runs" not in self._stubs:
+ self._stubs["list_scan_runs"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/ListScanRuns",
+ request_serializer=web_security_scanner.ListScanRunsRequest.serialize,
+ response_deserializer=web_security_scanner.ListScanRunsResponse.deserialize,
+ )
+ return self._stubs["list_scan_runs"]
+
+ @property
+ def stop_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.StopScanRunRequest], Awaitable[scan_run.ScanRun]
+ ]:
+ r"""Return a callable for the stop scan run method over gRPC.
+
+ Stops a ScanRun. The stopped ScanRun is returned.
+
+ Returns:
+ Callable[[~.StopScanRunRequest],
+ Awaitable[~.ScanRun]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "stop_scan_run" not in self._stubs:
+ self._stubs["stop_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/StopScanRun",
+ request_serializer=web_security_scanner.StopScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["stop_scan_run"]
+
+ @property
+ def list_crawled_urls(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListCrawledUrlsRequest],
+ Awaitable[web_security_scanner.ListCrawledUrlsResponse],
+ ]:
+ r"""Return a callable for the list crawled urls method over gRPC.
+
+ List CrawledUrls under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListCrawledUrlsRequest],
+ Awaitable[~.ListCrawledUrlsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_crawled_urls" not in self._stubs:
+ self._stubs["list_crawled_urls"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/ListCrawledUrls",
+ request_serializer=web_security_scanner.ListCrawledUrlsRequest.serialize,
+ response_deserializer=web_security_scanner.ListCrawledUrlsResponse.deserialize,
+ )
+ return self._stubs["list_crawled_urls"]
+
+ @property
+ def get_finding(
+ self,
+ ) -> Callable[[web_security_scanner.GetFindingRequest], Awaitable[finding.Finding]]:
+ r"""Return a callable for the get finding method over gRPC.
+
+ Gets a Finding.
+
+ Returns:
+ Callable[[~.GetFindingRequest],
+ Awaitable[~.Finding]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_finding" not in self._stubs:
+ self._stubs["get_finding"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/GetFinding",
+ request_serializer=web_security_scanner.GetFindingRequest.serialize,
+ response_deserializer=finding.Finding.deserialize,
+ )
+ return self._stubs["get_finding"]
+
+ @property
+ def list_findings(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingsRequest],
+ Awaitable[web_security_scanner.ListFindingsResponse],
+ ]:
+ r"""Return a callable for the list findings method over gRPC.
+
+ List Findings under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListFindingsRequest],
+ Awaitable[~.ListFindingsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_findings" not in self._stubs:
+ self._stubs["list_findings"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/ListFindings",
+ request_serializer=web_security_scanner.ListFindingsRequest.serialize,
+ response_deserializer=web_security_scanner.ListFindingsResponse.deserialize,
+ )
+ return self._stubs["list_findings"]
+
+ @property
+ def list_finding_type_stats(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingTypeStatsRequest],
+ Awaitable[web_security_scanner.ListFindingTypeStatsResponse],
+ ]:
+ r"""Return a callable for the list finding type stats method over gRPC.
+
+ List all FindingTypeStats under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListFindingTypeStatsRequest],
+ Awaitable[~.ListFindingTypeStatsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_finding_type_stats" not in self._stubs:
+ self._stubs["list_finding_type_stats"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/ListFindingTypeStats",
+ request_serializer=web_security_scanner.ListFindingTypeStatsRequest.serialize,
+ response_deserializer=web_security_scanner.ListFindingTypeStatsResponse.deserialize,
+ )
+ return self._stubs["list_finding_type_stats"]
+
+ def close(self):
+ return self.grpc_channel.close()
+
+
+__all__ = ("WebSecurityScannerGrpcAsyncIOTransport",)
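Every RPC property in the transport above follows the same lazy stub-caching pattern: the gRPC stub is created on first access and reused from `self._stubs` afterwards. A minimal standalone sketch of that pattern (using a hypothetical `_FakeChannel` stand-in for `grpc.aio.Channel`, not part of the generated client):

```python
# Sketch of the lazy stub-caching pattern used by the transport properties.
# _FakeChannel is an illustrative stand-in; the real transport uses a gRPC channel.
from typing import Any, Callable, Dict


class _FakeChannel:
    """Stand-in for a gRPC channel; counts how many stubs it creates."""

    def __init__(self) -> None:
        self.created = 0

    def unary_unary(self, method: str, **_: Any) -> Callable[..., str]:
        self.created += 1
        return lambda request: f"called {method} with {request!r}"


class TransportSketch:
    def __init__(self, channel: _FakeChannel) -> None:
        self._grpc_channel = channel
        self._stubs: Dict[str, Callable] = {}

    @property
    def get_scan_config(self) -> Callable[..., str]:
        # Create the stub on first access; later accesses hit the cache.
        if "get_scan_config" not in self._stubs:
            self._stubs["get_scan_config"] = self._grpc_channel.unary_unary(
                "/google.cloud.websecurityscanner.v1alpha.WebSecurityScanner/GetScanConfig",
            )
        return self._stubs["get_scan_config"]


channel = _FakeChannel()
transport = TransportSketch(channel)
first = transport.get_scan_config
second = transport.get_scan_config
print(channel.created)   # the stub is built exactly once
print(first is second)   # repeated accesses return the cached callable
```

The same idea is why `_prep_wrapped_messages` runs only after `self._grpc_channel` exists: every stub is derived from the channel on demand rather than eagerly at construction time.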
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/transports/rest.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/transports/rest.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/services/web_security_scanner/transports/rest.py
@@ -0,0 +1,1866 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import dataclasses
+import json # type: ignore
+import re
+from typing import Any, Callable, Dict, List, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, path_template, rest_helpers, rest_streaming
+from google.api_core import exceptions as core_exceptions
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.auth.transport.requests import AuthorizedSession # type: ignore
+from google.protobuf import json_format
+import grpc # type: ignore
+from requests import __version__ as requests_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+
+from google.protobuf import empty_pb2 # type: ignore
+
+from google.cloud.websecurityscanner_v1alpha.types import scan_run, web_security_scanner
+from google.cloud.websecurityscanner_v1alpha.types import scan_config as gcw_scan_config
+from google.cloud.websecurityscanner_v1alpha.types import finding
+from google.cloud.websecurityscanner_v1alpha.types import scan_config
+
+from .base import DEFAULT_CLIENT_INFO as BASE_DEFAULT_CLIENT_INFO
+from .base import WebSecurityScannerTransport
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=BASE_DEFAULT_CLIENT_INFO.gapic_version,
+ grpc_version=None,
+ rest_version=requests_version,
+)
+
+
+class WebSecurityScannerRestInterceptor:
+ """Interceptor for WebSecurityScanner.
+
+ Interceptors are used to manipulate requests, request metadata, and responses
+ in arbitrary ways.
+ Example use cases include:
+ * Logging
+ * Verifying requests according to service or custom semantics
+ * Stripping extraneous information from responses
+
+ These use cases and more can be enabled by injecting an
+ instance of a custom subclass when constructing the WebSecurityScannerRestTransport.
+
+ .. code-block:: python
+ class MyCustomWebSecurityScannerInterceptor(WebSecurityScannerRestInterceptor):
+ def pre_create_scan_config(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_create_scan_config(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_delete_scan_config(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def pre_get_finding(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_finding(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_scan_config(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_scan_config(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_scan_run(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_scan_run(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_crawled_urls(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_crawled_urls(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_findings(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_findings(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_finding_type_stats(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_finding_type_stats(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_scan_configs(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_scan_configs(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_scan_runs(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_scan_runs(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_start_scan_run(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_start_scan_run(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_stop_scan_run(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_stop_scan_run(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_update_scan_config(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_update_scan_config(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ transport = WebSecurityScannerRestTransport(interceptor=MyCustomWebSecurityScannerInterceptor())
+ client = WebSecurityScannerClient(transport=transport)
+
+
+ """
+
+ def pre_create_scan_config(
+ self,
+ request: web_security_scanner.CreateScanConfigRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.CreateScanConfigRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for create_scan_config
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_create_scan_config(
+ self, response: gcw_scan_config.ScanConfig
+ ) -> gcw_scan_config.ScanConfig:
+ """Post-rpc interceptor for create_scan_config
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_delete_scan_config(
+ self,
+ request: web_security_scanner.DeleteScanConfigRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.DeleteScanConfigRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for delete_scan_config
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def pre_get_finding(
+ self,
+ request: web_security_scanner.GetFindingRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.GetFindingRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_finding
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_get_finding(self, response: finding.Finding) -> finding.Finding:
+ """Post-rpc interceptor for get_finding
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_scan_config(
+ self,
+ request: web_security_scanner.GetScanConfigRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.GetScanConfigRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_scan_config
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_get_scan_config(
+ self, response: scan_config.ScanConfig
+ ) -> scan_config.ScanConfig:
+ """Post-rpc interceptor for get_scan_config
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_scan_run(
+ self,
+ request: web_security_scanner.GetScanRunRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.GetScanRunRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_scan_run
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_get_scan_run(self, response: scan_run.ScanRun) -> scan_run.ScanRun:
+ """Post-rpc interceptor for get_scan_run
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_crawled_urls(
+ self,
+ request: web_security_scanner.ListCrawledUrlsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.ListCrawledUrlsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_crawled_urls
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_list_crawled_urls(
+ self, response: web_security_scanner.ListCrawledUrlsResponse
+ ) -> web_security_scanner.ListCrawledUrlsResponse:
+ """Post-rpc interceptor for list_crawled_urls
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_findings(
+ self,
+ request: web_security_scanner.ListFindingsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.ListFindingsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_findings
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_list_findings(
+ self, response: web_security_scanner.ListFindingsResponse
+ ) -> web_security_scanner.ListFindingsResponse:
+ """Post-rpc interceptor for list_findings
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_finding_type_stats(
+ self,
+ request: web_security_scanner.ListFindingTypeStatsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[
+ web_security_scanner.ListFindingTypeStatsRequest, Sequence[Tuple[str, str]]
+ ]:
+ """Pre-rpc interceptor for list_finding_type_stats
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_list_finding_type_stats(
+ self, response: web_security_scanner.ListFindingTypeStatsResponse
+ ) -> web_security_scanner.ListFindingTypeStatsResponse:
+ """Post-rpc interceptor for list_finding_type_stats
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_scan_configs(
+ self,
+ request: web_security_scanner.ListScanConfigsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.ListScanConfigsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_scan_configs
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_list_scan_configs(
+ self, response: web_security_scanner.ListScanConfigsResponse
+ ) -> web_security_scanner.ListScanConfigsResponse:
+ """Post-rpc interceptor for list_scan_configs
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_scan_runs(
+ self,
+ request: web_security_scanner.ListScanRunsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.ListScanRunsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_scan_runs
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_list_scan_runs(
+ self, response: web_security_scanner.ListScanRunsResponse
+ ) -> web_security_scanner.ListScanRunsResponse:
+ """Post-rpc interceptor for list_scan_runs
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_start_scan_run(
+ self,
+ request: web_security_scanner.StartScanRunRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.StartScanRunRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for start_scan_run
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_start_scan_run(self, response: scan_run.ScanRun) -> scan_run.ScanRun:
+ """Post-rpc interceptor for start_scan_run
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_stop_scan_run(
+ self,
+ request: web_security_scanner.StopScanRunRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.StopScanRunRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for stop_scan_run
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_stop_scan_run(self, response: scan_run.ScanRun) -> scan_run.ScanRun:
+ """Post-rpc interceptor for stop_scan_run
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_update_scan_config(
+ self,
+ request: web_security_scanner.UpdateScanConfigRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.UpdateScanConfigRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for update_scan_config
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_update_scan_config(
+ self, response: gcw_scan_config.ScanConfig
+ ) -> gcw_scan_config.ScanConfig:
+ """Post-rpc interceptor for update_scan_config
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+
+@dataclasses.dataclass
+class WebSecurityScannerRestStub:
+ _session: AuthorizedSession
+ _host: str
+ _interceptor: WebSecurityScannerRestInterceptor
+
+
+class WebSecurityScannerRestTransport(WebSecurityScannerTransport):
+ """REST backend transport for WebSecurityScanner.
+
+    Cloud Web Security Scanner Service identifies security
+    vulnerabilities in web applications hosted on Google Cloud
+    Platform. It crawls your application and attempts to exercise
+    as many user inputs and event handlers as possible.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends JSON representations of protocol buffers over HTTP/1.1
+
+ """
+
+ def __init__(
+ self,
+ *,
+ host: str = "websecurityscanner.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ url_scheme: str = "https",
+ interceptor: Optional[WebSecurityScannerRestInterceptor] = None,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): A list of scopes. This argument is
+                ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Callable[[], Tuple[bytes, bytes]]): Client
+ certificate to configure mutual TLS HTTP channel. It is ignored
+ if ``channel`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you are developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ url_scheme: the protocol scheme for the API endpoint. Normally
+ "https", but for testing or local servers,
+ "http" can be specified.
+ """
+ # Run the base constructor
+ # TODO(yon-mg): resolve other ctor params i.e. scopes, quota, etc.
+ # TODO: When custom host (api_endpoint) is set, `scopes` must *also* be set on the
+ # credentials object
+ maybe_url_match = re.match("^(?P<scheme>http(?:s)?://)?(?P<host>.*)$", host)
+ if maybe_url_match is None:
+ raise ValueError(
+ f"Unexpected hostname structure: {host}"
+ ) # pragma: NO COVER
+
+ url_match_items = maybe_url_match.groupdict()
+
+ host = f"{url_scheme}://{host}" if not url_match_items["scheme"] else host
+
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+ self._session = AuthorizedSession(
+ self._credentials, default_host=self.DEFAULT_HOST
+ )
+ if client_cert_source_for_mtls:
+ self._session.configure_mtls_channel(client_cert_source_for_mtls)
+ self._interceptor = interceptor or WebSecurityScannerRestInterceptor()
+ self._prep_wrapped_messages(client_info)
+
+ class _CreateScanConfig(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("CreateScanConfig")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.CreateScanConfigRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> gcw_scan_config.ScanConfig:
+ r"""Call the create scan config method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.CreateScanConfigRequest):
+ The request object. Request for the ``CreateScanConfig`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.gcw_scan_config.ScanConfig:
+ A ScanConfig resource contains the
+                    configurations to launch a scan.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1alpha/{parent=projects/*}/scanConfigs",
+ "body": "scan_config",
+ },
+ ]
+ request, metadata = self._interceptor.pre_create_scan_config(
+ request, metadata
+ )
+ pb_request = web_security_scanner.CreateScanConfigRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = gcw_scan_config.ScanConfig()
+ pb_resp = gcw_scan_config.ScanConfig.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_create_scan_config(resp)
+ return resp
+
+ class _DeleteScanConfig(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("DeleteScanConfig")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.DeleteScanConfigRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ):
+ r"""Call the delete scan config method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.DeleteScanConfigRequest):
+ The request object. Request for the ``DeleteScanConfig`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "delete",
+ "uri": "/v1alpha/{name=projects/*/scanConfigs/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_delete_scan_config(
+ request, metadata
+ )
+ pb_request = web_security_scanner.DeleteScanConfigRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ class _GetFinding(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("GetFinding")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.GetFindingRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> finding.Finding:
+ r"""Call the get finding method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.GetFindingRequest):
+ The request object. Request for the ``GetFinding`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.finding.Finding:
+ A Finding resource represents a
+ vulnerability instance identified during
+ a ScanRun.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1alpha/{name=projects/*/scanConfigs/*/scanRuns/*/findings/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_finding(request, metadata)
+ pb_request = web_security_scanner.GetFindingRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = finding.Finding()
+ pb_resp = finding.Finding.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_finding(resp)
+ return resp
+
+ class _GetScanConfig(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("GetScanConfig")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.GetScanConfigRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_config.ScanConfig:
+ r"""Call the get scan config method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.GetScanConfigRequest):
+ The request object. Request for the ``GetScanConfig`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.scan_config.ScanConfig:
+ A ScanConfig resource contains the
+                    configurations to launch a scan.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1alpha/{name=projects/*/scanConfigs/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_scan_config(request, metadata)
+ pb_request = web_security_scanner.GetScanConfigRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = scan_config.ScanConfig()
+ pb_resp = scan_config.ScanConfig.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_scan_config(resp)
+ return resp
+
+ class _GetScanRun(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("GetScanRun")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.GetScanRunRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Call the get scan run method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.GetScanRunRequest):
+ The request object. Request for the ``GetScanRun`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.scan_run.ScanRun:
+                    A ScanRun is an output-only resource
+ representing an actual run of the scan.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1alpha/{name=projects/*/scanConfigs/*/scanRuns/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_scan_run(request, metadata)
+ pb_request = web_security_scanner.GetScanRunRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = scan_run.ScanRun()
+ pb_resp = scan_run.ScanRun.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_scan_run(resp)
+ return resp
+
+ class _ListCrawledUrls(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("ListCrawledUrls")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.ListCrawledUrlsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListCrawledUrlsResponse:
+ r"""Call the list crawled urls method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.ListCrawledUrlsRequest):
+ The request object. Request for the ``ListCrawledUrls`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.web_security_scanner.ListCrawledUrlsResponse:
+ Response for the ``ListCrawledUrls`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1alpha/{parent=projects/*/scanConfigs/*/scanRuns/*}/crawledUrls",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_crawled_urls(
+ request, metadata
+ )
+ pb_request = web_security_scanner.ListCrawledUrlsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = web_security_scanner.ListCrawledUrlsResponse()
+ pb_resp = web_security_scanner.ListCrawledUrlsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_crawled_urls(resp)
+ return resp
+
+ class _ListFindings(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("ListFindings")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "filter": "",
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.ListFindingsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListFindingsResponse:
+ r"""Call the list findings method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.ListFindingsRequest):
+ The request object. Request for the ``ListFindings`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.web_security_scanner.ListFindingsResponse:
+ Response for the ``ListFindings`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1alpha/{parent=projects/*/scanConfigs/*/scanRuns/*}/findings",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_findings(request, metadata)
+ pb_request = web_security_scanner.ListFindingsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = web_security_scanner.ListFindingsResponse()
+ pb_resp = web_security_scanner.ListFindingsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_findings(resp)
+ return resp
+
+ class _ListFindingTypeStats(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("ListFindingTypeStats")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.ListFindingTypeStatsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListFindingTypeStatsResponse:
+ r"""Call the list finding type stats method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.ListFindingTypeStatsRequest):
+ The request object. Request for the ``ListFindingTypeStats`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.web_security_scanner.ListFindingTypeStatsResponse:
+ Response for the ``ListFindingTypeStats`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1alpha/{parent=projects/*/scanConfigs/*/scanRuns/*}/findingTypeStats",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_finding_type_stats(
+ request, metadata
+ )
+ pb_request = web_security_scanner.ListFindingTypeStatsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = web_security_scanner.ListFindingTypeStatsResponse()
+ pb_resp = web_security_scanner.ListFindingTypeStatsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_finding_type_stats(resp)
+ return resp
+
+ class _ListScanConfigs(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("ListScanConfigs")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.ListScanConfigsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListScanConfigsResponse:
+ r"""Call the list scan configs method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.ListScanConfigsRequest):
+ The request object. Request for the ``ListScanConfigs`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.web_security_scanner.ListScanConfigsResponse:
+ Response for the ``ListScanConfigs`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1alpha/{parent=projects/*}/scanConfigs",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_scan_configs(
+ request, metadata
+ )
+ pb_request = web_security_scanner.ListScanConfigsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = web_security_scanner.ListScanConfigsResponse()
+ pb_resp = web_security_scanner.ListScanConfigsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_scan_configs(resp)
+ return resp
+
+ class _ListScanRuns(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("ListScanRuns")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.ListScanRunsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListScanRunsResponse:
+ r"""Call the list scan runs method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.ListScanRunsRequest):
+ The request object. Request for the ``ListScanRuns`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.web_security_scanner.ListScanRunsResponse:
+ Response for the ``ListScanRuns`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1alpha/{parent=projects/*/scanConfigs/*}/scanRuns",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_scan_runs(request, metadata)
+ pb_request = web_security_scanner.ListScanRunsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = web_security_scanner.ListScanRunsResponse()
+ pb_resp = web_security_scanner.ListScanRunsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_scan_runs(resp)
+ return resp
+
+ class _StartScanRun(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("StartScanRun")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.StartScanRunRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Call the start scan run method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.StartScanRunRequest):
+ The request object. Request for the ``StartScanRun`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.scan_run.ScanRun:
+                    A ScanRun is an output-only resource
+ representing an actual run of the scan.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1alpha/{name=projects/*/scanConfigs/*}:start",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_start_scan_run(request, metadata)
+ pb_request = web_security_scanner.StartScanRunRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = scan_run.ScanRun()
+ pb_resp = scan_run.ScanRun.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_start_scan_run(resp)
+ return resp
+
+ class _StopScanRun(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("StopScanRun")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.StopScanRunRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Call the stop scan run method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.StopScanRunRequest):
+ The request object. Request for the ``StopScanRun`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.scan_run.ScanRun:
+ A ScanRun is a output-only resource
+ representing an actual run of the scan.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1alpha/{name=projects/*/scanConfigs/*/scanRuns/*}:stop",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_stop_scan_run(request, metadata)
+ pb_request = web_security_scanner.StopScanRunRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = scan_run.ScanRun()
+ pb_resp = scan_run.ScanRun.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_stop_scan_run(resp)
+ return resp
+
+ class _UpdateScanConfig(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("UpdateScanConfig")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "updateMask": {},
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.UpdateScanConfigRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> gcw_scan_config.ScanConfig:
+ r"""Call the update scan config method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.UpdateScanConfigRequest):
+ The request object. Request for the ``UpdateScanConfigRequest`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.gcw_scan_config.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan. next
+ id: 12
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "patch",
+ "uri": "/v1alpha/{scan_config.name=projects/*/scanConfigs/*}",
+ "body": "scan_config",
+ },
+ ]
+ request, metadata = self._interceptor.pre_update_scan_config(
+ request, metadata
+ )
+ pb_request = web_security_scanner.UpdateScanConfigRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = gcw_scan_config.ScanConfig()
+ pb_resp = gcw_scan_config.ScanConfig.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_update_scan_config(resp)
+ return resp
+
+ @property
+ def create_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.CreateScanConfigRequest], gcw_scan_config.ScanConfig
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._CreateScanConfig(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def delete_scan_config(
+ self,
+ ) -> Callable[[web_security_scanner.DeleteScanConfigRequest], empty_pb2.Empty]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._DeleteScanConfig(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_finding(
+ self,
+ ) -> Callable[[web_security_scanner.GetFindingRequest], finding.Finding]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetFinding(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_scan_config(
+ self,
+ ) -> Callable[[web_security_scanner.GetScanConfigRequest], scan_config.ScanConfig]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetScanConfig(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.GetScanRunRequest], scan_run.ScanRun]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetScanRun(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_crawled_urls(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListCrawledUrlsRequest],
+ web_security_scanner.ListCrawledUrlsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListCrawledUrls(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_findings(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingsRequest],
+ web_security_scanner.ListFindingsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListFindings(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_finding_type_stats(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingTypeStatsRequest],
+ web_security_scanner.ListFindingTypeStatsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListFindingTypeStats(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_scan_configs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanConfigsRequest],
+ web_security_scanner.ListScanConfigsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListScanConfigs(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_scan_runs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanRunsRequest],
+ web_security_scanner.ListScanRunsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListScanRuns(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def start_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.StartScanRunRequest], scan_run.ScanRun]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._StartScanRun(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def stop_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.StopScanRunRequest], scan_run.ScanRun]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._StopScanRun(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def update_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.UpdateScanConfigRequest], gcw_scan_config.ScanConfig
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._UpdateScanConfig(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def kind(self) -> str:
+ return "rest"
+
+ def close(self):
+ self._session.close()
+
+
+__all__ = ("WebSecurityScannerRestTransport",)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/__init__.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/__init__.py
@@ -0,0 +1,78 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .crawled_url import CrawledUrl
+from .finding import Finding
+from .finding_addon import (
+ OutdatedLibrary,
+ ViolatingResource,
+ VulnerableHeaders,
+ VulnerableParameters,
+ Xss,
+)
+from .finding_type_stats import FindingTypeStats
+from .scan_config import ScanConfig
+from .scan_run import ScanRun
+from .web_security_scanner import (
+ CreateScanConfigRequest,
+ DeleteScanConfigRequest,
+ GetFindingRequest,
+ GetScanConfigRequest,
+ GetScanRunRequest,
+ ListCrawledUrlsRequest,
+ ListCrawledUrlsResponse,
+ ListFindingsRequest,
+ ListFindingsResponse,
+ ListFindingTypeStatsRequest,
+ ListFindingTypeStatsResponse,
+ ListScanConfigsRequest,
+ ListScanConfigsResponse,
+ ListScanRunsRequest,
+ ListScanRunsResponse,
+ StartScanRunRequest,
+ StopScanRunRequest,
+ UpdateScanConfigRequest,
+)
+
+__all__ = (
+ "CrawledUrl",
+ "Finding",
+ "OutdatedLibrary",
+ "ViolatingResource",
+ "VulnerableHeaders",
+ "VulnerableParameters",
+ "Xss",
+ "FindingTypeStats",
+ "ScanConfig",
+ "ScanRun",
+ "CreateScanConfigRequest",
+ "DeleteScanConfigRequest",
+ "GetFindingRequest",
+ "GetScanConfigRequest",
+ "GetScanRunRequest",
+ "ListCrawledUrlsRequest",
+ "ListCrawledUrlsResponse",
+ "ListFindingsRequest",
+ "ListFindingsResponse",
+ "ListFindingTypeStatsRequest",
+ "ListFindingTypeStatsResponse",
+ "ListScanConfigsRequest",
+ "ListScanConfigsResponse",
+ "ListScanRunsRequest",
+ "ListScanRunsResponse",
+ "StartScanRunRequest",
+ "StopScanRunRequest",
+ "UpdateScanConfigRequest",
+)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/crawled_url.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/crawled_url.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/crawled_url.py
@@ -0,0 +1,61 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1alpha",
+ manifest={
+ "CrawledUrl",
+ },
+)
+
+
+class CrawledUrl(proto.Message):
+ r"""A CrawledUrl resource represents a URL that was crawled
+ during a ScanRun. Web Security Scanner Service crawls the web
+ applications, following all links within the scope of sites, to
+ find the URLs to test against.
+
+ Attributes:
+ http_method (str):
+ Output only. The http method of the request
+ that was used to visit the URL, in uppercase.
+ url (str):
+ Output only. The URL that was crawled.
+ body (str):
+ Output only. The body of the request that was
+ used to visit the URL.
+ """
+
+ http_method: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ url: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ body: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/finding.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/finding.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/finding.py
@@ -0,0 +1,241 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+from google.cloud.websecurityscanner_v1alpha.types import finding_addon
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1alpha",
+ manifest={
+ "Finding",
+ },
+)
+
+
+class Finding(proto.Message):
+ r"""A Finding resource represents a vulnerability instance
+ identified during a ScanRun.
+
+ Attributes:
+ name (str):
+ The resource name of the Finding. The name
+ follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanruns/{scanRunId}/findings/{findingId}'.
+ The finding IDs are generated by the system.
+ finding_type (google.cloud.websecurityscanner_v1alpha.types.Finding.FindingType):
+ The type of the Finding.
+ http_method (str):
+ The http method of the request that triggered
+ the vulnerability, in uppercase.
+ fuzzed_url (str):
+ The URL produced by the server-side fuzzer
+ and used in the request that triggered the
+ vulnerability.
+ body (str):
+ The body of the request that triggered the
+ vulnerability.
+ description (str):
+ The description of the vulnerability.
+ reproduction_url (str):
+ The URL containing human-readable payload
+ that user can leverage to reproduce the
+ vulnerability.
+ frame_url (str):
+ If the vulnerability was originated from
+ nested IFrame, the immediate parent IFrame is
+ reported.
+ final_url (str):
+ The URL where the browser lands when the
+ vulnerability is detected.
+ tracking_id (str):
+ The tracking ID uniquely identifies a
+ vulnerability instance across multiple ScanRuns.
+ outdated_library (google.cloud.websecurityscanner_v1alpha.types.OutdatedLibrary):
+ An addon containing information about
+ outdated libraries.
+ violating_resource (google.cloud.websecurityscanner_v1alpha.types.ViolatingResource):
+ An addon containing detailed information
+ regarding any resource causing the vulnerability
+ such as JavaScript sources, image, audio files,
+ etc.
+ vulnerable_headers (google.cloud.websecurityscanner_v1alpha.types.VulnerableHeaders):
+ An addon containing information about
+ vulnerable or missing HTTP headers.
+ vulnerable_parameters (google.cloud.websecurityscanner_v1alpha.types.VulnerableParameters):
+ An addon containing information about request
+ parameters which were found to be vulnerable.
+ xss (google.cloud.websecurityscanner_v1alpha.types.Xss):
+ An addon containing information reported for
+ an XSS, if any.
+ """
+
+ class FindingType(proto.Enum):
+ r"""Types of Findings.
+
+ Values:
+ FINDING_TYPE_UNSPECIFIED (0):
+ The invalid finding type.
+ MIXED_CONTENT (1):
+ A page that was served over HTTPS also
+ resources over HTTP. A man-in-the-middle
+ attacker could tamper with the HTTP resource and
+ gain full access to the website that loads the
+ resource or to monitor the actions taken by the
+ user.
+ OUTDATED_LIBRARY (2):
+ The version of an included library is known
+ to contain a security issue. The scanner checks
+ the version of library in use against a known
+ list of vulnerable libraries. False positives
+ are possible if the version detection fails or
+ if the library has been manually patched.
+ ROSETTA_FLASH (5):
+ This type of vulnerability occurs when the
+ value of a request parameter is reflected at the
+ beginning of the response, for example, in
+ requests using JSONP. Under certain
+ circumstances, an attacker may be able to supply
+ an alphanumeric-only Flash file in the
+ vulnerable parameter causing the browser to
+ execute the Flash file as if it originated on
+ the vulnerable server.
+ XSS_CALLBACK (3):
+ A cross-site scripting (XSS) bug is found via
+ JavaScript callback. For detailed explanations
+ on XSS, see
+ https://www.google.com/about/appsecurity/learning/xss/.
+ XSS_ERROR (4):
+ A potential cross-site scripting (XSS) bug
+ due to JavaScript breakage. In some
+ circumstances, the application under test might
+ modify the test string before it is parsed by
+                the browser. When the browser attempts to run
+                this modified test string, it will likely break
+ and throw a JavaScript execution error, thus an
+ injection issue is occurring. However, it may
+ not be exploitable. Manual verification is
+ needed to see if the test string modifications
+ can be evaded and confirm that the issue is in
+ fact an XSS vulnerability. For detailed
+ explanations on XSS, see
+ https://www.google.com/about/appsecurity/learning/xss/.
+ CLEAR_TEXT_PASSWORD (6):
+ An application appears to be transmitting a
+ password field in clear text. An attacker can
+ eavesdrop network traffic and sniff the password
+ field.
+ INVALID_CONTENT_TYPE (7):
+ An application returns sensitive content with
+ an invalid content type, or without an
+ 'X-Content-Type-Options: nosniff' header.
+ XSS_ANGULAR_CALLBACK (8):
+ A cross-site scripting (XSS) vulnerability in
+ AngularJS module that occurs when a
+ user-provided string is interpolated by Angular.
+ INVALID_HEADER (9):
+ A malformed or invalid valued header.
+ MISSPELLED_SECURITY_HEADER_NAME (10):
+ Misspelled security header name.
+ MISMATCHING_SECURITY_HEADER_VALUES (11):
+ Mismatching values in a duplicate security
+ header.
+ """
+ FINDING_TYPE_UNSPECIFIED = 0
+ MIXED_CONTENT = 1
+ OUTDATED_LIBRARY = 2
+ ROSETTA_FLASH = 5
+ XSS_CALLBACK = 3
+ XSS_ERROR = 4
+ CLEAR_TEXT_PASSWORD = 6
+ INVALID_CONTENT_TYPE = 7
+ XSS_ANGULAR_CALLBACK = 8
+ INVALID_HEADER = 9
+ MISSPELLED_SECURITY_HEADER_NAME = 10
+ MISMATCHING_SECURITY_HEADER_VALUES = 11
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ finding_type: FindingType = proto.Field(
+ proto.ENUM,
+ number=2,
+ enum=FindingType,
+ )
+ http_method: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ fuzzed_url: str = proto.Field(
+ proto.STRING,
+ number=4,
+ )
+ body: str = proto.Field(
+ proto.STRING,
+ number=5,
+ )
+ description: str = proto.Field(
+ proto.STRING,
+ number=6,
+ )
+ reproduction_url: str = proto.Field(
+ proto.STRING,
+ number=7,
+ )
+ frame_url: str = proto.Field(
+ proto.STRING,
+ number=8,
+ )
+ final_url: str = proto.Field(
+ proto.STRING,
+ number=9,
+ )
+ tracking_id: str = proto.Field(
+ proto.STRING,
+ number=10,
+ )
+ outdated_library: finding_addon.OutdatedLibrary = proto.Field(
+ proto.MESSAGE,
+ number=11,
+ message=finding_addon.OutdatedLibrary,
+ )
+ violating_resource: finding_addon.ViolatingResource = proto.Field(
+ proto.MESSAGE,
+ number=12,
+ message=finding_addon.ViolatingResource,
+ )
+ vulnerable_headers: finding_addon.VulnerableHeaders = proto.Field(
+ proto.MESSAGE,
+ number=15,
+ message=finding_addon.VulnerableHeaders,
+ )
+ vulnerable_parameters: finding_addon.VulnerableParameters = proto.Field(
+ proto.MESSAGE,
+ number=13,
+ message=finding_addon.VulnerableParameters,
+ )
+ xss: finding_addon.Xss = proto.Field(
+ proto.MESSAGE,
+ number=14,
+ message=finding_addon.Xss,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/finding_addon.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/finding_addon.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/finding_addon.py
@@ -0,0 +1,159 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1alpha",
+ manifest={
+ "OutdatedLibrary",
+ "ViolatingResource",
+ "VulnerableParameters",
+ "VulnerableHeaders",
+ "Xss",
+ },
+)
+
+
+class OutdatedLibrary(proto.Message):
+ r"""Information reported for an outdated library.
+
+ Attributes:
+ library_name (str):
+ The name of the outdated library.
+ version (str):
+ The version number.
+ learn_more_urls (MutableSequence[str]):
+ URLs to learn more information about the
+ vulnerabilities in the library.
+ """
+
+ library_name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ version: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ learn_more_urls: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=3,
+ )
+
+
+class ViolatingResource(proto.Message):
+ r"""Information regarding any resource causing the vulnerability
+ such as JavaScript sources, image, audio files, etc.
+
+ Attributes:
+ content_type (str):
+ The MIME type of this resource.
+ resource_url (str):
+ URL of this violating resource.
+ """
+
+ content_type: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ resource_url: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class VulnerableParameters(proto.Message):
+ r"""Information about vulnerable request parameters.
+
+ Attributes:
+ parameter_names (MutableSequence[str]):
+ The vulnerable parameter names.
+ """
+
+ parameter_names: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=1,
+ )
+
+
+class VulnerableHeaders(proto.Message):
+ r"""Information about vulnerable or missing HTTP Headers.
+
+ Attributes:
+ headers (MutableSequence[google.cloud.websecurityscanner_v1alpha.types.VulnerableHeaders.Header]):
+ List of vulnerable headers.
+ missing_headers (MutableSequence[google.cloud.websecurityscanner_v1alpha.types.VulnerableHeaders.Header]):
+ List of missing headers.
+ """
+
+ class Header(proto.Message):
+        r"""Describes an HTTP header.
+
+ Attributes:
+ name (str):
+ Header name.
+ value (str):
+ Header value.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ value: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+ headers: MutableSequence[Header] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=Header,
+ )
+ missing_headers: MutableSequence[Header] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=2,
+ message=Header,
+ )
+
+
+class Xss(proto.Message):
+ r"""Information reported for an XSS.
+
+ Attributes:
+ stack_traces (MutableSequence[str]):
+ Stack traces leading to the point where the
+ XSS occurred.
+ error_message (str):
+            An error message generated by a JavaScript
+            breakage.
+ """
+
+ stack_traces: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=1,
+ )
+ error_message: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/finding_type_stats.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/finding_type_stats.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/finding_type_stats.py
@@ -0,0 +1,55 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+from google.cloud.websecurityscanner_v1alpha.types import finding
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1alpha",
+ manifest={
+ "FindingTypeStats",
+ },
+)
+
+
+class FindingTypeStats(proto.Message):
+ r"""A FindingTypeStats resource represents stats regarding a
+ specific FindingType of Findings under a given ScanRun.
+
+ Attributes:
+ finding_type (google.cloud.websecurityscanner_v1alpha.types.Finding.FindingType):
+ The finding type associated with the stats.
+ finding_count (int):
+ The count of findings belonging to this
+ finding type.
+ """
+
+ finding_type: finding.Finding.FindingType = proto.Field(
+ proto.ENUM,
+ number=1,
+ enum=finding.Finding.FindingType,
+ )
+ finding_count: int = proto.Field(
+ proto.INT32,
+ number=2,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/scan_config.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/scan_config.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/scan_config.py
@@ -0,0 +1,268 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import timestamp_pb2 # type: ignore
+import proto # type: ignore
+
+from google.cloud.websecurityscanner_v1alpha.types import scan_run
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1alpha",
+ manifest={
+ "ScanConfig",
+ },
+)
+
+
+class ScanConfig(proto.Message):
+    r"""A ScanConfig resource contains the configurations to launch a
+    scan.
+
+ Attributes:
+ name (str):
+ The resource name of the ScanConfig. The name
+ follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+ The ScanConfig IDs are generated by the system.
+ display_name (str):
+ Required. The user provided display name of
+ the ScanConfig.
+ max_qps (int):
+ The maximum QPS during scanning. A valid value ranges from 5
+ to 20 inclusively. If the field is unspecified or its value
+            is set to 0, the server will default to 15. Other values outside of
+ [5, 20] range will be rejected with INVALID_ARGUMENT error.
+ starting_urls (MutableSequence[str]):
+ Required. The starting URLs from which the
+ scanner finds site pages.
+ authentication (google.cloud.websecurityscanner_v1alpha.types.ScanConfig.Authentication):
+ The authentication configuration. If
+ specified, service will use the authentication
+ configuration during scanning.
+ user_agent (google.cloud.websecurityscanner_v1alpha.types.ScanConfig.UserAgent):
+ The user agent used during scanning.
+ blacklist_patterns (MutableSequence[str]):
+ The blacklist URL patterns as described in
+ https://cloud.google.com/security-scanner/docs/excluded-urls
+ schedule (google.cloud.websecurityscanner_v1alpha.types.ScanConfig.Schedule):
+ The schedule of the ScanConfig.
+ target_platforms (MutableSequence[google.cloud.websecurityscanner_v1alpha.types.ScanConfig.TargetPlatform]):
+ Set of Cloud Platforms targeted by the scan. If empty,
+ APP_ENGINE will be used as a default.
+ latest_run (google.cloud.websecurityscanner_v1alpha.types.ScanRun):
+ Latest ScanRun if available.
+ """
+
+ class UserAgent(proto.Enum):
+ r"""Type of user agents used for scanning.
+
+ Values:
+ USER_AGENT_UNSPECIFIED (0):
+ The user agent is unknown. Service will default to
+ CHROME_LINUX.
+ CHROME_LINUX (1):
+ Chrome on Linux. This is the service default
+ if unspecified.
+ CHROME_ANDROID (2):
+ Chrome on Android.
+ SAFARI_IPHONE (3):
+                Safari on iPhone.
+ """
+ USER_AGENT_UNSPECIFIED = 0
+ CHROME_LINUX = 1
+ CHROME_ANDROID = 2
+ SAFARI_IPHONE = 3
+
+ class TargetPlatform(proto.Enum):
+ r"""Cloud platforms supported by Cloud Web Security Scanner.
+
+ Values:
+ TARGET_PLATFORM_UNSPECIFIED (0):
+ The target platform is unknown. Requests with this enum
+ value will be rejected with INVALID_ARGUMENT error.
+ APP_ENGINE (1):
+ Google App Engine service.
+ COMPUTE (2):
+ Google Compute Engine service.
+ """
+ TARGET_PLATFORM_UNSPECIFIED = 0
+ APP_ENGINE = 1
+ COMPUTE = 2
+
+ class Authentication(proto.Message):
+ r"""Scan authentication configuration.
+
+ This message has `oneof`_ fields (mutually exclusive fields).
+ For each oneof, at most one member field can be set at the same time.
+ Setting any member of the oneof automatically clears all other
+ members.
+
+ .. _oneof: https://proto-plus-python.readthedocs.io/en/stable/fields.html#oneofs-mutually-exclusive-fields
+
+ Attributes:
+ google_account (google.cloud.websecurityscanner_v1alpha.types.ScanConfig.Authentication.GoogleAccount):
+ Authentication using a Google account.
+
+ This field is a member of `oneof`_ ``authentication``.
+ custom_account (google.cloud.websecurityscanner_v1alpha.types.ScanConfig.Authentication.CustomAccount):
+ Authentication using a custom account.
+
+ This field is a member of `oneof`_ ``authentication``.
+ """
+
+ class GoogleAccount(proto.Message):
+ r"""Describes authentication configuration that uses a Google
+ account.
+
+ Attributes:
+ username (str):
+ Required. The user name of the Google
+ account.
+ password (str):
+ Required. Input only. The password of the
+ Google account. The credential is stored
+ encrypted and not returned in any response nor
+ included in audit logs.
+ """
+
+ username: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ password: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+ class CustomAccount(proto.Message):
+ r"""Describes authentication configuration that uses a custom
+ account.
+
+ Attributes:
+ username (str):
+ Required. The user name of the custom
+ account.
+ password (str):
+ Required. Input only. The password of the
+ custom account. The credential is stored
+ encrypted and not returned in any response nor
+ included in audit logs.
+ login_url (str):
+ Required. The login form URL of the website.
+ """
+
+ username: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ password: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ login_url: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+ google_account: "ScanConfig.Authentication.GoogleAccount" = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ oneof="authentication",
+ message="ScanConfig.Authentication.GoogleAccount",
+ )
+ custom_account: "ScanConfig.Authentication.CustomAccount" = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ oneof="authentication",
+ message="ScanConfig.Authentication.CustomAccount",
+ )
+
+ class Schedule(proto.Message):
+ r"""Scan schedule configuration.
+
+ Attributes:
+ schedule_time (google.protobuf.timestamp_pb2.Timestamp):
+ A timestamp indicates when the next run will
+ be scheduled. The value is refreshed by the
+ server after each run. If unspecified, it will
+ default to current server time, which means the
+ scan will be scheduled to start immediately.
+ interval_duration_days (int):
+ Required. The duration of time between
+ executions in days.
+ """
+
+ schedule_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message=timestamp_pb2.Timestamp,
+ )
+ interval_duration_days: int = proto.Field(
+ proto.INT32,
+ number=2,
+ )
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ display_name: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ max_qps: int = proto.Field(
+ proto.INT32,
+ number=3,
+ )
+ starting_urls: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=4,
+ )
+ authentication: Authentication = proto.Field(
+ proto.MESSAGE,
+ number=5,
+ message=Authentication,
+ )
+ user_agent: UserAgent = proto.Field(
+ proto.ENUM,
+ number=6,
+ enum=UserAgent,
+ )
+ blacklist_patterns: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=7,
+ )
+ schedule: Schedule = proto.Field(
+ proto.MESSAGE,
+ number=8,
+ message=Schedule,
+ )
+ target_platforms: MutableSequence[TargetPlatform] = proto.RepeatedField(
+ proto.ENUM,
+ number=9,
+ enum=TargetPlatform,
+ )
+ latest_run: scan_run.ScanRun = proto.Field(
+ proto.MESSAGE,
+ number=11,
+ message=scan_run.ScanRun,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
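The `max_qps` docstring above encodes a small contract: 0 or unset falls back to a server default of 15, values in [5, 20] pass through, and anything else is rejected with `INVALID_ARGUMENT`. A minimal client-side sketch of that rule (the function name `effective_max_qps` is hypothetical; the real enforcement happens on the server):

```python
def effective_max_qps(max_qps: int) -> int:
    """Resolve a ScanConfig.max_qps value the way the docstring describes
    the server behaving: 0/unset defaults to 15, values in [5, 20] pass
    through, anything else is rejected."""
    if max_qps == 0:
        # Unspecified proto int32 fields read as 0, so 0 means "use default".
        return 15
    if 5 <= max_qps <= 20:
        return max_qps
    # The server would answer INVALID_ARGUMENT; locally we raise instead.
    raise ValueError(f"max_qps must be 0 or within [5, 20], got {max_qps}")
```

Validating locally before calling `CreateScanConfig` can turn a round-trip failure into an immediate error, but the server remains the source of truth.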
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/scan_run.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/scan_run.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/scan_run.py
@@ -0,0 +1,158 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import timestamp_pb2 # type: ignore
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1alpha",
+ manifest={
+ "ScanRun",
+ },
+)
+
+
+class ScanRun(proto.Message):
+    r"""A ScanRun is an output-only resource representing an actual
+ run of the scan.
+
+ Attributes:
+ name (str):
+ The resource name of the ScanRun. The name
+ follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ The ScanRun IDs are generated by the system.
+ execution_state (google.cloud.websecurityscanner_v1alpha.types.ScanRun.ExecutionState):
+ The execution state of the ScanRun.
+ result_state (google.cloud.websecurityscanner_v1alpha.types.ScanRun.ResultState):
+ The result state of the ScanRun. This field
+ is only available after the execution state
+ reaches "FINISHED".
+ start_time (google.protobuf.timestamp_pb2.Timestamp):
+ The time at which the ScanRun started.
+ end_time (google.protobuf.timestamp_pb2.Timestamp):
+ The time at which the ScanRun reached
+            a terminal state, that is, when the ScanRun is either
+            finished or stopped by the user.
+ urls_crawled_count (int):
+ The number of URLs crawled during this
+ ScanRun. If the scan is in progress, the value
+ represents the number of URLs crawled up to now.
+ urls_tested_count (int):
+ The number of URLs tested during this
+ ScanRun. If the scan is in progress, the value
+ represents the number of URLs tested up to now.
+ The number of URLs tested is usually larger than
+            the number of URLs crawled because typically a
+ crawled URL is tested with multiple test
+ payloads.
+ has_vulnerabilities (bool):
+ Whether the scan run has found any
+ vulnerabilities.
+ progress_percent (int):
+ The percentage of total completion ranging
+ from 0 to 100. If the scan is in queue, the
+ value is 0. If the scan is running, the value
+ ranges from 0 to 100. If the scan is finished,
+ the value is 100.
+ """
+
+ class ExecutionState(proto.Enum):
+ r"""Types of ScanRun execution state.
+
+ Values:
+ EXECUTION_STATE_UNSPECIFIED (0):
+ Represents an invalid state caused by
+ internal server error. This value should never
+ be returned.
+ QUEUED (1):
+ The scan is waiting in the queue.
+ SCANNING (2):
+ The scan is in progress.
+ FINISHED (3):
+ The scan is either finished or stopped by
+            the user.
+ """
+ EXECUTION_STATE_UNSPECIFIED = 0
+ QUEUED = 1
+ SCANNING = 2
+ FINISHED = 3
+
+ class ResultState(proto.Enum):
+ r"""Types of ScanRun result state.
+
+ Values:
+ RESULT_STATE_UNSPECIFIED (0):
+ Default value. This value is returned when
+ the ScanRun is not yet finished.
+ SUCCESS (1):
+ The scan finished without errors.
+ ERROR (2):
+ The scan finished with errors.
+ KILLED (3):
+            The scan was terminated by the user.
+ """
+ RESULT_STATE_UNSPECIFIED = 0
+ SUCCESS = 1
+ ERROR = 2
+ KILLED = 3
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ execution_state: ExecutionState = proto.Field(
+ proto.ENUM,
+ number=2,
+ enum=ExecutionState,
+ )
+ result_state: ResultState = proto.Field(
+ proto.ENUM,
+ number=3,
+ enum=ResultState,
+ )
+ start_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=4,
+ message=timestamp_pb2.Timestamp,
+ )
+ end_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=5,
+ message=timestamp_pb2.Timestamp,
+ )
+ urls_crawled_count: int = proto.Field(
+ proto.INT64,
+ number=6,
+ )
+ urls_tested_count: int = proto.Field(
+ proto.INT64,
+ number=7,
+ )
+ has_vulnerabilities: bool = proto.Field(
+ proto.BOOL,
+ number=8,
+ )
+ progress_percent: int = proto.Field(
+ proto.INT32,
+ number=9,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
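A subtlety worth calling out from the docstrings above: `result_state` only carries information once `execution_state` reaches `FINISHED`; before that it reads as `RESULT_STATE_UNSPECIFIED`. A stdlib-enum sketch of code that honors this contract (the enums below mirror the generated ones; `describe_run` is a hypothetical helper, not part of the library):

```python
from enum import Enum


class ExecutionState(Enum):
    QUEUED = 1
    SCANNING = 2
    FINISHED = 3


class ResultState(Enum):
    RESULT_STATE_UNSPECIFIED = 0
    SUCCESS = 1
    ERROR = 2
    KILLED = 3


def describe_run(execution_state, result_state):
    """Render a one-line status, only consulting result_state once the
    run has actually reached FINISHED."""
    if execution_state is not ExecutionState.FINISHED:
        return f"run is {execution_state.name}"
    return f"run FINISHED with {result_state.name}"
```

For example, `describe_run(ExecutionState.SCANNING, ResultState.RESULT_STATE_UNSPECIFIED)` reports only the execution state, while a finished run reports its result.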
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/web_security_scanner.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/web_security_scanner.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1alpha/types/web_security_scanner.py
@@ -0,0 +1,487 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import field_mask_pb2 # type: ignore
+import proto # type: ignore
+
+from google.cloud.websecurityscanner_v1alpha.types import (
+ finding_type_stats as gcw_finding_type_stats,
+)
+from google.cloud.websecurityscanner_v1alpha.types import scan_config as gcw_scan_config
+from google.cloud.websecurityscanner_v1alpha.types import crawled_url, finding
+from google.cloud.websecurityscanner_v1alpha.types import scan_run
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1alpha",
+ manifest={
+ "CreateScanConfigRequest",
+ "DeleteScanConfigRequest",
+ "GetScanConfigRequest",
+ "ListScanConfigsRequest",
+ "UpdateScanConfigRequest",
+ "ListScanConfigsResponse",
+ "StartScanRunRequest",
+ "GetScanRunRequest",
+ "ListScanRunsRequest",
+ "ListScanRunsResponse",
+ "StopScanRunRequest",
+ "ListCrawledUrlsRequest",
+ "ListCrawledUrlsResponse",
+ "GetFindingRequest",
+ "ListFindingsRequest",
+ "ListFindingsResponse",
+ "ListFindingTypeStatsRequest",
+ "ListFindingTypeStatsResponse",
+ },
+)
+
+
+class CreateScanConfigRequest(proto.Message):
+ r"""Request for the ``CreateScanConfig`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name where the
+ scan is created, which should be a project
+ resource name in the format
+ 'projects/{projectId}'.
+ scan_config (google.cloud.websecurityscanner_v1alpha.types.ScanConfig):
+ Required. The ScanConfig to be created.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ scan_config: gcw_scan_config.ScanConfig = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=gcw_scan_config.ScanConfig,
+ )
+
+
+class DeleteScanConfigRequest(proto.Message):
+ r"""Request for the ``DeleteScanConfig`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the ScanConfig
+ to be deleted. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class GetScanConfigRequest(proto.Message):
+ r"""Request for the ``GetScanConfig`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the ScanConfig
+ to be returned. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListScanConfigsRequest(proto.Message):
+ r"""Request for the ``ListScanConfigs`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name, which
+ should be a project resource name in the format
+ 'projects/{projectId}'.
+ page_token (str):
+ A token identifying a page of results to be returned. This
+ should be a ``next_page_token`` value returned from a
+ previous List request. If unspecified, the first page of
+ results is returned.
+ page_size (int):
+ The maximum number of ScanConfigs to return,
+ can be limited by server. If not specified or
+ not positive, the implementation will select a
+ reasonable value.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=3,
+ )
+
+
+class UpdateScanConfigRequest(proto.Message):
+ r"""Request for the ``UpdateScanConfigRequest`` method.
+
+ Attributes:
+ scan_config (google.cloud.websecurityscanner_v1alpha.types.ScanConfig):
+ Required. The ScanConfig to be updated. The
+ name field must be set to identify the resource
+ to be updated. The values of fields not covered
+ by the mask will be ignored.
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Required. The update mask applies to the resource. For the
+ ``FieldMask`` definition, see
+ https://developers.google.com/protocol-buffers/docs/reference/google.protobuf#fieldmask
+ """
+
+ scan_config: gcw_scan_config.ScanConfig = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=gcw_scan_config.ScanConfig,
+ )
+ update_mask: field_mask_pb2.FieldMask = proto.Field(
+ proto.MESSAGE,
+ number=3,
+ message=field_mask_pb2.FieldMask,
+ )
+
+
+class ListScanConfigsResponse(proto.Message):
+ r"""Response for the ``ListScanConfigs`` method.
+
+ Attributes:
+ scan_configs (MutableSequence[google.cloud.websecurityscanner_v1alpha.types.ScanConfig]):
+ The list of ScanConfigs returned.
+ next_page_token (str):
+ Token to retrieve the next page of results,
+ or empty if there are no more results in the
+ list.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ scan_configs: MutableSequence[gcw_scan_config.ScanConfig] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=gcw_scan_config.ScanConfig,
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class StartScanRunRequest(proto.Message):
+ r"""Request for the ``StartScanRun`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the ScanConfig
+ to be used. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class GetScanRunRequest(proto.Message):
+ r"""Request for the ``GetScanRun`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the ScanRun to
+ be returned. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListScanRunsRequest(proto.Message):
+ r"""Request for the ``ListScanRuns`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name, which
+ should be a scan resource name in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+ page_token (str):
+ A token identifying a page of results to be returned. This
+ should be a ``next_page_token`` value returned from a
+ previous List request. If unspecified, the first page of
+ results is returned.
+ page_size (int):
+ The maximum number of ScanRuns to return, can
+ be limited by server. If not specified or not
+ positive, the implementation will select a
+ reasonable value.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=3,
+ )
+
+
+class ListScanRunsResponse(proto.Message):
+ r"""Response for the ``ListScanRuns`` method.
+
+ Attributes:
+ scan_runs (MutableSequence[google.cloud.websecurityscanner_v1alpha.types.ScanRun]):
+ The list of ScanRuns returned.
+ next_page_token (str):
+ Token to retrieve the next page of results,
+ or empty if there are no more results in the
+ list.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ scan_runs: MutableSequence[scan_run.ScanRun] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=scan_run.ScanRun,
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class StopScanRunRequest(proto.Message):
+ r"""Request for the ``StopScanRun`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the ScanRun to
+ be stopped. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListCrawledUrlsRequest(proto.Message):
+ r"""Request for the ``ListCrawledUrls`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name, which
+ should be a scan run resource name in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ page_token (str):
+ A token identifying a page of results to be returned. This
+ should be a ``next_page_token`` value returned from a
+ previous List request. If unspecified, the first page of
+ results is returned.
+ page_size (int):
+ The maximum number of CrawledUrls to return,
+ can be limited by server. If not specified or
+ not positive, the implementation will select a
+ reasonable value.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=3,
+ )
+
+
+class ListCrawledUrlsResponse(proto.Message):
+ r"""Response for the ``ListCrawledUrls`` method.
+
+ Attributes:
+ crawled_urls (MutableSequence[google.cloud.websecurityscanner_v1alpha.types.CrawledUrl]):
+ The list of CrawledUrls returned.
+ next_page_token (str):
+ Token to retrieve the next page of results,
+ or empty if there are no more results in the
+ list.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ crawled_urls: MutableSequence[crawled_url.CrawledUrl] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=crawled_url.CrawledUrl,
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class GetFindingRequest(proto.Message):
+ r"""Request for the ``GetFinding`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the Finding to
+ be returned. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}/findings/{findingId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListFindingsRequest(proto.Message):
+ r"""Request for the ``ListFindings`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name, which
+ should be a scan run resource name in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ filter (str):
+ Required. The filter expression. The expression must be in
+ the format: . Supported field: 'finding_type'. Supported
+ operator: '='.
+ page_token (str):
+ A token identifying a page of results to be returned. This
+ should be a ``next_page_token`` value returned from a
+ previous List request. If unspecified, the first page of
+ results is returned.
+ page_size (int):
+ The maximum number of Findings to return, can
+ be limited by server. If not specified or not
+ positive, the implementation will select a
+ reasonable value.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ filter: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=4,
+ )
+
+
+class ListFindingsResponse(proto.Message):
+ r"""Response for the ``ListFindings`` method.
+
+ Attributes:
+ findings (MutableSequence[google.cloud.websecurityscanner_v1alpha.types.Finding]):
+ The list of Findings returned.
+ next_page_token (str):
+ Token to retrieve the next page of results,
+ or empty if there are no more results in the
+ list.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ findings: MutableSequence[finding.Finding] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=finding.Finding,
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class ListFindingTypeStatsRequest(proto.Message):
+ r"""Request for the ``ListFindingTypeStats`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name, which
+ should be a scan run resource name in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListFindingTypeStatsResponse(proto.Message):
+ r"""Response for the ``ListFindingTypeStats`` method.
+
+ Attributes:
+ finding_type_stats (MutableSequence[google.cloud.websecurityscanner_v1alpha.types.FindingTypeStats]):
+ The list of FindingTypeStats returned.
+ """
+
+ finding_type_stats: MutableSequence[
+ gcw_finding_type_stats.FindingTypeStats
+ ] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=gcw_finding_type_stats.FindingTypeStats,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/__init__.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/__init__.py
@@ -0,0 +1,97 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from google.cloud.websecurityscanner_v1beta import gapic_version as package_version
+
+__version__ = package_version.__version__
+
+
+from .services.web_security_scanner import (
+ WebSecurityScannerAsyncClient,
+ WebSecurityScannerClient,
+)
+from .types.crawled_url import CrawledUrl
+from .types.finding import Finding
+from .types.finding_addon import (
+ Form,
+ OutdatedLibrary,
+ ViolatingResource,
+ VulnerableHeaders,
+ VulnerableParameters,
+ Xss,
+)
+from .types.finding_type_stats import FindingTypeStats
+from .types.scan_config import ScanConfig
+from .types.scan_config_error import ScanConfigError
+from .types.scan_run import ScanRun
+from .types.scan_run_error_trace import ScanRunErrorTrace
+from .types.scan_run_warning_trace import ScanRunWarningTrace
+from .types.web_security_scanner import (
+ CreateScanConfigRequest,
+ DeleteScanConfigRequest,
+ GetFindingRequest,
+ GetScanConfigRequest,
+ GetScanRunRequest,
+ ListCrawledUrlsRequest,
+ ListCrawledUrlsResponse,
+ ListFindingsRequest,
+ ListFindingsResponse,
+ ListFindingTypeStatsRequest,
+ ListFindingTypeStatsResponse,
+ ListScanConfigsRequest,
+ ListScanConfigsResponse,
+ ListScanRunsRequest,
+ ListScanRunsResponse,
+ StartScanRunRequest,
+ StopScanRunRequest,
+ UpdateScanConfigRequest,
+)
+
+__all__ = (
+ "WebSecurityScannerAsyncClient",
+ "CrawledUrl",
+ "CreateScanConfigRequest",
+ "DeleteScanConfigRequest",
+ "Finding",
+ "FindingTypeStats",
+ "Form",
+ "GetFindingRequest",
+ "GetScanConfigRequest",
+ "GetScanRunRequest",
+ "ListCrawledUrlsRequest",
+ "ListCrawledUrlsResponse",
+ "ListFindingTypeStatsRequest",
+ "ListFindingTypeStatsResponse",
+ "ListFindingsRequest",
+ "ListFindingsResponse",
+ "ListScanConfigsRequest",
+ "ListScanConfigsResponse",
+ "ListScanRunsRequest",
+ "ListScanRunsResponse",
+ "OutdatedLibrary",
+ "ScanConfig",
+ "ScanConfigError",
+ "ScanRun",
+ "ScanRunErrorTrace",
+ "ScanRunWarningTrace",
+ "StartScanRunRequest",
+ "StopScanRunRequest",
+ "UpdateScanConfigRequest",
+ "ViolatingResource",
+ "VulnerableHeaders",
+ "VulnerableParameters",
+ "WebSecurityScannerClient",
+ "Xss",
+)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/gapic_version.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/gapic_version.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/gapic_version.py
@@ -0,0 +1,16 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+__version__ = "1.12.1" # {x-release-please-version}
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/__init__.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/__init__.py
@@ -0,0 +1,15 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/__init__.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/__init__.py
@@ -0,0 +1,22 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .async_client import WebSecurityScannerAsyncClient
+from .client import WebSecurityScannerClient
+
+__all__ = (
+ "WebSecurityScannerClient",
+ "WebSecurityScannerAsyncClient",
+)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/async_client.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/async_client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/async_client.py
@@ -0,0 +1,1784 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import functools
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+)
+
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.api_core.client_options import ClientOptions
+from google.auth import credentials as ga_credentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.websecurityscanner_v1beta import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.websecurityscanner_v1beta.services.web_security_scanner import pagers
+from google.cloud.websecurityscanner_v1beta.types import (
+ scan_run,
+ scan_run_error_trace,
+ scan_run_warning_trace,
+ web_security_scanner,
+)
+from google.cloud.websecurityscanner_v1beta.types import (
+ crawled_url,
+ finding,
+ finding_addon,
+ finding_type_stats,
+)
+from google.cloud.websecurityscanner_v1beta.types import scan_config as gcw_scan_config
+from google.cloud.websecurityscanner_v1beta.types import scan_config
+
+from .client import WebSecurityScannerClient
+from .transports.base import DEFAULT_CLIENT_INFO, WebSecurityScannerTransport
+from .transports.grpc_asyncio import WebSecurityScannerGrpcAsyncIOTransport
+
+
+class WebSecurityScannerAsyncClient:
+    """Cloud Web Security Scanner Service identifies security
+    vulnerabilities in web applications hosted on Google Cloud
+    Platform. It crawls your application and attempts to exercise
+    as many user inputs and event handlers as possible.
+    """
+
+ _client: WebSecurityScannerClient
+
+ DEFAULT_ENDPOINT = WebSecurityScannerClient.DEFAULT_ENDPOINT
+ DEFAULT_MTLS_ENDPOINT = WebSecurityScannerClient.DEFAULT_MTLS_ENDPOINT
+
+ finding_path = staticmethod(WebSecurityScannerClient.finding_path)
+ parse_finding_path = staticmethod(WebSecurityScannerClient.parse_finding_path)
+ scan_config_path = staticmethod(WebSecurityScannerClient.scan_config_path)
+ parse_scan_config_path = staticmethod(
+ WebSecurityScannerClient.parse_scan_config_path
+ )
+ scan_run_path = staticmethod(WebSecurityScannerClient.scan_run_path)
+ parse_scan_run_path = staticmethod(WebSecurityScannerClient.parse_scan_run_path)
+ common_billing_account_path = staticmethod(
+ WebSecurityScannerClient.common_billing_account_path
+ )
+ parse_common_billing_account_path = staticmethod(
+ WebSecurityScannerClient.parse_common_billing_account_path
+ )
+ common_folder_path = staticmethod(WebSecurityScannerClient.common_folder_path)
+ parse_common_folder_path = staticmethod(
+ WebSecurityScannerClient.parse_common_folder_path
+ )
+ common_organization_path = staticmethod(
+ WebSecurityScannerClient.common_organization_path
+ )
+ parse_common_organization_path = staticmethod(
+ WebSecurityScannerClient.parse_common_organization_path
+ )
+ common_project_path = staticmethod(WebSecurityScannerClient.common_project_path)
+ parse_common_project_path = staticmethod(
+ WebSecurityScannerClient.parse_common_project_path
+ )
+ common_location_path = staticmethod(WebSecurityScannerClient.common_location_path)
+ parse_common_location_path = staticmethod(
+ WebSecurityScannerClient.parse_common_location_path
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ WebSecurityScannerAsyncClient: The constructed client.
+ """
+ return WebSecurityScannerClient.from_service_account_info.__func__(WebSecurityScannerAsyncClient, info, *args, **kwargs) # type: ignore
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ WebSecurityScannerAsyncClient: The constructed client.
+ """
+ return WebSecurityScannerClient.from_service_account_file.__func__(WebSecurityScannerAsyncClient, filename, *args, **kwargs) # type: ignore
+
+ from_service_account_json = from_service_account_file
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+        (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ return WebSecurityScannerClient.get_mtls_endpoint_and_cert_source(client_options) # type: ignore
+
+ @property
+ def transport(self) -> WebSecurityScannerTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ WebSecurityScannerTransport: The transport used by the client instance.
+ """
+ return self._client.transport
+
+ get_transport_class = functools.partial(
+ type(WebSecurityScannerClient).get_transport_class,
+ type(WebSecurityScannerClient),
+ )
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Union[str, WebSecurityScannerTransport] = "grpc_asyncio",
+ client_options: Optional[ClientOptions] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the web security scanner client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, ~.WebSecurityScannerTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (ClientOptions): Custom options for the client. It
+ won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+
+ Raises:
+            google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ self._client = WebSecurityScannerClient(
+ credentials=credentials,
+ transport=transport,
+ client_options=client_options,
+ client_info=client_info,
+ )
+
+ async def create_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.CreateScanConfigRequest, dict]
+ ] = None,
+ *,
+ parent: Optional[str] = None,
+ scan_config: Optional[gcw_scan_config.ScanConfig] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> gcw_scan_config.ScanConfig:
+ r"""Creates a new ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ async def sample_create_scan_config():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ scan_config = websecurityscanner_v1beta.ScanConfig()
+ scan_config.display_name = "display_name_value"
+ scan_config.starting_urls = ['starting_urls_value1', 'starting_urls_value2']
+
+ request = websecurityscanner_v1beta.CreateScanConfigRequest(
+ parent="parent_value",
+ scan_config=scan_config,
+ )
+
+ # Make the request
+ response = await client.create_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1beta.types.CreateScanConfigRequest, dict]]):
+ The request object. Request for the ``CreateScanConfig`` method.
+ parent (:class:`str`):
+ Required. The parent resource name
+ where the scan is created, which should
+ be a project resource name in the format
+ 'projects/{projectId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ scan_config (:class:`google.cloud.websecurityscanner_v1beta.types.ScanConfig`):
+ Required. The ScanConfig to be
+ created.
+
+ This corresponds to the ``scan_config`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.types.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent, scan_config])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.CreateScanConfigRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+ if scan_config is not None:
+ request.scan_config = scan_config
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.create_scan_config,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def delete_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.DeleteScanConfigRequest, dict]
+ ] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> None:
+ r"""Deletes an existing ScanConfig and its child
+ resources.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ async def sample_delete_scan_config():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.DeleteScanConfigRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ await client.delete_scan_config(request=request)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1beta.types.DeleteScanConfigRequest, dict]]):
+ The request object. Request for the ``DeleteScanConfig`` method.
+ name (:class:`str`):
+ Required. The resource name of the
+ ScanConfig to be deleted. The name
+ follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.DeleteScanConfigRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.delete_scan_config,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ async def get_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.GetScanConfigRequest, dict]
+ ] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_config.ScanConfig:
+ r"""Gets a ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ async def sample_get_scan_config():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.GetScanConfigRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1beta.types.GetScanConfigRequest, dict]]):
+ The request object. Request for the ``GetScanConfig`` method.
+ name (:class:`str`):
+ Required. The resource name of the
+ ScanConfig to be returned. The name
+ follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.types.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.GetScanConfigRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_scan_config,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_scan_configs(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListScanConfigsRequest, dict]
+ ] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListScanConfigsAsyncPager:
+ r"""Lists ScanConfigs under a given project.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ async def sample_list_scan_configs():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.ListScanConfigsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+            page_result = await client.list_scan_configs(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1beta.types.ListScanConfigsRequest, dict]]):
+ The request object. Request for the ``ListScanConfigs`` method.
+ parent (:class:`str`):
+ Required. The parent resource name,
+ which should be a project resource name
+ in the format 'projects/{projectId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.services.web_security_scanner.pagers.ListScanConfigsAsyncPager:
+ Response for the ListScanConfigs method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.ListScanConfigsRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_scan_configs,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListScanConfigsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def update_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.UpdateScanConfigRequest, dict]
+ ] = None,
+ *,
+ scan_config: Optional[gcw_scan_config.ScanConfig] = None,
+ update_mask: Optional[field_mask_pb2.FieldMask] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> gcw_scan_config.ScanConfig:
+        r"""Updates a ScanConfig. This method supports partial
+        update of a ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ async def sample_update_scan_config():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ scan_config = websecurityscanner_v1beta.ScanConfig()
+ scan_config.display_name = "display_name_value"
+ scan_config.starting_urls = ['starting_urls_value1', 'starting_urls_value2']
+
+ request = websecurityscanner_v1beta.UpdateScanConfigRequest(
+ scan_config=scan_config,
+ )
+
+ # Make the request
+ response = await client.update_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1beta.types.UpdateScanConfigRequest, dict]]):
+                The request object. Request for the ``UpdateScanConfig`` method.
+ scan_config (:class:`google.cloud.websecurityscanner_v1beta.types.ScanConfig`):
+ Required. The ScanConfig to be
+ updated. The name field must be set to
+ identify the resource to be updated. The
+ values of fields not covered by the mask
+ will be ignored.
+
+ This corresponds to the ``scan_config`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`):
+ Required. The update mask applies to the resource. For
+ the ``FieldMask`` definition, see
+ https://developers.google.com/protocol-buffers/docs/reference/google.protobuf#fieldmask
+
+ This corresponds to the ``update_mask`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.types.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([scan_config, update_mask])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.UpdateScanConfigRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if scan_config is not None:
+ request.scan_config = scan_config
+ if update_mask is not None:
+ request.update_mask = update_mask
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.update_scan_config,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("scan_config.name", request.scan_config.name),)
+ ),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def start_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.StartScanRunRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Start a ScanRun according to the given ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ async def sample_start_scan_run():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.StartScanRunRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.start_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1beta.types.StartScanRunRequest, dict]]):
+ The request object. Request for the ``StartScanRun`` method.
+ name (:class:`str`):
+ Required. The resource name of the
+ ScanConfig to be used. The name follows
+ the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.types.ScanRun:
+                A ScanRun is an output-only resource
+                representing an actual run of the scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.StartScanRunRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.start_scan_run,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.GetScanRunRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Gets a ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ async def sample_get_scan_run():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.GetScanRunRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1beta.types.GetScanRunRequest, dict]]):
+ The request object. Request for the ``GetScanRun`` method.
+ name (:class:`str`):
+ Required. The resource name of the
+ ScanRun to be returned. The name follows
+ the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.types.ScanRun:
+                A ScanRun is an output-only resource
+ representing an actual run of the scan.
+ Next id: 12
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.GetScanRunRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_scan_run,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_scan_runs(
+ self,
+ request: Optional[Union[web_security_scanner.ListScanRunsRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListScanRunsAsyncPager:
+ r"""Lists ScanRuns under a given ScanConfig, in
+ descending order of ScanRun stop time.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ async def sample_list_scan_runs():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.ListScanRunsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_scan_runs(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1beta.types.ListScanRunsRequest, dict]]):
+ The request object. Request for the ``ListScanRuns`` method.
+ parent (:class:`str`):
+ Required. The parent resource name,
+ which should be a scan resource name in
+ the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.services.web_security_scanner.pagers.ListScanRunsAsyncPager:
+ Response for the ListScanRuns method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.ListScanRunsRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_scan_runs,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListScanRunsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def stop_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.StopScanRunRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Stops a ScanRun. The stopped ScanRun is returned.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ async def sample_stop_scan_run():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.StopScanRunRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.stop_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1beta.types.StopScanRunRequest, dict]]):
+ The request object. Request for the ``StopScanRun`` method.
+ name (:class:`str`):
+ Required. The resource name of the
+ ScanRun to be stopped. The name follows
+ the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.types.ScanRun:
+                A ScanRun is an output-only resource
+ representing an actual run of the scan.
+ Next id: 12
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.StopScanRunRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.stop_scan_run,
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_crawled_urls(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListCrawledUrlsRequest, dict]
+ ] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListCrawledUrlsAsyncPager:
+ r"""List CrawledUrls under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ async def sample_list_crawled_urls():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.ListCrawledUrlsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_crawled_urls(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1beta.types.ListCrawledUrlsRequest, dict]]):
+ The request object. Request for the ``ListCrawledUrls`` method.
+ parent (:class:`str`):
+ Required. The parent resource name,
+ which should be a scan run resource name
+ in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.services.web_security_scanner.pagers.ListCrawledUrlsAsyncPager:
+ Response for the ListCrawledUrls method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.ListCrawledUrlsRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_crawled_urls,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListCrawledUrlsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def get_finding(
+ self,
+ request: Optional[Union[web_security_scanner.GetFindingRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> finding.Finding:
+ r"""Gets a Finding.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ async def sample_get_finding():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.GetFindingRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = await client.get_finding(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1beta.types.GetFindingRequest, dict]]):
+ The request object. Request for the ``GetFinding`` method.
+ name (:class:`str`):
+ Required. The resource name of the
+ Finding to be returned. The name follows
+ the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}/findings/{findingId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.types.Finding:
+ A Finding resource represents a
+ vulnerability instance identified during
+ a ScanRun.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.GetFindingRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.get_finding,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_findings(
+ self,
+ request: Optional[Union[web_security_scanner.ListFindingsRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ filter: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListFindingsAsyncPager:
+ r"""List Findings under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ async def sample_list_findings():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.ListFindingsRequest(
+ parent="parent_value",
+ filter="filter_value",
+ )
+
+ # Make the request
+ page_result = client.list_findings(request=request)
+
+ # Handle the response
+ async for response in page_result:
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1beta.types.ListFindingsRequest, dict]]):
+ The request object. Request for the ``ListFindings`` method.
+ parent (:class:`str`):
+ Required. The parent resource name,
+ which should be a scan run resource name
+ in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ filter (:class:`str`):
+ Required. The filter expression. The expression must be
+ in the format: . Supported field: 'finding_type'.
+ Supported operator: '='.
+
+ This corresponds to the ``filter`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.services.web_security_scanner.pagers.ListFindingsAsyncPager:
+ Response for the ListFindings method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent, filter])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.ListFindingsRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+ if filter is not None:
+ request.filter = filter
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_findings,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__aiter__` convenience method.
+ response = pagers.ListFindingsAsyncPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def list_finding_type_stats(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListFindingTypeStatsRequest, dict]
+ ] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListFindingTypeStatsResponse:
+ r"""List all FindingTypeStats under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ async def sample_list_finding_type_stats():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerAsyncClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.ListFindingTypeStatsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ response = await client.list_finding_type_stats(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Optional[Union[google.cloud.websecurityscanner_v1beta.types.ListFindingTypeStatsRequest, dict]]):
+ The request object. Request for the ``ListFindingTypeStats`` method.
+ parent (:class:`str`):
+ Required. The parent resource name,
+ which should be a scan run resource name
+ in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.types.ListFindingTypeStatsResponse:
+ Response for the ListFindingTypeStats method.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ request = web_security_scanner.ListFindingTypeStatsRequest(request)
+
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = gapic_v1.method_async.wrap_method(
+ self._client._transport.list_finding_type_stats,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=DEFAULT_CLIENT_INFO,
+ )
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = await rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ async def __aenter__(self):
+ return self
+
+ async def __aexit__(self, exc_type, exc, tb):
+ await self.transport.close()
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("WebSecurityScannerAsyncClient",)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/client.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/client.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/client.py
@@ -0,0 +1,1968 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+import os
+import re
+from typing import (
+ Dict,
+ Mapping,
+ MutableMapping,
+ MutableSequence,
+ Optional,
+ Sequence,
+ Tuple,
+ Type,
+ Union,
+ cast,
+)
+
+from google.api_core import client_options as client_options_lib
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.exceptions import MutualTLSChannelError # type: ignore
+from google.auth.transport import mtls # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+
+from google.cloud.websecurityscanner_v1beta import gapic_version as package_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+from google.protobuf import field_mask_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+
+from google.cloud.websecurityscanner_v1beta.services.web_security_scanner import pagers
+from google.cloud.websecurityscanner_v1beta.types import (
+ scan_run,
+ scan_run_error_trace,
+ scan_run_warning_trace,
+ web_security_scanner,
+)
+from google.cloud.websecurityscanner_v1beta.types import (
+ crawled_url,
+ finding,
+ finding_addon,
+ finding_type_stats,
+)
+from google.cloud.websecurityscanner_v1beta.types import scan_config as gcw_scan_config
+from google.cloud.websecurityscanner_v1beta.types import scan_config
+
+from .transports.base import DEFAULT_CLIENT_INFO, WebSecurityScannerTransport
+from .transports.grpc import WebSecurityScannerGrpcTransport
+from .transports.grpc_asyncio import WebSecurityScannerGrpcAsyncIOTransport
+from .transports.rest import WebSecurityScannerRestTransport
+
+
+class WebSecurityScannerClientMeta(type):
+ """Metaclass for the WebSecurityScanner client.
+
+ This provides class-level methods for building and retrieving
+ support objects (e.g. transport) without polluting the client instance
+ objects.
+ """
+
+ _transport_registry = (
+ OrderedDict()
+ ) # type: Dict[str, Type[WebSecurityScannerTransport]]
+ _transport_registry["grpc"] = WebSecurityScannerGrpcTransport
+ _transport_registry["grpc_asyncio"] = WebSecurityScannerGrpcAsyncIOTransport
+ _transport_registry["rest"] = WebSecurityScannerRestTransport
+
+ def get_transport_class(
+ cls,
+ label: Optional[str] = None,
+ ) -> Type[WebSecurityScannerTransport]:
+ """Returns an appropriate transport class.
+
+ Args:
+ label: The name of the desired transport. If none is
+ provided, then the first transport in the registry is used.
+
+ Returns:
+ The transport class to use.
+ """
+ # If a specific transport is requested, return that one.
+ if label:
+ return cls._transport_registry[label]
+
+ # No transport is requested; return the default (that is, the first one
+ # in the dictionary).
+ return next(iter(cls._transport_registry.values()))
+
+
+class WebSecurityScannerClient(metaclass=WebSecurityScannerClientMeta):
+ """Cloud Web Security Scanner Service identifies security
+ vulnerabilities in web applications hosted on Google Cloud
+ Platform. It crawls your application, and attempts to exercise
+ as many user inputs and event handlers as possible.
+ """
+
+ @staticmethod
+ def _get_default_mtls_endpoint(api_endpoint):
+ """Converts api endpoint to mTLS endpoint.
+
+ Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to
+ "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively.
+ Args:
+ api_endpoint (Optional[str]): the api endpoint to convert.
+ Returns:
+ str: converted mTLS api endpoint.
+ """
+ if not api_endpoint:
+ return api_endpoint
+
+ mtls_endpoint_re = re.compile(
+ r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?"
+ )
+
+ m = mtls_endpoint_re.match(api_endpoint)
+ name, mtls, sandbox, googledomain = m.groups()
+ if mtls or not googledomain:
+ return api_endpoint
+
+ if sandbox:
+ return api_endpoint.replace(
+ "sandbox.googleapis.com", "mtls.sandbox.googleapis.com"
+ )
+
+ return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com")
+
+ DEFAULT_ENDPOINT = "websecurityscanner.googleapis.com"
+ DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__( # type: ignore
+ DEFAULT_ENDPOINT
+ )
+
+ @classmethod
+ def from_service_account_info(cls, info: dict, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ info.
+
+ Args:
+ info (dict): The service account private key info.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ WebSecurityScannerClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_info(info)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ @classmethod
+ def from_service_account_file(cls, filename: str, *args, **kwargs):
+ """Creates an instance of this client using the provided credentials
+ file.
+
+ Args:
+ filename (str): The path to the service account private key json
+ file.
+ args: Additional arguments to pass to the constructor.
+ kwargs: Additional arguments to pass to the constructor.
+
+ Returns:
+ WebSecurityScannerClient: The constructed client.
+ """
+ credentials = service_account.Credentials.from_service_account_file(filename)
+ kwargs["credentials"] = credentials
+ return cls(*args, **kwargs)
+
+ from_service_account_json = from_service_account_file
+
+ @property
+ def transport(self) -> WebSecurityScannerTransport:
+ """Returns the transport used by the client instance.
+
+ Returns:
+ WebSecurityScannerTransport: The transport used by the client
+ instance.
+ """
+ return self._transport
+
+ @staticmethod
+ def finding_path(
+ project: str,
+ scan_config: str,
+ scan_run: str,
+ finding: str,
+ ) -> str:
+ """Returns a fully-qualified finding string."""
+ return "projects/{project}/scanConfigs/{scan_config}/scanRuns/{scan_run}/findings/{finding}".format(
+ project=project,
+ scan_config=scan_config,
+ scan_run=scan_run,
+ finding=finding,
+ )
+
+ @staticmethod
+ def parse_finding_path(path: str) -> Dict[str, str]:
+ """Parses a finding path into its component segments."""
+ m = re.match(
+ r"^projects/(?P<project>.+?)/scanConfigs/(?P<scan_config>.+?)/scanRuns/(?P<scan_run>.+?)/findings/(?P<finding>.+?)$",
+ path,
+ )
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def scan_config_path(
+ project: str,
+ scan_config: str,
+ ) -> str:
+ """Returns a fully-qualified scan_config string."""
+ return "projects/{project}/scanConfigs/{scan_config}".format(
+ project=project,
+ scan_config=scan_config,
+ )
+
+ @staticmethod
+ def parse_scan_config_path(path: str) -> Dict[str, str]:
+ """Parses a scan_config path into its component segments."""
+ m = re.match(
+ r"^projects/(?P<project>.+?)/scanConfigs/(?P<scan_config>.+?)$", path
+ )
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def scan_run_path(
+ project: str,
+ scan_config: str,
+ scan_run: str,
+ ) -> str:
+ """Returns a fully-qualified scan_run string."""
+ return (
+ "projects/{project}/scanConfigs/{scan_config}/scanRuns/{scan_run}".format(
+ project=project,
+ scan_config=scan_config,
+ scan_run=scan_run,
+ )
+ )
+
+ @staticmethod
+ def parse_scan_run_path(path: str) -> Dict[str, str]:
+ """Parses a scan_run path into its component segments."""
+ m = re.match(
+ r"^projects/(?P<project>.+?)/scanConfigs/(?P<scan_config>.+?)/scanRuns/(?P<scan_run>.+?)$",
+ path,
+ )
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_billing_account_path(
+ billing_account: str,
+ ) -> str:
+ """Returns a fully-qualified billing_account string."""
+ return "billingAccounts/{billing_account}".format(
+ billing_account=billing_account,
+ )
+
+ @staticmethod
+ def parse_common_billing_account_path(path: str) -> Dict[str, str]:
+ """Parse a billing_account path into its component segments."""
+ m = re.match(r"^billingAccounts/(?P<billing_account>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_folder_path(
+ folder: str,
+ ) -> str:
+ """Returns a fully-qualified folder string."""
+ return "folders/{folder}".format(
+ folder=folder,
+ )
+
+ @staticmethod
+ def parse_common_folder_path(path: str) -> Dict[str, str]:
+ """Parse a folder path into its component segments."""
+ m = re.match(r"^folders/(?P<folder>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_organization_path(
+ organization: str,
+ ) -> str:
+ """Returns a fully-qualified organization string."""
+ return "organizations/{organization}".format(
+ organization=organization,
+ )
+
+ @staticmethod
+ def parse_common_organization_path(path: str) -> Dict[str, str]:
+ """Parse a organization path into its component segments."""
+ m = re.match(r"^organizations/(?P<organization>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_project_path(
+ project: str,
+ ) -> str:
+ """Returns a fully-qualified project string."""
+ return "projects/{project}".format(
+ project=project,
+ )
+
+ @staticmethod
+ def parse_common_project_path(path: str) -> Dict[str, str]:
+ """Parse a project path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @staticmethod
+ def common_location_path(
+ project: str,
+ location: str,
+ ) -> str:
+ """Returns a fully-qualified location string."""
+ return "projects/{project}/locations/{location}".format(
+ project=project,
+ location=location,
+ )
+
+ @staticmethod
+ def parse_common_location_path(path: str) -> Dict[str, str]:
+ """Parse a location path into its component segments."""
+ m = re.match(r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)$", path)
+ return m.groupdict() if m else {}
+
+ @classmethod
+ def get_mtls_endpoint_and_cert_source(
+ cls, client_options: Optional[client_options_lib.ClientOptions] = None
+ ):
+ """Return the API endpoint and client cert source for mutual TLS.
+
+ The client cert source is determined in the following order:
+ (1) if `GOOGLE_API_USE_CLIENT_CERTIFICATE` environment variable is not "true", the
+ client cert source is None.
+ (2) if `client_options.client_cert_source` is provided, use the provided one; if the
+ default client cert source exists, use the default one; otherwise the client cert
+ source is None.
+
+ The API endpoint is determined in the following order:
+ (1) if `client_options.api_endpoint` is provided, use the provided one.
+ (2) if the `GOOGLE_API_USE_MTLS_ENDPOINT` environment variable is "always", use the
+ default mTLS endpoint; if the environment variable is "never", use the default API
+ endpoint; otherwise if client cert source exists, use the default mTLS endpoint, otherwise
+ use the default API endpoint.
+
+ More details can be found at https://google.aip.dev/auth/4114.
+
+ Args:
+ client_options (google.api_core.client_options.ClientOptions): Custom options for the
+ client. Only the `api_endpoint` and `client_cert_source` properties may be used
+ in this method.
+
+ Returns:
+ Tuple[str, Callable[[], Tuple[bytes, bytes]]]: returns the API endpoint and the
+ client cert source to use.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If any errors happen.
+ """
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ use_client_cert = os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false")
+ use_mtls_endpoint = os.getenv("GOOGLE_API_USE_MTLS_ENDPOINT", "auto")
+ if use_client_cert not in ("true", "false"):
+ raise ValueError(
+ "Environment variable `GOOGLE_API_USE_CLIENT_CERTIFICATE` must be either `true` or `false`"
+ )
+ if use_mtls_endpoint not in ("auto", "never", "always"):
+ raise MutualTLSChannelError(
+ "Environment variable `GOOGLE_API_USE_MTLS_ENDPOINT` must be `never`, `auto` or `always`"
+ )
+
+ # Figure out the client cert source to use.
+ client_cert_source = None
+ if use_client_cert == "true":
+ if client_options.client_cert_source:
+ client_cert_source = client_options.client_cert_source
+ elif mtls.has_default_client_cert_source():
+ client_cert_source = mtls.default_client_cert_source()
+
+ # Figure out which api endpoint to use.
+ if client_options.api_endpoint is not None:
+ api_endpoint = client_options.api_endpoint
+ elif use_mtls_endpoint == "always" or (
+ use_mtls_endpoint == "auto" and client_cert_source
+ ):
+ api_endpoint = cls.DEFAULT_MTLS_ENDPOINT
+ else:
+ api_endpoint = cls.DEFAULT_ENDPOINT
+
+ return api_endpoint, client_cert_source
+
+ def __init__(
+ self,
+ *,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ transport: Optional[Union[str, WebSecurityScannerTransport]] = None,
+ client_options: Optional[Union[client_options_lib.ClientOptions, dict]] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ ) -> None:
+ """Instantiates the web security scanner client.
+
+ Args:
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ transport (Union[str, WebSecurityScannerTransport]): The
+ transport to use. If set to None, a transport is chosen
+ automatically.
+ client_options (Optional[Union[google.api_core.client_options.ClientOptions, dict]]): Custom options for the
+ client. It won't take effect if a ``transport`` instance is provided.
+ (1) The ``api_endpoint`` property can be used to override the
+ default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT
+ environment variable can also be used to override the endpoint:
+ "always" (always use the default mTLS endpoint), "never" (always
+ use the default regular endpoint) and "auto" (auto switch to the
+ default mTLS endpoint if client certificate is present, this is
+ the default value). However, the ``api_endpoint`` property takes
+ precedence if provided.
+ (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable
+ is "true", then the ``client_cert_source`` property can be used
+ to provide client certificate for mutual TLS transport. If
+ not provided, the default SSL client certificate will be used if
+ present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not
+ set, no client certificate will be used.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ """
+ if isinstance(client_options, dict):
+ client_options = client_options_lib.from_dict(client_options)
+ if client_options is None:
+ client_options = client_options_lib.ClientOptions()
+ client_options = cast(client_options_lib.ClientOptions, client_options)
+
+ api_endpoint, client_cert_source_func = self.get_mtls_endpoint_and_cert_source(
+ client_options
+ )
+
+ api_key_value = getattr(client_options, "api_key", None)
+ if api_key_value and credentials:
+ raise ValueError(
+ "client_options.api_key and credentials are mutually exclusive"
+ )
+
+ # Save or instantiate the transport.
+ # Ordinarily, we provide the transport, but allowing a custom transport
+ # instance provides an extensibility point for unusual situations.
+ if isinstance(transport, WebSecurityScannerTransport):
+ # transport is a WebSecurityScannerTransport instance.
+ if credentials or client_options.credentials_file or api_key_value:
+ raise ValueError(
+ "When providing a transport instance, "
+ "provide its credentials directly."
+ )
+ if client_options.scopes:
+ raise ValueError(
+ "When providing a transport instance, provide its scopes "
+ "directly."
+ )
+ self._transport = transport
+ else:
+ import google.auth._default # type: ignore
+
+ if api_key_value and hasattr(
+ google.auth._default, "get_api_key_credentials"
+ ):
+ credentials = google.auth._default.get_api_key_credentials(
+ api_key_value
+ )
+
+ Transport = type(self).get_transport_class(transport)
+ self._transport = Transport(
+ credentials=credentials,
+ credentials_file=client_options.credentials_file,
+ host=api_endpoint,
+ scopes=client_options.scopes,
+ client_cert_source_for_mtls=client_cert_source_func,
+ quota_project_id=client_options.quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=True,
+ api_audience=client_options.api_audience,
+ )
+
+ def create_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.CreateScanConfigRequest, dict]
+ ] = None,
+ *,
+ parent: Optional[str] = None,
+ scan_config: Optional[gcw_scan_config.ScanConfig] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> gcw_scan_config.ScanConfig:
+ r"""Creates a new ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ def sample_create_scan_config():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ scan_config = websecurityscanner_v1beta.ScanConfig()
+ scan_config.display_name = "display_name_value"
+ scan_config.starting_urls = ['starting_urls_value1', 'starting_urls_value2']
+
+ request = websecurityscanner_v1beta.CreateScanConfigRequest(
+ parent="parent_value",
+ scan_config=scan_config,
+ )
+
+ # Make the request
+ response = client.create_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1beta.types.CreateScanConfigRequest, dict]):
+ The request object. Request for the ``CreateScanConfig`` method.
+ parent (str):
+ Required. The parent resource name
+ where the scan is created, which should
+ be a project resource name in the format
+ 'projects/{projectId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ scan_config (google.cloud.websecurityscanner_v1beta.types.ScanConfig):
+ Required. The ScanConfig to be
+ created.
+
+ This corresponds to the ``scan_config`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.types.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent, scan_config])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.CreateScanConfigRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.CreateScanConfigRequest):
+ request = web_security_scanner.CreateScanConfigRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+ if scan_config is not None:
+ request.scan_config = scan_config
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.create_scan_config]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def delete_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.DeleteScanConfigRequest, dict]
+ ] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> None:
+ r"""Deletes an existing ScanConfig and its child
+ resources.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ def sample_delete_scan_config():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.DeleteScanConfigRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ client.delete_scan_config(request=request)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1beta.types.DeleteScanConfigRequest, dict]):
+ The request object. Request for the ``DeleteScanConfig`` method.
+ name (str):
+ Required. The resource name of the
+ ScanConfig to be deleted. The name
+ follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.DeleteScanConfigRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.DeleteScanConfigRequest):
+ request = web_security_scanner.DeleteScanConfigRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.delete_scan_config]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ def get_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.GetScanConfigRequest, dict]
+ ] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_config.ScanConfig:
+ r"""Gets a ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ def sample_get_scan_config():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.GetScanConfigRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1beta.types.GetScanConfigRequest, dict]):
+ The request object. Request for the ``GetScanConfig`` method.
+ name (str):
+ Required. The resource name of the
+ ScanConfig to be returned. The name
+ follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.types.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.GetScanConfigRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.GetScanConfigRequest):
+ request = web_security_scanner.GetScanConfigRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_scan_config]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_scan_configs(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListScanConfigsRequest, dict]
+ ] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListScanConfigsPager:
+ r"""Lists ScanConfigs under a given project.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ def sample_list_scan_configs():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.ListScanConfigsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_scan_configs(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1beta.types.ListScanConfigsRequest, dict]):
+ The request object. Request for the ``ListScanConfigs`` method.
+ parent (str):
+ Required. The parent resource name,
+ which should be a project resource name
+ in the format 'projects/{projectId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.services.web_security_scanner.pagers.ListScanConfigsPager:
+ Response for the ListScanConfigs method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.ListScanConfigsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.ListScanConfigsRequest):
+ request = web_security_scanner.ListScanConfigsRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_scan_configs]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListScanConfigsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def update_scan_config(
+ self,
+ request: Optional[
+ Union[web_security_scanner.UpdateScanConfigRequest, dict]
+ ] = None,
+ *,
+ scan_config: Optional[gcw_scan_config.ScanConfig] = None,
+ update_mask: Optional[field_mask_pb2.FieldMask] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> gcw_scan_config.ScanConfig:
+ r"""Updates a ScanConfig. This method support partial
+ update of a ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ def sample_update_scan_config():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ scan_config = websecurityscanner_v1beta.ScanConfig()
+ scan_config.display_name = "display_name_value"
+ scan_config.starting_urls = ['starting_urls_value1', 'starting_urls_value2']
+
+ request = websecurityscanner_v1beta.UpdateScanConfigRequest(
+ scan_config=scan_config,
+ )
+
+ # Make the request
+ response = client.update_scan_config(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1beta.types.UpdateScanConfigRequest, dict]):
+ The request object. Request for the ``UpdateScanConfig`` method.
+ scan_config (google.cloud.websecurityscanner_v1beta.types.ScanConfig):
+ Required. The ScanConfig to be
+ updated. The name field must be set to
+ identify the resource to be updated. The
+ values of fields not covered by the mask
+ will be ignored.
+
+ This corresponds to the ``scan_config`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Required. The update mask applies to the resource. For
+ the ``FieldMask`` definition, see
+ https://developers.google.com/protocol-buffers/docs/reference/google.protobuf#fieldmask
+
+ This corresponds to the ``update_mask`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.types.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([scan_config, update_mask])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.UpdateScanConfigRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.UpdateScanConfigRequest):
+ request = web_security_scanner.UpdateScanConfigRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if scan_config is not None:
+ request.scan_config = scan_config
+ if update_mask is not None:
+ request.update_mask = update_mask
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.update_scan_config]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata(
+ (("scan_config.name", request.scan_config.name),)
+ ),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def start_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.StartScanRunRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Start a ScanRun according to the given ScanConfig.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ def sample_start_scan_run():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.StartScanRunRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.start_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1beta.types.StartScanRunRequest, dict]):
+ The request object. Request for the ``StartScanRun`` method.
+ name (str):
+ Required. The resource name of the
+ ScanConfig to be used. The name follows
+ the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.types.ScanRun:
+                A ScanRun is an output-only resource
+ representing an actual run of the scan.
+ Next id: 12
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.StartScanRunRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.StartScanRunRequest):
+ request = web_security_scanner.StartScanRunRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.start_scan_run]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.GetScanRunRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Gets a ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ def sample_get_scan_run():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.GetScanRunRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1beta.types.GetScanRunRequest, dict]):
+ The request object. Request for the ``GetScanRun`` method.
+ name (str):
+ Required. The resource name of the
+ ScanRun to be returned. The name follows
+ the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.types.ScanRun:
+                A ScanRun is an output-only resource
+ representing an actual run of the scan.
+ Next id: 12
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.GetScanRunRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.GetScanRunRequest):
+ request = web_security_scanner.GetScanRunRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_scan_run]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_scan_runs(
+ self,
+ request: Optional[Union[web_security_scanner.ListScanRunsRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListScanRunsPager:
+ r"""Lists ScanRuns under a given ScanConfig, in
+ descending order of ScanRun stop time.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ def sample_list_scan_runs():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.ListScanRunsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_scan_runs(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1beta.types.ListScanRunsRequest, dict]):
+ The request object. Request for the ``ListScanRuns`` method.
+ parent (str):
+ Required. The parent resource name,
+ which should be a scan resource name in
+ the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.services.web_security_scanner.pagers.ListScanRunsPager:
+ Response for the ListScanRuns method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.ListScanRunsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.ListScanRunsRequest):
+ request = web_security_scanner.ListScanRunsRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_scan_runs]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListScanRunsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def stop_scan_run(
+ self,
+ request: Optional[Union[web_security_scanner.StopScanRunRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Stops a ScanRun. The stopped ScanRun is returned.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ def sample_stop_scan_run():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.StopScanRunRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.stop_scan_run(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1beta.types.StopScanRunRequest, dict]):
+ The request object. Request for the ``StopScanRun`` method.
+ name (str):
+ Required. The resource name of the
+ ScanRun to be stopped. The name follows
+ the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.types.ScanRun:
+                A ScanRun is an output-only resource
+ representing an actual run of the scan.
+ Next id: 12
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.StopScanRunRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.StopScanRunRequest):
+ request = web_security_scanner.StopScanRunRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.stop_scan_run]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_crawled_urls(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListCrawledUrlsRequest, dict]
+ ] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListCrawledUrlsPager:
+ r"""List CrawledUrls under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ def sample_list_crawled_urls():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.ListCrawledUrlsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ page_result = client.list_crawled_urls(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1beta.types.ListCrawledUrlsRequest, dict]):
+ The request object. Request for the ``ListCrawledUrls`` method.
+ parent (str):
+ Required. The parent resource name,
+ which should be a scan run resource name
+ in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.services.web_security_scanner.pagers.ListCrawledUrlsPager:
+ Response for the ListCrawledUrls method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.ListCrawledUrlsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.ListCrawledUrlsRequest):
+ request = web_security_scanner.ListCrawledUrlsRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_crawled_urls]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListCrawledUrlsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def get_finding(
+ self,
+ request: Optional[Union[web_security_scanner.GetFindingRequest, dict]] = None,
+ *,
+ name: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> finding.Finding:
+ r"""Gets a Finding.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ def sample_get_finding():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.GetFindingRequest(
+ name="name_value",
+ )
+
+ # Make the request
+ response = client.get_finding(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1beta.types.GetFindingRequest, dict]):
+ The request object. Request for the ``GetFinding`` method.
+ name (str):
+ Required. The resource name of the
+ Finding to be returned. The name follows
+ the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}/findings/{findingId}'.
+
+ This corresponds to the ``name`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.types.Finding:
+ A Finding resource represents a
+ vulnerability instance identified during
+ a ScanRun.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([name])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.GetFindingRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.GetFindingRequest):
+ request = web_security_scanner.GetFindingRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if name is not None:
+ request.name = name
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.get_finding]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_findings(
+ self,
+ request: Optional[Union[web_security_scanner.ListFindingsRequest, dict]] = None,
+ *,
+ parent: Optional[str] = None,
+ filter: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> pagers.ListFindingsPager:
+ r"""List Findings under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ def sample_list_findings():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.ListFindingsRequest(
+ parent="parent_value",
+ filter="filter_value",
+ )
+
+ # Make the request
+ page_result = client.list_findings(request=request)
+
+ # Handle the response
+ for response in page_result:
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1beta.types.ListFindingsRequest, dict]):
+ The request object. Request for the ``ListFindings`` method.
+ parent (str):
+ Required. The parent resource name,
+ which should be a scan run resource name
+ in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ filter (str):
+ Required. The filter expression. The expression must be
+ in the format: . Supported field: 'finding_type'.
+ Supported operator: '='.
+
+ This corresponds to the ``filter`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.services.web_security_scanner.pagers.ListFindingsPager:
+ Response for the ListFindings method.
+
+ Iterating over this object will yield results and
+ resolve additional pages automatically.
+
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent, filter])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.ListFindingsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.ListFindingsRequest):
+ request = web_security_scanner.ListFindingsRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+ if filter is not None:
+ request.filter = filter
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_findings]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # This method is paged; wrap the response in a pager, which provides
+ # an `__iter__` convenience method.
+ response = pagers.ListFindingsPager(
+ method=rpc,
+ request=request,
+ response=response,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def list_finding_type_stats(
+ self,
+ request: Optional[
+ Union[web_security_scanner.ListFindingTypeStatsRequest, dict]
+ ] = None,
+ *,
+ parent: Optional[str] = None,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Union[float, object] = gapic_v1.method.DEFAULT,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListFindingTypeStatsResponse:
+ r"""List all FindingTypeStats under a given ScanRun.
+
+ .. code-block:: python
+
+ # This snippet has been automatically generated and should be regarded as a
+ # code template only.
+ # It will require modifications to work:
+ # - It may require correct/in-range values for request initialization.
+ # - It may require specifying regional endpoints when creating the service
+ # client as shown in:
+ # https://googleapis.dev/python/google-api-core/latest/client_options.html
+ from google.cloud import websecurityscanner_v1beta
+
+ def sample_list_finding_type_stats():
+ # Create a client
+ client = websecurityscanner_v1beta.WebSecurityScannerClient()
+
+ # Initialize request argument(s)
+ request = websecurityscanner_v1beta.ListFindingTypeStatsRequest(
+ parent="parent_value",
+ )
+
+ # Make the request
+ response = client.list_finding_type_stats(request=request)
+
+ # Handle the response
+ print(response)
+
+ Args:
+ request (Union[google.cloud.websecurityscanner_v1beta.types.ListFindingTypeStatsRequest, dict]):
+ The request object. Request for the ``ListFindingTypeStats`` method.
+ parent (str):
+ Required. The parent resource name,
+ which should be a scan run resource name
+ in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+
+ This corresponds to the ``parent`` field
+ on the ``request`` instance; if ``request`` is provided, this
+ should not be set.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ google.cloud.websecurityscanner_v1beta.types.ListFindingTypeStatsResponse:
+ Response for the ListFindingTypeStats method.
+ """
+ # Create or coerce a protobuf request object.
+ # Quick check: If we got a request object, we should *not* have
+ # gotten any keyword arguments that map to the request.
+ has_flattened_params = any([parent])
+ if request is not None and has_flattened_params:
+ raise ValueError(
+ "If the `request` argument is set, then none of "
+ "the individual field arguments should be set."
+ )
+
+ # Minor optimization to avoid making a copy if the user passes
+ # in a web_security_scanner.ListFindingTypeStatsRequest.
+ # There's no risk of modifying the input as we've already verified
+ # there are no flattened fields.
+ if not isinstance(request, web_security_scanner.ListFindingTypeStatsRequest):
+ request = web_security_scanner.ListFindingTypeStatsRequest(request)
+ # If we have keyword arguments corresponding to fields on the
+ # request, apply these.
+ if parent is not None:
+ request.parent = parent
+
+ # Wrap the RPC method; this adds retry and timeout information,
+ # and friendly error handling.
+ rpc = self._transport._wrapped_methods[self._transport.list_finding_type_stats]
+
+ # Certain fields should be provided within the metadata header;
+ # add these here.
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", request.parent),)),
+ )
+
+ # Send the request.
+ response = rpc(
+ request,
+ retry=retry,
+ timeout=timeout,
+ metadata=metadata,
+ )
+
+ # Done; return the response.
+ return response
+
+ def __enter__(self) -> "WebSecurityScannerClient":
+ return self
+
+ def __exit__(self, type, value, traceback):
+ """Releases underlying transport's resources.
+
+ .. warning::
+ ONLY use as a context manager if the transport is NOT shared
+ with other clients! Exiting the with block will CLOSE the transport
+ and may cause errors in other clients!
+ """
+ self.transport.close()
+
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+__all__ = ("WebSecurityScannerClient",)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/pagers.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/pagers.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/pagers.py
@@ -0,0 +1,549 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import (
+ Any,
+ AsyncIterator,
+ Awaitable,
+ Callable,
+ Iterator,
+ Optional,
+ Sequence,
+ Tuple,
+)
+
+from google.cloud.websecurityscanner_v1beta.types import (
+ crawled_url,
+ finding,
+ scan_config,
+ scan_run,
+ web_security_scanner,
+)
+
+
+class ListScanConfigsPager:
+ """A pager for iterating through ``list_scan_configs`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1beta.types.ListScanConfigsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``scan_configs`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListScanConfigs`` requests and continue to iterate
+ through the ``scan_configs`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1beta.types.ListScanConfigsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., web_security_scanner.ListScanConfigsResponse],
+ request: web_security_scanner.ListScanConfigsRequest,
+ response: web_security_scanner.ListScanConfigsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1beta.types.ListScanConfigsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1beta.types.ListScanConfigsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListScanConfigsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[web_security_scanner.ListScanConfigsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[scan_config.ScanConfig]:
+ for page in self.pages:
+ yield from page.scan_configs
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListScanConfigsAsyncPager:
+ """A pager for iterating through ``list_scan_configs`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1beta.types.ListScanConfigsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``scan_configs`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListScanConfigs`` requests and continue to iterate
+ through the ``scan_configs`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1beta.types.ListScanConfigsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[web_security_scanner.ListScanConfigsResponse]],
+ request: web_security_scanner.ListScanConfigsRequest,
+ response: web_security_scanner.ListScanConfigsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+        """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1beta.types.ListScanConfigsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1beta.types.ListScanConfigsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListScanConfigsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(
+ self,
+ ) -> AsyncIterator[web_security_scanner.ListScanConfigsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[scan_config.ScanConfig]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.scan_configs:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListScanRunsPager:
+ """A pager for iterating through ``list_scan_runs`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1beta.types.ListScanRunsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``scan_runs`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListScanRuns`` requests and continue to iterate
+ through the ``scan_runs`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1beta.types.ListScanRunsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., web_security_scanner.ListScanRunsResponse],
+ request: web_security_scanner.ListScanRunsRequest,
+ response: web_security_scanner.ListScanRunsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1beta.types.ListScanRunsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1beta.types.ListScanRunsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListScanRunsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[web_security_scanner.ListScanRunsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[scan_run.ScanRun]:
+ for page in self.pages:
+ yield from page.scan_runs
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListScanRunsAsyncPager:
+ """A pager for iterating through ``list_scan_runs`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1beta.types.ListScanRunsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``scan_runs`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListScanRuns`` requests and continue to iterate
+ through the ``scan_runs`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1beta.types.ListScanRunsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[web_security_scanner.ListScanRunsResponse]],
+ request: web_security_scanner.ListScanRunsRequest,
+ response: web_security_scanner.ListScanRunsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+        """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1beta.types.ListScanRunsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1beta.types.ListScanRunsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListScanRunsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[web_security_scanner.ListScanRunsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[scan_run.ScanRun]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.scan_runs:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListCrawledUrlsPager:
+ """A pager for iterating through ``list_crawled_urls`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1beta.types.ListCrawledUrlsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``crawled_urls`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListCrawledUrls`` requests and continue to iterate
+ through the ``crawled_urls`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1beta.types.ListCrawledUrlsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., web_security_scanner.ListCrawledUrlsResponse],
+ request: web_security_scanner.ListCrawledUrlsRequest,
+ response: web_security_scanner.ListCrawledUrlsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1beta.types.ListCrawledUrlsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1beta.types.ListCrawledUrlsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListCrawledUrlsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[web_security_scanner.ListCrawledUrlsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[crawled_url.CrawledUrl]:
+ for page in self.pages:
+ yield from page.crawled_urls
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListCrawledUrlsAsyncPager:
+ """A pager for iterating through ``list_crawled_urls`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1beta.types.ListCrawledUrlsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``crawled_urls`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListCrawledUrls`` requests and continue to iterate
+ through the ``crawled_urls`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1beta.types.ListCrawledUrlsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[web_security_scanner.ListCrawledUrlsResponse]],
+ request: web_security_scanner.ListCrawledUrlsRequest,
+ response: web_security_scanner.ListCrawledUrlsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+        """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1beta.types.ListCrawledUrlsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1beta.types.ListCrawledUrlsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListCrawledUrlsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(
+ self,
+ ) -> AsyncIterator[web_security_scanner.ListCrawledUrlsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[crawled_url.CrawledUrl]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.crawled_urls:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
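The async pagers above follow the same shape as the sync ones, but expose `pages` as an async generator and `__aiter__` as a wrapper around it. A minimal stand-in of that pattern, runnable without the real library (the names `FakePage`, `fake_method`, and `SimpleAsyncPager` are invented for illustration):

```python
import asyncio


class FakePage:
    """Stand-in for a paged response message."""

    def __init__(self, items, token=""):
        self.items = items
        self.next_page_token = token


# Two fake pages keyed by page token; "" is the first request.
PAGES = {"": FakePage(["u1", "u2"], "t1"), "t1": FakePage(["u3"])}


async def fake_method(request):
    """Stand-in for the wrapped RPC method."""
    return PAGES[request.get("page_token", "")]


class SimpleAsyncPager:
    """Mirrors the generated async pager: `pages` yields responses,
    `__aiter__` flattens them into individual items."""

    def __init__(self, method, request, response):
        self._method = method
        self._request = dict(request)
        self._response = response

    @property
    async def pages(self):
        yield self._response
        while self._response.next_page_token:
            self._request["page_token"] = self._response.next_page_token
            self._response = await self._method(self._request)
            yield self._response

    def __aiter__(self):
        async def gen():
            async for page in self.pages:
                for item in page.items:
                    yield item

        return gen()


async def main():
    pager = SimpleAsyncPager(fake_method, {}, await fake_method({}))
    return [u async for u in pager]


print(asyncio.run(main()))  # ['u1', 'u2', 'u3']
```

The caller never touches page tokens: iterating the pager transparently issues follow-up requests until `next_page_token` is empty, which is exactly what the generated `__aiter__` does with the real request/response messages.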
+
+class ListFindingsPager:
+ """A pager for iterating through ``list_findings`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1beta.types.ListFindingsResponse` object, and
+ provides an ``__iter__`` method to iterate through its
+ ``findings`` field.
+
+ If there are more pages, the ``__iter__`` method will make additional
+ ``ListFindings`` requests and continue to iterate
+ through the ``findings`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1beta.types.ListFindingsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., web_security_scanner.ListFindingsResponse],
+ request: web_security_scanner.ListFindingsRequest,
+ response: web_security_scanner.ListFindingsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+ """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1beta.types.ListFindingsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1beta.types.ListFindingsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListFindingsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ def pages(self) -> Iterator[web_security_scanner.ListFindingsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __iter__(self) -> Iterator[finding.Finding]:
+ for page in self.pages:
+ yield from page.findings
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
+
+
+class ListFindingsAsyncPager:
+ """A pager for iterating through ``list_findings`` requests.
+
+ This class thinly wraps an initial
+ :class:`google.cloud.websecurityscanner_v1beta.types.ListFindingsResponse` object, and
+ provides an ``__aiter__`` method to iterate through its
+ ``findings`` field.
+
+ If there are more pages, the ``__aiter__`` method will make additional
+ ``ListFindings`` requests and continue to iterate
+ through the ``findings`` field on the
+ corresponding responses.
+
+ All the usual :class:`google.cloud.websecurityscanner_v1beta.types.ListFindingsResponse`
+ attributes are available on the pager. If multiple requests are made, only
+ the most recent response is retained, and thus used for attribute lookup.
+ """
+
+ def __init__(
+ self,
+ method: Callable[..., Awaitable[web_security_scanner.ListFindingsResponse]],
+ request: web_security_scanner.ListFindingsRequest,
+ response: web_security_scanner.ListFindingsResponse,
+ *,
+ metadata: Sequence[Tuple[str, str]] = ()
+ ):
+        """Instantiate the pager.
+
+ Args:
+ method (Callable): The method that was originally called, and
+ which instantiated this pager.
+ request (google.cloud.websecurityscanner_v1beta.types.ListFindingsRequest):
+ The initial request object.
+ response (google.cloud.websecurityscanner_v1beta.types.ListFindingsResponse):
+ The initial response object.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+ self._method = method
+ self._request = web_security_scanner.ListFindingsRequest(request)
+ self._response = response
+ self._metadata = metadata
+
+ def __getattr__(self, name: str) -> Any:
+ return getattr(self._response, name)
+
+ @property
+ async def pages(self) -> AsyncIterator[web_security_scanner.ListFindingsResponse]:
+ yield self._response
+ while self._response.next_page_token:
+ self._request.page_token = self._response.next_page_token
+ self._response = await self._method(self._request, metadata=self._metadata)
+ yield self._response
+
+ def __aiter__(self) -> AsyncIterator[finding.Finding]:
+ async def async_generator():
+ async for page in self.pages:
+ for response in page.findings:
+ yield response
+
+ return async_generator()
+
+ def __repr__(self) -> str:
+ return "{0}<{1!r}>".format(self.__class__.__name__, self._response)
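All of the sync pagers in this file share one mechanism: `pages` lazily re-issues the request with the previous response's `next_page_token`, and `__iter__` flattens the pages into items. A minimal sketch of that mechanism, using invented stand-ins (`FakeResponse`, `SimplePager`, `fake_method`) rather than the real request/response types:

```python
from dataclasses import dataclass
from typing import List


@dataclass
class FakeResponse:
    """Stand-in for a ListScanConfigsResponse-style message."""

    items: List[str]
    next_page_token: str = ""


# Two fake pages keyed by page token; "" is the first request.
PAGES = {
    "": FakeResponse(["a", "b"], next_page_token="t1"),
    "t1": FakeResponse(["c"]),
}


def fake_method(request):
    """Stand-in for the wrapped RPC method."""
    return PAGES[request.get("page_token", "")]


class SimplePager:
    """Mirrors the generated sync pager pattern."""

    def __init__(self, method, request, response):
        self._method = method
        self._request = dict(request)
        self._response = response

    @property
    def pages(self):
        yield self._response
        while self._response.next_page_token:
            self._request["page_token"] = self._response.next_page_token
            self._response = self._method(self._request)
            yield self._response

    def __iter__(self):
        for page in self.pages:
            yield from page.items


pager = SimplePager(fake_method, {}, fake_method({}))
print(list(pager))  # ['a', 'b', 'c']
```

Because `pages` is a property returning a fresh generator, attribute lookups on the pager (via `__getattr__` in the generated code) always reflect the most recently fetched response, as the class docstrings note.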
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/transports/__init__.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/transports/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/transports/__init__.py
@@ -0,0 +1,38 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from collections import OrderedDict
+from typing import Dict, Type
+
+from .base import WebSecurityScannerTransport
+from .grpc import WebSecurityScannerGrpcTransport
+from .grpc_asyncio import WebSecurityScannerGrpcAsyncIOTransport
+from .rest import WebSecurityScannerRestInterceptor, WebSecurityScannerRestTransport
+
+# Compile a registry of transports.
+_transport_registry = (
+ OrderedDict()
+) # type: Dict[str, Type[WebSecurityScannerTransport]]
+_transport_registry["grpc"] = WebSecurityScannerGrpcTransport
+_transport_registry["grpc_asyncio"] = WebSecurityScannerGrpcAsyncIOTransport
+_transport_registry["rest"] = WebSecurityScannerRestTransport
+
+__all__ = (
+ "WebSecurityScannerTransport",
+ "WebSecurityScannerGrpcTransport",
+ "WebSecurityScannerGrpcAsyncIOTransport",
+ "WebSecurityScannerRestTransport",
+ "WebSecurityScannerRestInterceptor",
+)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/transports/base.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/transports/base.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/transports/base.py
@@ -0,0 +1,432 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import abc
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Union
+
+import google.api_core
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import retry as retries
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.oauth2 import service_account # type: ignore
+from google.protobuf import empty_pb2 # type: ignore
+
+from google.cloud.websecurityscanner_v1beta import gapic_version as package_version
+from google.cloud.websecurityscanner_v1beta.types import scan_run, web_security_scanner
+from google.cloud.websecurityscanner_v1beta.types import scan_config as gcw_scan_config
+from google.cloud.websecurityscanner_v1beta.types import finding
+from google.cloud.websecurityscanner_v1beta.types import scan_config
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=package_version.__version__
+)
+
+
+class WebSecurityScannerTransport(abc.ABC):
+ """Abstract transport class for WebSecurityScanner."""
+
+ AUTH_SCOPES = ("https://www.googleapis.com/auth/cloud-platform",)
+
+ DEFAULT_HOST: str = "websecurityscanner.googleapis.com"
+
+ def __init__(
+ self,
+ *,
+ host: str = DEFAULT_HOST,
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ **kwargs,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+ scopes (Optional[Sequence[str]]): A list of scopes.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ """
+
+ scopes_kwargs = {"scopes": scopes, "default_scopes": self.AUTH_SCOPES}
+
+ # Save the scopes.
+ self._scopes = scopes
+
+ # If no credentials are provided, then determine the appropriate
+ # defaults.
+ if credentials and credentials_file:
+ raise core_exceptions.DuplicateCredentialArgs(
+ "'credentials_file' and 'credentials' are mutually exclusive"
+ )
+
+ if credentials_file is not None:
+ credentials, _ = google.auth.load_credentials_from_file(
+ credentials_file, **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ elif credentials is None:
+ credentials, _ = google.auth.default(
+ **scopes_kwargs, quota_project_id=quota_project_id
+ )
+ # Don't apply audience if the credentials file passed from user.
+ if hasattr(credentials, "with_gdch_audience"):
+ credentials = credentials.with_gdch_audience(
+ api_audience if api_audience else host
+ )
+
+ # If the credentials are service account credentials, then always try to use self signed JWT.
+ if (
+ always_use_jwt_access
+ and isinstance(credentials, service_account.Credentials)
+ and hasattr(service_account.Credentials, "with_always_use_jwt_access")
+ ):
+ credentials = credentials.with_always_use_jwt_access(True)
+
+ # Save the credentials.
+ self._credentials = credentials
+
+ # Save the hostname. Default to port 443 (HTTPS) if none is specified.
+ if ":" not in host:
+ host += ":443"
+ self._host = host
+
+ def _prep_wrapped_messages(self, client_info):
+ # Precompute the wrapped methods.
+ self._wrapped_methods = {
+ self.create_scan_config: gapic_v1.method.wrap_method(
+ self.create_scan_config,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.delete_scan_config: gapic_v1.method.wrap_method(
+ self.delete_scan_config,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.get_scan_config: gapic_v1.method.wrap_method(
+ self.get_scan_config,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.list_scan_configs: gapic_v1.method.wrap_method(
+ self.list_scan_configs,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.update_scan_config: gapic_v1.method.wrap_method(
+ self.update_scan_config,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.start_scan_run: gapic_v1.method.wrap_method(
+ self.start_scan_run,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.get_scan_run: gapic_v1.method.wrap_method(
+ self.get_scan_run,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.list_scan_runs: gapic_v1.method.wrap_method(
+ self.list_scan_runs,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.stop_scan_run: gapic_v1.method.wrap_method(
+ self.stop_scan_run,
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.list_crawled_urls: gapic_v1.method.wrap_method(
+ self.list_crawled_urls,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.get_finding: gapic_v1.method.wrap_method(
+ self.get_finding,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.list_findings: gapic_v1.method.wrap_method(
+ self.list_findings,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ self.list_finding_type_stats: gapic_v1.method.wrap_method(
+ self.list_finding_type_stats,
+ default_retry=retries.Retry(
+ initial=0.1,
+ maximum=60.0,
+ multiplier=1.3,
+ predicate=retries.if_exception_type(
+ core_exceptions.DeadlineExceeded,
+ core_exceptions.ServiceUnavailable,
+ ),
+ deadline=600.0,
+ ),
+ default_timeout=600.0,
+ client_info=client_info,
+ ),
+ }
+
+ def close(self):
+ """Closes resources associated with the transport.
+
+ .. warning::
+ Only call this method if the transport is NOT shared
+ with other clients - this may cause errors in other clients!
+ """
+ raise NotImplementedError()
+
+ @property
+ def create_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.CreateScanConfigRequest],
+ Union[gcw_scan_config.ScanConfig, Awaitable[gcw_scan_config.ScanConfig]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def delete_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.DeleteScanConfigRequest],
+ Union[empty_pb2.Empty, Awaitable[empty_pb2.Empty]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.GetScanConfigRequest],
+ Union[scan_config.ScanConfig, Awaitable[scan_config.ScanConfig]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_scan_configs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanConfigsRequest],
+ Union[
+ web_security_scanner.ListScanConfigsResponse,
+ Awaitable[web_security_scanner.ListScanConfigsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def update_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.UpdateScanConfigRequest],
+ Union[gcw_scan_config.ScanConfig, Awaitable[gcw_scan_config.ScanConfig]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def start_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.StartScanRunRequest],
+ Union[scan_run.ScanRun, Awaitable[scan_run.ScanRun]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.GetScanRunRequest],
+ Union[scan_run.ScanRun, Awaitable[scan_run.ScanRun]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_scan_runs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanRunsRequest],
+ Union[
+ web_security_scanner.ListScanRunsResponse,
+ Awaitable[web_security_scanner.ListScanRunsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def stop_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.StopScanRunRequest],
+ Union[scan_run.ScanRun, Awaitable[scan_run.ScanRun]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_crawled_urls(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListCrawledUrlsRequest],
+ Union[
+ web_security_scanner.ListCrawledUrlsResponse,
+ Awaitable[web_security_scanner.ListCrawledUrlsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def get_finding(
+ self,
+ ) -> Callable[
+ [web_security_scanner.GetFindingRequest],
+ Union[finding.Finding, Awaitable[finding.Finding]],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_findings(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingsRequest],
+ Union[
+ web_security_scanner.ListFindingsResponse,
+ Awaitable[web_security_scanner.ListFindingsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def list_finding_type_stats(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingTypeStatsRequest],
+ Union[
+ web_security_scanner.ListFindingTypeStatsResponse,
+ Awaitable[web_security_scanner.ListFindingTypeStatsResponse],
+ ],
+ ]:
+ raise NotImplementedError()
+
+ @property
+ def kind(self) -> str:
+ raise NotImplementedError()
+
+
+__all__ = ("WebSecurityScannerTransport",)
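Each retryable method above is wrapped with `retries.Retry(initial=0.1, maximum=60.0, multiplier=1.3, ...)`. The deterministic part of that schedule is a capped geometric backoff; the sketch below shows only that growth (the real `google.api_core` `Retry` also applies random jitter and stops once the 600s deadline is exceeded, which this simplified helper omits):

```python
from typing import Iterator


def backoff_delays(
    initial: float = 0.1,
    maximum: float = 60.0,
    multiplier: float = 1.3,
    n: int = 5,
) -> Iterator[float]:
    """Yield the first n nominal retry delays: initial * multiplier**k,
    capped at maximum. Jitter and the overall deadline are omitted."""
    delay = initial
    for _ in range(n):
        yield min(delay, maximum)
        delay *= multiplier


print([round(d, 3) for d in backoff_delays()])
# [0.1, 0.13, 0.169, 0.22, 0.286]
```

The `predicate=retries.if_exception_type(...)` argument in the wrapped methods restricts this schedule to transient errors (`DeadlineExceeded`, `ServiceUnavailable`); any other exception propagates immediately without retrying.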
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/transports/grpc.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/transports/grpc.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/transports/grpc.py
@@ -0,0 +1,606 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers
+import google.auth # type: ignore
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.protobuf import empty_pb2 # type: ignore
+import grpc # type: ignore
+
+from google.cloud.websecurityscanner_v1beta.types import scan_run, web_security_scanner
+from google.cloud.websecurityscanner_v1beta.types import scan_config as gcw_scan_config
+from google.cloud.websecurityscanner_v1beta.types import finding
+from google.cloud.websecurityscanner_v1beta.types import scan_config
+
+from .base import DEFAULT_CLIENT_INFO, WebSecurityScannerTransport
+
+
+class WebSecurityScannerGrpcTransport(WebSecurityScannerTransport):
+ """gRPC backend transport for WebSecurityScanner.
+
+ Cloud Web Security Scanner Service identifies security
+ vulnerabilities in web applications hosted on Google Cloud
+ Platform. It crawls your application, and attempts to exercise
+ as many user inputs and event handlers as possible.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _stubs: Dict[str, Callable]
+
+ def __init__(
+ self,
+ *,
+ host: str = "websecurityscanner.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[grpc.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ channel (Optional[grpc.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+ google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "websecurityscanner.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> grpc.Channel:
+ """Create and return a gRPC channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is mutually exclusive with credentials.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ grpc.Channel: A gRPC channel object.
+
+ Raises:
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+
+ return grpc_helpers.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ @property
+ def grpc_channel(self) -> grpc.Channel:
+ """Return the channel designed to connect to this service."""
+ return self._grpc_channel
+
+ @property
+ def create_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.CreateScanConfigRequest], gcw_scan_config.ScanConfig
+ ]:
+ r"""Return a callable for the create scan config method over gRPC.
+
+ Creates a new ScanConfig.
+
+ Returns:
+ Callable[[~.CreateScanConfigRequest],
+ ~.ScanConfig]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_scan_config" not in self._stubs:
+ self._stubs["create_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/CreateScanConfig",
+ request_serializer=web_security_scanner.CreateScanConfigRequest.serialize,
+ response_deserializer=gcw_scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["create_scan_config"]
+
+ @property
+ def delete_scan_config(
+ self,
+ ) -> Callable[[web_security_scanner.DeleteScanConfigRequest], empty_pb2.Empty]:
+ r"""Return a callable for the delete scan config method over gRPC.
+
+ Deletes an existing ScanConfig and its child
+ resources.
+
+ Returns:
+ Callable[[~.DeleteScanConfigRequest],
+ ~.Empty]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_scan_config" not in self._stubs:
+ self._stubs["delete_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/DeleteScanConfig",
+ request_serializer=web_security_scanner.DeleteScanConfigRequest.serialize,
+ response_deserializer=empty_pb2.Empty.FromString,
+ )
+ return self._stubs["delete_scan_config"]
+
+ @property
+ def get_scan_config(
+ self,
+ ) -> Callable[[web_security_scanner.GetScanConfigRequest], scan_config.ScanConfig]:
+ r"""Return a callable for the get scan config method over gRPC.
+
+ Gets a ScanConfig.
+
+ Returns:
+ Callable[[~.GetScanConfigRequest],
+ ~.ScanConfig]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_scan_config" not in self._stubs:
+ self._stubs["get_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/GetScanConfig",
+ request_serializer=web_security_scanner.GetScanConfigRequest.serialize,
+ response_deserializer=scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["get_scan_config"]
+
+ @property
+ def list_scan_configs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanConfigsRequest],
+ web_security_scanner.ListScanConfigsResponse,
+ ]:
+ r"""Return a callable for the list scan configs method over gRPC.
+
+ Lists ScanConfigs under a given project.
+
+ Returns:
+ Callable[[~.ListScanConfigsRequest],
+ ~.ListScanConfigsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_scan_configs" not in self._stubs:
+ self._stubs["list_scan_configs"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/ListScanConfigs",
+ request_serializer=web_security_scanner.ListScanConfigsRequest.serialize,
+ response_deserializer=web_security_scanner.ListScanConfigsResponse.deserialize,
+ )
+ return self._stubs["list_scan_configs"]
+
+ @property
+ def update_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.UpdateScanConfigRequest], gcw_scan_config.ScanConfig
+ ]:
+ r"""Return a callable for the update scan config method over gRPC.
+
+        Updates a ScanConfig. This method supports partial
+        update of a ScanConfig.
+
+ Returns:
+ Callable[[~.UpdateScanConfigRequest],
+ ~.ScanConfig]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_scan_config" not in self._stubs:
+ self._stubs["update_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/UpdateScanConfig",
+ request_serializer=web_security_scanner.UpdateScanConfigRequest.serialize,
+ response_deserializer=gcw_scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["update_scan_config"]
+
+ @property
+ def start_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.StartScanRunRequest], scan_run.ScanRun]:
+ r"""Return a callable for the start scan run method over gRPC.
+
+ Start a ScanRun according to the given ScanConfig.
+
+ Returns:
+ Callable[[~.StartScanRunRequest],
+ ~.ScanRun]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "start_scan_run" not in self._stubs:
+ self._stubs["start_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/StartScanRun",
+ request_serializer=web_security_scanner.StartScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["start_scan_run"]
+
+ @property
+ def get_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.GetScanRunRequest], scan_run.ScanRun]:
+ r"""Return a callable for the get scan run method over gRPC.
+
+ Gets a ScanRun.
+
+ Returns:
+ Callable[[~.GetScanRunRequest],
+ ~.ScanRun]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_scan_run" not in self._stubs:
+ self._stubs["get_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/GetScanRun",
+ request_serializer=web_security_scanner.GetScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["get_scan_run"]
+
+ @property
+ def list_scan_runs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanRunsRequest],
+ web_security_scanner.ListScanRunsResponse,
+ ]:
+ r"""Return a callable for the list scan runs method over gRPC.
+
+ Lists ScanRuns under a given ScanConfig, in
+ descending order of ScanRun stop time.
+
+ Returns:
+ Callable[[~.ListScanRunsRequest],
+ ~.ListScanRunsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_scan_runs" not in self._stubs:
+ self._stubs["list_scan_runs"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/ListScanRuns",
+ request_serializer=web_security_scanner.ListScanRunsRequest.serialize,
+ response_deserializer=web_security_scanner.ListScanRunsResponse.deserialize,
+ )
+ return self._stubs["list_scan_runs"]
+
+ @property
+ def stop_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.StopScanRunRequest], scan_run.ScanRun]:
+ r"""Return a callable for the stop scan run method over gRPC.
+
+ Stops a ScanRun. The stopped ScanRun is returned.
+
+ Returns:
+ Callable[[~.StopScanRunRequest],
+ ~.ScanRun]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "stop_scan_run" not in self._stubs:
+ self._stubs["stop_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/StopScanRun",
+ request_serializer=web_security_scanner.StopScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["stop_scan_run"]
+
+ @property
+ def list_crawled_urls(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListCrawledUrlsRequest],
+ web_security_scanner.ListCrawledUrlsResponse,
+ ]:
+ r"""Return a callable for the list crawled urls method over gRPC.
+
+ List CrawledUrls under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListCrawledUrlsRequest],
+ ~.ListCrawledUrlsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_crawled_urls" not in self._stubs:
+ self._stubs["list_crawled_urls"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/ListCrawledUrls",
+ request_serializer=web_security_scanner.ListCrawledUrlsRequest.serialize,
+ response_deserializer=web_security_scanner.ListCrawledUrlsResponse.deserialize,
+ )
+ return self._stubs["list_crawled_urls"]
+
+ @property
+ def get_finding(
+ self,
+ ) -> Callable[[web_security_scanner.GetFindingRequest], finding.Finding]:
+ r"""Return a callable for the get finding method over gRPC.
+
+ Gets a Finding.
+
+ Returns:
+ Callable[[~.GetFindingRequest],
+ ~.Finding]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_finding" not in self._stubs:
+ self._stubs["get_finding"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/GetFinding",
+ request_serializer=web_security_scanner.GetFindingRequest.serialize,
+ response_deserializer=finding.Finding.deserialize,
+ )
+ return self._stubs["get_finding"]
+
+ @property
+ def list_findings(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingsRequest],
+ web_security_scanner.ListFindingsResponse,
+ ]:
+ r"""Return a callable for the list findings method over gRPC.
+
+ List Findings under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListFindingsRequest],
+ ~.ListFindingsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_findings" not in self._stubs:
+ self._stubs["list_findings"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/ListFindings",
+ request_serializer=web_security_scanner.ListFindingsRequest.serialize,
+ response_deserializer=web_security_scanner.ListFindingsResponse.deserialize,
+ )
+ return self._stubs["list_findings"]
+
+ @property
+ def list_finding_type_stats(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingTypeStatsRequest],
+ web_security_scanner.ListFindingTypeStatsResponse,
+ ]:
+ r"""Return a callable for the list finding type stats method over gRPC.
+
+ List all FindingTypeStats under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListFindingTypeStatsRequest],
+ ~.ListFindingTypeStatsResponse]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_finding_type_stats" not in self._stubs:
+ self._stubs["list_finding_type_stats"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/ListFindingTypeStats",
+ request_serializer=web_security_scanner.ListFindingTypeStatsRequest.serialize,
+ response_deserializer=web_security_scanner.ListFindingTypeStatsResponse.deserialize,
+ )
+ return self._stubs["list_finding_type_stats"]
+
+ def close(self):
+ self.grpc_channel.close()
+
+ @property
+ def kind(self) -> str:
+ return "grpc"
+
+
+__all__ = ("WebSecurityScannerGrpcTransport",)
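
Every RPC property in the transport above follows the same lazy stub-caching pattern: the stub is created via `grpc_channel.unary_unary` on first access and reused from `self._stubs` afterwards. A minimal sketch of that pattern, with a fake channel standing in for a real gRPC channel (all names here are illustrative, not the real library API):

```python
from typing import Callable, Dict

class FakeChannel:
    """Stand-in for a gRPC channel; counts how many stubs it creates."""

    def __init__(self) -> None:
        self.created = 0

    def unary_unary(self, path: str) -> Callable[[dict], dict]:
        self.created += 1
        # A real channel would also take request_serializer /
        # response_deserializer arguments; omitted in this sketch.
        return lambda request: {"path": path, "request": request}

class TransportSketch:
    """Minimal sketch of the lazy stub-caching used by each property."""

    def __init__(self, channel: FakeChannel) -> None:
        self._grpc_channel = channel
        self._stubs: Dict[str, Callable] = {}

    @property
    def get_scan_config(self) -> Callable[[dict], dict]:
        # Create the stub on first access, then serve it from the cache.
        if "get_scan_config" not in self._stubs:
            self._stubs["get_scan_config"] = self._grpc_channel.unary_unary(
                "/WebSecurityScanner/GetScanConfig"
            )
        return self._stubs["get_scan_config"]

channel = FakeChannel()
transport = TransportSketch(channel)
resp = transport.get_scan_config({"name": "projects/p/scanConfigs/c"})
_ = transport.get_scan_config  # second access hits the cache
print(channel.created)  # the stub was created exactly once
```

Deferring stub creation to first use keeps transport construction cheap, and the per-name cache guarantees each RPC method binds to the channel only once per transport instance.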
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/transports/grpc_asyncio.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/transports/grpc_asyncio.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/transports/grpc_asyncio.py
@@ -0,0 +1,617 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from typing import Awaitable, Callable, Dict, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, grpc_helpers_async
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.protobuf import empty_pb2 # type: ignore
+import grpc # type: ignore
+from grpc.experimental import aio # type: ignore
+
+from google.cloud.websecurityscanner_v1beta.types import scan_run, web_security_scanner
+from google.cloud.websecurityscanner_v1beta.types import scan_config as gcw_scan_config
+from google.cloud.websecurityscanner_v1beta.types import finding
+from google.cloud.websecurityscanner_v1beta.types import scan_config
+
+from .base import DEFAULT_CLIENT_INFO, WebSecurityScannerTransport
+from .grpc import WebSecurityScannerGrpcTransport
+
+
+class WebSecurityScannerGrpcAsyncIOTransport(WebSecurityScannerTransport):
+ """gRPC AsyncIO backend transport for WebSecurityScanner.
+
+ Cloud Web Security Scanner Service identifies security
+ vulnerabilities in web applications hosted on Google Cloud
+ Platform. It crawls your application, and attempts to exercise
+ as many user inputs and event handlers as possible.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends protocol buffers over the wire using gRPC (which is built on
+ top of HTTP/2); the ``grpcio`` package must be installed.
+ """
+
+ _grpc_channel: aio.Channel
+ _stubs: Dict[str, Callable] = {}
+
+ @classmethod
+ def create_channel(
+ cls,
+ host: str = "websecurityscanner.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ quota_project_id: Optional[str] = None,
+ **kwargs,
+ ) -> aio.Channel:
+ """Create and return a gRPC AsyncIO channel object.
+ Args:
+ host (Optional[str]): The host for the channel to use.
+ credentials (Optional[~.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify this application to the service. If
+ none are specified, the client will attempt to ascertain
+ the credentials from the environment.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ kwargs (Optional[dict]): Keyword arguments, which are passed to the
+ channel creation.
+ Returns:
+ aio.Channel: A gRPC AsyncIO channel object.
+ """
+
+ return grpc_helpers_async.create_channel(
+ host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ quota_project_id=quota_project_id,
+ default_scopes=cls.AUTH_SCOPES,
+ scopes=scopes,
+ default_host=cls.DEFAULT_HOST,
+ **kwargs,
+ )
+
+ def __init__(
+ self,
+ *,
+ host: str = "websecurityscanner.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ channel: Optional[aio.Channel] = None,
+ api_mtls_endpoint: Optional[str] = None,
+ client_cert_source: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ ssl_channel_credentials: Optional[grpc.ChannelCredentials] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+ This argument is ignored if ``channel`` is provided.
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+            scopes (Optional[Sequence[str]]): An optional list of scopes needed for this
+ service. These are only used when credentials are not specified and
+ are passed to :func:`google.auth.default`.
+ channel (Optional[aio.Channel]): A ``Channel`` instance through
+ which to make calls.
+ api_mtls_endpoint (Optional[str]): Deprecated. The mutual TLS endpoint.
+ If provided, it overrides the ``host`` argument and tries to create
+ a mutual TLS channel with client SSL credentials from
+ ``client_cert_source`` or application default SSL credentials.
+ client_cert_source (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ Deprecated. A callback to provide client SSL certificate bytes and
+ private key bytes, both in PEM format. It is ignored if
+ ``api_mtls_endpoint`` is None.
+ ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials
+ for the grpc channel. It is ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]):
+ A callback to provide client certificate bytes and private key bytes,
+ both in PEM format. It is used to configure a mutual TLS channel. It is
+ ignored if ``channel`` or ``ssl_channel_credentials`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you're developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+
+ Raises:
+            google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport
+ creation failed for any reason.
+ google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials``
+ and ``credentials_file`` are passed.
+ """
+ self._grpc_channel = None
+ self._ssl_channel_credentials = ssl_channel_credentials
+ self._stubs: Dict[str, Callable] = {}
+
+ if api_mtls_endpoint:
+ warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning)
+ if client_cert_source:
+ warnings.warn("client_cert_source is deprecated", DeprecationWarning)
+
+ if channel:
+ # Ignore credentials if a channel was passed.
+ credentials = False
+ # If a channel was explicitly provided, set it.
+ self._grpc_channel = channel
+ self._ssl_channel_credentials = None
+ else:
+ if api_mtls_endpoint:
+ host = api_mtls_endpoint
+
+ # Create SSL credentials with client_cert_source or application
+ # default SSL credentials.
+ if client_cert_source:
+ cert, key = client_cert_source()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+ else:
+ self._ssl_channel_credentials = SslCredentials().ssl_credentials
+
+ else:
+ if client_cert_source_for_mtls and not ssl_channel_credentials:
+ cert, key = client_cert_source_for_mtls()
+ self._ssl_channel_credentials = grpc.ssl_channel_credentials(
+ certificate_chain=cert, private_key=key
+ )
+
+ # The base transport sets the host, credentials and scopes
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ credentials_file=credentials_file,
+ scopes=scopes,
+ quota_project_id=quota_project_id,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+
+ if not self._grpc_channel:
+ self._grpc_channel = type(self).create_channel(
+ self._host,
+ # use the credentials which are saved
+ credentials=self._credentials,
+ # Set ``credentials_file`` to ``None`` here as
+ # the credentials that we saved earlier should be used.
+ credentials_file=None,
+ scopes=self._scopes,
+ ssl_credentials=self._ssl_channel_credentials,
+ quota_project_id=quota_project_id,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Wrap messages. This must be done after self._grpc_channel exists
+ self._prep_wrapped_messages(client_info)
+
+ @property
+ def grpc_channel(self) -> aio.Channel:
+ """Create the channel designed to connect to this service.
+
+ This property caches on the instance; repeated calls return
+ the same channel.
+ """
+ # Return the channel from cache.
+ return self._grpc_channel
+
+ @property
+ def create_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.CreateScanConfigRequest],
+ Awaitable[gcw_scan_config.ScanConfig],
+ ]:
+ r"""Return a callable for the create scan config method over gRPC.
+
+ Creates a new ScanConfig.
+
+ Returns:
+ Callable[[~.CreateScanConfigRequest],
+ Awaitable[~.ScanConfig]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "create_scan_config" not in self._stubs:
+ self._stubs["create_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/CreateScanConfig",
+ request_serializer=web_security_scanner.CreateScanConfigRequest.serialize,
+ response_deserializer=gcw_scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["create_scan_config"]
+
+ @property
+ def delete_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.DeleteScanConfigRequest], Awaitable[empty_pb2.Empty]
+ ]:
+ r"""Return a callable for the delete scan config method over gRPC.
+
+ Deletes an existing ScanConfig and its child
+ resources.
+
+ Returns:
+ Callable[[~.DeleteScanConfigRequest],
+ Awaitable[~.Empty]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "delete_scan_config" not in self._stubs:
+ self._stubs["delete_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/DeleteScanConfig",
+ request_serializer=web_security_scanner.DeleteScanConfigRequest.serialize,
+ response_deserializer=empty_pb2.Empty.FromString,
+ )
+ return self._stubs["delete_scan_config"]
+
+ @property
+ def get_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.GetScanConfigRequest], Awaitable[scan_config.ScanConfig]
+ ]:
+ r"""Return a callable for the get scan config method over gRPC.
+
+ Gets a ScanConfig.
+
+ Returns:
+ Callable[[~.GetScanConfigRequest],
+ Awaitable[~.ScanConfig]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_scan_config" not in self._stubs:
+ self._stubs["get_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/GetScanConfig",
+ request_serializer=web_security_scanner.GetScanConfigRequest.serialize,
+ response_deserializer=scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["get_scan_config"]
+
+ @property
+ def list_scan_configs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanConfigsRequest],
+ Awaitable[web_security_scanner.ListScanConfigsResponse],
+ ]:
+ r"""Return a callable for the list scan configs method over gRPC.
+
+ Lists ScanConfigs under a given project.
+
+ Returns:
+ Callable[[~.ListScanConfigsRequest],
+ Awaitable[~.ListScanConfigsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_scan_configs" not in self._stubs:
+ self._stubs["list_scan_configs"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/ListScanConfigs",
+ request_serializer=web_security_scanner.ListScanConfigsRequest.serialize,
+ response_deserializer=web_security_scanner.ListScanConfigsResponse.deserialize,
+ )
+ return self._stubs["list_scan_configs"]
+
+ @property
+ def update_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.UpdateScanConfigRequest],
+ Awaitable[gcw_scan_config.ScanConfig],
+ ]:
+ r"""Return a callable for the update scan config method over gRPC.
+
+        Updates a ScanConfig. This method supports partial
+        update of a ScanConfig.
+
+ Returns:
+ Callable[[~.UpdateScanConfigRequest],
+ Awaitable[~.ScanConfig]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "update_scan_config" not in self._stubs:
+ self._stubs["update_scan_config"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/UpdateScanConfig",
+ request_serializer=web_security_scanner.UpdateScanConfigRequest.serialize,
+ response_deserializer=gcw_scan_config.ScanConfig.deserialize,
+ )
+ return self._stubs["update_scan_config"]
+
+ @property
+ def start_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.StartScanRunRequest], Awaitable[scan_run.ScanRun]
+ ]:
+ r"""Return a callable for the start scan run method over gRPC.
+
+ Start a ScanRun according to the given ScanConfig.
+
+ Returns:
+ Callable[[~.StartScanRunRequest],
+ Awaitable[~.ScanRun]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "start_scan_run" not in self._stubs:
+ self._stubs["start_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/StartScanRun",
+ request_serializer=web_security_scanner.StartScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["start_scan_run"]
+
+ @property
+ def get_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.GetScanRunRequest], Awaitable[scan_run.ScanRun]
+ ]:
+ r"""Return a callable for the get scan run method over gRPC.
+
+ Gets a ScanRun.
+
+ Returns:
+ Callable[[~.GetScanRunRequest],
+ Awaitable[~.ScanRun]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_scan_run" not in self._stubs:
+ self._stubs["get_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/GetScanRun",
+ request_serializer=web_security_scanner.GetScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["get_scan_run"]
+
+ @property
+ def list_scan_runs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanRunsRequest],
+ Awaitable[web_security_scanner.ListScanRunsResponse],
+ ]:
+ r"""Return a callable for the list scan runs method over gRPC.
+
+ Lists ScanRuns under a given ScanConfig, in
+ descending order of ScanRun stop time.
+
+ Returns:
+ Callable[[~.ListScanRunsRequest],
+ Awaitable[~.ListScanRunsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_scan_runs" not in self._stubs:
+ self._stubs["list_scan_runs"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/ListScanRuns",
+ request_serializer=web_security_scanner.ListScanRunsRequest.serialize,
+ response_deserializer=web_security_scanner.ListScanRunsResponse.deserialize,
+ )
+ return self._stubs["list_scan_runs"]
+
+ @property
+ def stop_scan_run(
+ self,
+ ) -> Callable[
+ [web_security_scanner.StopScanRunRequest], Awaitable[scan_run.ScanRun]
+ ]:
+ r"""Return a callable for the stop scan run method over gRPC.
+
+ Stops a ScanRun. The stopped ScanRun is returned.
+
+ Returns:
+ Callable[[~.StopScanRunRequest],
+ Awaitable[~.ScanRun]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "stop_scan_run" not in self._stubs:
+ self._stubs["stop_scan_run"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/StopScanRun",
+ request_serializer=web_security_scanner.StopScanRunRequest.serialize,
+ response_deserializer=scan_run.ScanRun.deserialize,
+ )
+ return self._stubs["stop_scan_run"]
+
+ @property
+ def list_crawled_urls(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListCrawledUrlsRequest],
+ Awaitable[web_security_scanner.ListCrawledUrlsResponse],
+ ]:
+ r"""Return a callable for the list crawled urls method over gRPC.
+
+ List CrawledUrls under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListCrawledUrlsRequest],
+ Awaitable[~.ListCrawledUrlsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_crawled_urls" not in self._stubs:
+ self._stubs["list_crawled_urls"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/ListCrawledUrls",
+ request_serializer=web_security_scanner.ListCrawledUrlsRequest.serialize,
+ response_deserializer=web_security_scanner.ListCrawledUrlsResponse.deserialize,
+ )
+ return self._stubs["list_crawled_urls"]
+
+ @property
+ def get_finding(
+ self,
+ ) -> Callable[[web_security_scanner.GetFindingRequest], Awaitable[finding.Finding]]:
+ r"""Return a callable for the get finding method over gRPC.
+
+ Gets a Finding.
+
+ Returns:
+ Callable[[~.GetFindingRequest],
+ Awaitable[~.Finding]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "get_finding" not in self._stubs:
+ self._stubs["get_finding"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/GetFinding",
+ request_serializer=web_security_scanner.GetFindingRequest.serialize,
+ response_deserializer=finding.Finding.deserialize,
+ )
+ return self._stubs["get_finding"]
+
+ @property
+ def list_findings(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingsRequest],
+ Awaitable[web_security_scanner.ListFindingsResponse],
+ ]:
+ r"""Return a callable for the list findings method over gRPC.
+
+ List Findings under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListFindingsRequest],
+ Awaitable[~.ListFindingsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_findings" not in self._stubs:
+ self._stubs["list_findings"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/ListFindings",
+ request_serializer=web_security_scanner.ListFindingsRequest.serialize,
+ response_deserializer=web_security_scanner.ListFindingsResponse.deserialize,
+ )
+ return self._stubs["list_findings"]
+
+ @property
+ def list_finding_type_stats(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingTypeStatsRequest],
+ Awaitable[web_security_scanner.ListFindingTypeStatsResponse],
+ ]:
+ r"""Return a callable for the list finding type stats method over gRPC.
+
+ List all FindingTypeStats under a given ScanRun.
+
+ Returns:
+ Callable[[~.ListFindingTypeStatsRequest],
+ Awaitable[~.ListFindingTypeStatsResponse]]:
+ A function that, when called, will call the underlying RPC
+ on the server.
+ """
+ # Generate a "stub function" on-the-fly which will actually make
+ # the request.
+ # gRPC handles serialization and deserialization, so we just need
+ # to pass in the functions for each.
+ if "list_finding_type_stats" not in self._stubs:
+ self._stubs["list_finding_type_stats"] = self.grpc_channel.unary_unary(
+ "/google.cloud.websecurityscanner.v1beta.WebSecurityScanner/ListFindingTypeStats",
+ request_serializer=web_security_scanner.ListFindingTypeStatsRequest.serialize,
+ response_deserializer=web_security_scanner.ListFindingTypeStatsResponse.deserialize,
+ )
+ return self._stubs["list_finding_type_stats"]
+
+ def close(self):
+ return self.grpc_channel.close()
+
+
+__all__ = ("WebSecurityScannerGrpcAsyncIOTransport",)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/transports/rest.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/transports/rest.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/services/web_security_scanner/transports/rest.py
@@ -0,0 +1,1866 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import dataclasses
+import json # type: ignore
+import re
+from typing import Any, Callable, Dict, List, Optional, Sequence, Tuple, Union
+import warnings
+
+from google.api_core import gapic_v1, path_template, rest_helpers, rest_streaming
+from google.api_core import exceptions as core_exceptions
+from google.api_core import retry as retries
+from google.auth import credentials as ga_credentials # type: ignore
+from google.auth.transport.grpc import SslCredentials # type: ignore
+from google.auth.transport.requests import AuthorizedSession # type: ignore
+from google.protobuf import json_format
+import grpc # type: ignore
+from requests import __version__ as requests_version
+
+try:
+ OptionalRetry = Union[retries.Retry, gapic_v1.method._MethodDefault]
+except AttributeError: # pragma: NO COVER
+ OptionalRetry = Union[retries.Retry, object] # type: ignore
+
+
+from google.protobuf import empty_pb2 # type: ignore
+
+from google.cloud.websecurityscanner_v1beta.types import scan_run, web_security_scanner
+from google.cloud.websecurityscanner_v1beta.types import scan_config as gcw_scan_config
+from google.cloud.websecurityscanner_v1beta.types import finding
+from google.cloud.websecurityscanner_v1beta.types import scan_config
+
+from .base import DEFAULT_CLIENT_INFO as BASE_DEFAULT_CLIENT_INFO
+from .base import WebSecurityScannerTransport
+
+DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
+ gapic_version=BASE_DEFAULT_CLIENT_INFO.gapic_version,
+ grpc_version=None,
+ rest_version=requests_version,
+)
+
+
+class WebSecurityScannerRestInterceptor:
+ """Interceptor for WebSecurityScanner.
+
+ Interceptors are used to manipulate requests, request metadata, and responses
+ in arbitrary ways.
+ Example use cases include:
+ * Logging
+ * Verifying requests according to service or custom semantics
+ * Stripping extraneous information from responses
+
+ These use cases and more can be enabled by injecting an
+ instance of a custom subclass when constructing the WebSecurityScannerRestTransport.
+
+ .. code-block:: python
+ class MyCustomWebSecurityScannerInterceptor(WebSecurityScannerRestInterceptor):
+ def pre_create_scan_config(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_create_scan_config(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_delete_scan_config(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def pre_get_finding(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_finding(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_scan_config(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_scan_config(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_get_scan_run(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_get_scan_run(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_crawled_urls(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_crawled_urls(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_findings(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_findings(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_finding_type_stats(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_finding_type_stats(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_scan_configs(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_scan_configs(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_list_scan_runs(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_list_scan_runs(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_start_scan_run(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_start_scan_run(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_stop_scan_run(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_stop_scan_run(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ def pre_update_scan_config(self, request, metadata):
+ logging.log(f"Received request: {request}")
+ return request, metadata
+
+ def post_update_scan_config(self, response):
+ logging.log(f"Received response: {response}")
+ return response
+
+ transport = WebSecurityScannerRestTransport(interceptor=MyCustomWebSecurityScannerInterceptor())
+ client = WebSecurityScannerClient(transport=transport)
+
+
+ """
+
+ def pre_create_scan_config(
+ self,
+ request: web_security_scanner.CreateScanConfigRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.CreateScanConfigRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for create_scan_config
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_create_scan_config(
+ self, response: gcw_scan_config.ScanConfig
+ ) -> gcw_scan_config.ScanConfig:
+ """Post-rpc interceptor for create_scan_config
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_delete_scan_config(
+ self,
+ request: web_security_scanner.DeleteScanConfigRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.DeleteScanConfigRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for delete_scan_config
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def pre_get_finding(
+ self,
+ request: web_security_scanner.GetFindingRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.GetFindingRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_finding
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_get_finding(self, response: finding.Finding) -> finding.Finding:
+ """Post-rpc interceptor for get_finding
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_scan_config(
+ self,
+ request: web_security_scanner.GetScanConfigRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.GetScanConfigRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_scan_config
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_get_scan_config(
+ self, response: scan_config.ScanConfig
+ ) -> scan_config.ScanConfig:
+ """Post-rpc interceptor for get_scan_config
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_get_scan_run(
+ self,
+ request: web_security_scanner.GetScanRunRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.GetScanRunRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for get_scan_run
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_get_scan_run(self, response: scan_run.ScanRun) -> scan_run.ScanRun:
+ """Post-rpc interceptor for get_scan_run
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_crawled_urls(
+ self,
+ request: web_security_scanner.ListCrawledUrlsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.ListCrawledUrlsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_crawled_urls
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_list_crawled_urls(
+ self, response: web_security_scanner.ListCrawledUrlsResponse
+ ) -> web_security_scanner.ListCrawledUrlsResponse:
+ """Post-rpc interceptor for list_crawled_urls
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_findings(
+ self,
+ request: web_security_scanner.ListFindingsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.ListFindingsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_findings
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_list_findings(
+ self, response: web_security_scanner.ListFindingsResponse
+ ) -> web_security_scanner.ListFindingsResponse:
+ """Post-rpc interceptor for list_findings
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_finding_type_stats(
+ self,
+ request: web_security_scanner.ListFindingTypeStatsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[
+ web_security_scanner.ListFindingTypeStatsRequest, Sequence[Tuple[str, str]]
+ ]:
+ """Pre-rpc interceptor for list_finding_type_stats
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_list_finding_type_stats(
+ self, response: web_security_scanner.ListFindingTypeStatsResponse
+ ) -> web_security_scanner.ListFindingTypeStatsResponse:
+ """Post-rpc interceptor for list_finding_type_stats
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_scan_configs(
+ self,
+ request: web_security_scanner.ListScanConfigsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.ListScanConfigsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_scan_configs
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_list_scan_configs(
+ self, response: web_security_scanner.ListScanConfigsResponse
+ ) -> web_security_scanner.ListScanConfigsResponse:
+ """Post-rpc interceptor for list_scan_configs
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_list_scan_runs(
+ self,
+ request: web_security_scanner.ListScanRunsRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.ListScanRunsRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for list_scan_runs
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_list_scan_runs(
+ self, response: web_security_scanner.ListScanRunsResponse
+ ) -> web_security_scanner.ListScanRunsResponse:
+ """Post-rpc interceptor for list_scan_runs
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_start_scan_run(
+ self,
+ request: web_security_scanner.StartScanRunRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.StartScanRunRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for start_scan_run
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_start_scan_run(self, response: scan_run.ScanRun) -> scan_run.ScanRun:
+ """Post-rpc interceptor for start_scan_run
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_stop_scan_run(
+ self,
+ request: web_security_scanner.StopScanRunRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.StopScanRunRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for stop_scan_run
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_stop_scan_run(self, response: scan_run.ScanRun) -> scan_run.ScanRun:
+ """Post-rpc interceptor for stop_scan_run
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+ def pre_update_scan_config(
+ self,
+ request: web_security_scanner.UpdateScanConfigRequest,
+ metadata: Sequence[Tuple[str, str]],
+ ) -> Tuple[web_security_scanner.UpdateScanConfigRequest, Sequence[Tuple[str, str]]]:
+ """Pre-rpc interceptor for update_scan_config
+
+ Override in a subclass to manipulate the request or metadata
+ before they are sent to the WebSecurityScanner server.
+ """
+ return request, metadata
+
+ def post_update_scan_config(
+ self, response: gcw_scan_config.ScanConfig
+ ) -> gcw_scan_config.ScanConfig:
+ """Post-rpc interceptor for update_scan_config
+
+ Override in a subclass to manipulate the response
+ after it is returned by the WebSecurityScanner server but before
+ it is returned to user code.
+ """
+ return response
+
+
+@dataclasses.dataclass
+class WebSecurityScannerRestStub:
+ _session: AuthorizedSession
+ _host: str
+ _interceptor: WebSecurityScannerRestInterceptor
+
+
+class WebSecurityScannerRestTransport(WebSecurityScannerTransport):
+ """REST backend transport for WebSecurityScanner.
+
+ Cloud Web Security Scanner Service identifies security
+ vulnerabilities in web applications hosted on Google Cloud
+ Platform. It crawls your application, and attempts to exercise
+ as many user inputs and event handlers as possible.
+
+ This class defines the same methods as the primary client, so the
+ primary client can load the underlying transport implementation
+ and call it.
+
+ It sends JSON representations of protocol buffers over HTTP/1.1
+
+ """
+
+ def __init__(
+ self,
+ *,
+ host: str = "websecurityscanner.googleapis.com",
+ credentials: Optional[ga_credentials.Credentials] = None,
+ credentials_file: Optional[str] = None,
+ scopes: Optional[Sequence[str]] = None,
+ client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None,
+ quota_project_id: Optional[str] = None,
+ client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
+ always_use_jwt_access: Optional[bool] = False,
+ url_scheme: str = "https",
+ interceptor: Optional[WebSecurityScannerRestInterceptor] = None,
+ api_audience: Optional[str] = None,
+ ) -> None:
+ """Instantiate the transport.
+
+ Args:
+ host (Optional[str]):
+ The hostname to connect to.
+ credentials (Optional[google.auth.credentials.Credentials]): The
+ authorization credentials to attach to requests. These
+ credentials identify the application to the service; if none
+ are specified, the client will attempt to ascertain the
+ credentials from the environment.
+
+ credentials_file (Optional[str]): A file with credentials that can
+ be loaded with :func:`google.auth.load_credentials_from_file`.
+ This argument is ignored if ``channel`` is provided.
+ scopes (Optional(Sequence[str])): A list of scopes. This argument is
+ ignored if ``channel`` is provided.
+ client_cert_source_for_mtls (Callable[[], Tuple[bytes, bytes]]): Client
+ certificate to configure mutual TLS HTTP channel. It is ignored
+ if ``channel`` is provided.
+ quota_project_id (Optional[str]): An optional project to use for billing
+ and quota.
+ client_info (google.api_core.gapic_v1.client_info.ClientInfo):
+ The client info used to send a user-agent string along with
+ API requests. If ``None``, then default info will be used.
+ Generally, you only need to set this if you are developing
+ your own client library.
+ always_use_jwt_access (Optional[bool]): Whether self signed JWT should
+ be used for service account credentials.
+ url_scheme: the protocol scheme for the API endpoint. Normally
+ "https", but for testing or local servers,
+ "http" can be specified.
+ """
+ # Run the base constructor
+ # TODO(yon-mg): resolve other ctor params i.e. scopes, quota, etc.
+ # TODO: When custom host (api_endpoint) is set, `scopes` must *also* be set on the
+ # credentials object
+ maybe_url_match = re.match("^(?P<scheme>http(?:s)?://)?(?P<host>.*)$", host)
+ if maybe_url_match is None:
+ raise ValueError(
+ f"Unexpected hostname structure: {host}"
+ ) # pragma: NO COVER
+
+ url_match_items = maybe_url_match.groupdict()
+
+ host = f"{url_scheme}://{host}" if not url_match_items["scheme"] else host
+
+ super().__init__(
+ host=host,
+ credentials=credentials,
+ client_info=client_info,
+ always_use_jwt_access=always_use_jwt_access,
+ api_audience=api_audience,
+ )
+ self._session = AuthorizedSession(
+ self._credentials, default_host=self.DEFAULT_HOST
+ )
+ if client_cert_source_for_mtls:
+ self._session.configure_mtls_channel(client_cert_source_for_mtls)
+ self._interceptor = interceptor or WebSecurityScannerRestInterceptor()
+ self._prep_wrapped_messages(client_info)
+
+ class _CreateScanConfig(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("CreateScanConfig")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.CreateScanConfigRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> gcw_scan_config.ScanConfig:
+ r"""Call the create scan config method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.CreateScanConfigRequest):
+ The request object. Request for the ``CreateScanConfig`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.gcw_scan_config.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1beta/{parent=projects/*}/scanConfigs",
+ "body": "scan_config",
+ },
+ ]
+ request, metadata = self._interceptor.pre_create_scan_config(
+ request, metadata
+ )
+ pb_request = web_security_scanner.CreateScanConfigRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = gcw_scan_config.ScanConfig()
+ pb_resp = gcw_scan_config.ScanConfig.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_create_scan_config(resp)
+ return resp
+
+ class _DeleteScanConfig(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("DeleteScanConfig")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.DeleteScanConfigRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ):
+ r"""Call the delete scan config method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.DeleteScanConfigRequest):
+ The request object. Request for the ``DeleteScanConfig`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "delete",
+ "uri": "/v1beta/{name=projects/*/scanConfigs/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_delete_scan_config(
+ request, metadata
+ )
+ pb_request = web_security_scanner.DeleteScanConfigRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ class _GetFinding(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("GetFinding")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.GetFindingRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> finding.Finding:
+ r"""Call the get finding method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.GetFindingRequest):
+ The request object. Request for the ``GetFinding`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.finding.Finding:
+ A Finding resource represents a
+ vulnerability instance identified during
+ a ScanRun.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1beta/{name=projects/*/scanConfigs/*/scanRuns/*/findings/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_finding(request, metadata)
+ pb_request = web_security_scanner.GetFindingRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = finding.Finding()
+ pb_resp = finding.Finding.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_finding(resp)
+ return resp
+
+ class _GetScanConfig(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("GetScanConfig")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.GetScanConfigRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_config.ScanConfig:
+ r"""Call the get scan config method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.GetScanConfigRequest):
+ The request object. Request for the ``GetScanConfig`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.scan_config.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1beta/{name=projects/*/scanConfigs/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_scan_config(request, metadata)
+ pb_request = web_security_scanner.GetScanConfigRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = scan_config.ScanConfig()
+ pb_resp = scan_config.ScanConfig.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_scan_config(resp)
+ return resp
+
+ class _GetScanRun(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("GetScanRun")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.GetScanRunRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Call the get scan run method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.GetScanRunRequest):
+ The request object. Request for the ``GetScanRun`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.scan_run.ScanRun:
+                    A ScanRun is an output-only resource
+ representing an actual run of the scan.
+ Next id: 12
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1beta/{name=projects/*/scanConfigs/*/scanRuns/*}",
+ },
+ ]
+ request, metadata = self._interceptor.pre_get_scan_run(request, metadata)
+ pb_request = web_security_scanner.GetScanRunRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = scan_run.ScanRun()
+ pb_resp = scan_run.ScanRun.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_get_scan_run(resp)
+ return resp
+
+ class _ListCrawledUrls(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("ListCrawledUrls")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.ListCrawledUrlsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListCrawledUrlsResponse:
+ r"""Call the list crawled urls method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.ListCrawledUrlsRequest):
+ The request object. Request for the ``ListCrawledUrls`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.web_security_scanner.ListCrawledUrlsResponse:
+ Response for the ``ListCrawledUrls`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1beta/{parent=projects/*/scanConfigs/*/scanRuns/*}/crawledUrls",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_crawled_urls(
+ request, metadata
+ )
+ pb_request = web_security_scanner.ListCrawledUrlsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = web_security_scanner.ListCrawledUrlsResponse()
+ pb_resp = web_security_scanner.ListCrawledUrlsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_crawled_urls(resp)
+ return resp
+
+ class _ListFindings(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("ListFindings")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "filter": "",
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.ListFindingsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListFindingsResponse:
+ r"""Call the list findings method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.ListFindingsRequest):
+ The request object. Request for the ``ListFindings`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.web_security_scanner.ListFindingsResponse:
+ Response for the ``ListFindings`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1beta/{parent=projects/*/scanConfigs/*/scanRuns/*}/findings",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_findings(request, metadata)
+ pb_request = web_security_scanner.ListFindingsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = web_security_scanner.ListFindingsResponse()
+ pb_resp = web_security_scanner.ListFindingsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_findings(resp)
+ return resp
+
+ class _ListFindingTypeStats(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("ListFindingTypeStats")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.ListFindingTypeStatsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListFindingTypeStatsResponse:
+ r"""Call the list finding type stats method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.ListFindingTypeStatsRequest):
+ The request object. Request for the ``ListFindingTypeStats`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.web_security_scanner.ListFindingTypeStatsResponse:
+ Response for the ``ListFindingTypeStats`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1beta/{parent=projects/*/scanConfigs/*/scanRuns/*}/findingTypeStats",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_finding_type_stats(
+ request, metadata
+ )
+ pb_request = web_security_scanner.ListFindingTypeStatsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = web_security_scanner.ListFindingTypeStatsResponse()
+ pb_resp = web_security_scanner.ListFindingTypeStatsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_finding_type_stats(resp)
+ return resp
+
+ class _ListScanConfigs(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("ListScanConfigs")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.ListScanConfigsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListScanConfigsResponse:
+ r"""Call the list scan configs method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.ListScanConfigsRequest):
+ The request object. Request for the ``ListScanConfigs`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.web_security_scanner.ListScanConfigsResponse:
+ Response for the ``ListScanConfigs`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1beta/{parent=projects/*}/scanConfigs",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_scan_configs(
+ request, metadata
+ )
+ pb_request = web_security_scanner.ListScanConfigsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = web_security_scanner.ListScanConfigsResponse()
+ pb_resp = web_security_scanner.ListScanConfigsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_scan_configs(resp)
+ return resp
+
+ class _ListScanRuns(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("ListScanRuns")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.ListScanRunsRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> web_security_scanner.ListScanRunsResponse:
+ r"""Call the list scan runs method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.ListScanRunsRequest):
+ The request object. Request for the ``ListScanRuns`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.web_security_scanner.ListScanRunsResponse:
+ Response for the ``ListScanRuns`` method.
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "get",
+ "uri": "/v1beta/{parent=projects/*/scanConfigs/*}/scanRuns",
+ },
+ ]
+ request, metadata = self._interceptor.pre_list_scan_runs(request, metadata)
+ pb_request = web_security_scanner.ListScanRunsRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = web_security_scanner.ListScanRunsResponse()
+ pb_resp = web_security_scanner.ListScanRunsResponse.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_list_scan_runs(resp)
+ return resp
+
+ class _StartScanRun(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("StartScanRun")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.StartScanRunRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Call the start scan run method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.StartScanRunRequest):
+ The request object. Request for the ``StartScanRun`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.scan_run.ScanRun:
+                    A ScanRun is an output-only resource
+ representing an actual run of the scan.
+ Next id: 12
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1beta/{name=projects/*/scanConfigs/*}:start",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_start_scan_run(request, metadata)
+ pb_request = web_security_scanner.StartScanRunRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = scan_run.ScanRun()
+ pb_resp = scan_run.ScanRun.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_start_scan_run(resp)
+ return resp
+
+ class _StopScanRun(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("StopScanRun")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {}
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.StopScanRunRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> scan_run.ScanRun:
+ r"""Call the stop scan run method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.StopScanRunRequest):
+ The request object. Request for the ``StopScanRun`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.scan_run.ScanRun:
+                    A ScanRun is an output-only resource
+ representing an actual run of the scan.
+ Next id: 12
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "post",
+ "uri": "/v1beta/{name=projects/*/scanConfigs/*/scanRuns/*}:stop",
+ "body": "*",
+ },
+ ]
+ request, metadata = self._interceptor.pre_stop_scan_run(request, metadata)
+ pb_request = web_security_scanner.StopScanRunRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = scan_run.ScanRun()
+ pb_resp = scan_run.ScanRun.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_stop_scan_run(resp)
+ return resp
+
+ class _UpdateScanConfig(WebSecurityScannerRestStub):
+ def __hash__(self):
+ return hash("UpdateScanConfig")
+
+ __REQUIRED_FIELDS_DEFAULT_VALUES: Dict[str, Any] = {
+ "updateMask": {},
+ }
+
+ @classmethod
+ def _get_unset_required_fields(cls, message_dict):
+ return {
+ k: v
+ for k, v in cls.__REQUIRED_FIELDS_DEFAULT_VALUES.items()
+ if k not in message_dict
+ }
+
+ def __call__(
+ self,
+ request: web_security_scanner.UpdateScanConfigRequest,
+ *,
+ retry: OptionalRetry = gapic_v1.method.DEFAULT,
+ timeout: Optional[float] = None,
+ metadata: Sequence[Tuple[str, str]] = (),
+ ) -> gcw_scan_config.ScanConfig:
+ r"""Call the update scan config method over HTTP.
+
+ Args:
+ request (~.web_security_scanner.UpdateScanConfigRequest):
+ The request object. Request for the ``UpdateScanConfigRequest`` method.
+ retry (google.api_core.retry.Retry): Designation of what errors, if any,
+ should be retried.
+ timeout (float): The timeout for this request.
+ metadata (Sequence[Tuple[str, str]]): Strings which should be
+ sent along with the request as metadata.
+
+ Returns:
+ ~.gcw_scan_config.ScanConfig:
+ A ScanConfig resource contains the
+ configurations to launch a scan.
+
+ """
+
+ http_options: List[Dict[str, str]] = [
+ {
+ "method": "patch",
+ "uri": "/v1beta/{scan_config.name=projects/*/scanConfigs/*}",
+ "body": "scan_config",
+ },
+ ]
+ request, metadata = self._interceptor.pre_update_scan_config(
+ request, metadata
+ )
+ pb_request = web_security_scanner.UpdateScanConfigRequest.pb(request)
+ transcoded_request = path_template.transcode(http_options, pb_request)
+
+ # Jsonify the request body
+
+ body = json_format.MessageToJson(
+ transcoded_request["body"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ uri = transcoded_request["uri"]
+ method = transcoded_request["method"]
+
+ # Jsonify the query params
+ query_params = json.loads(
+ json_format.MessageToJson(
+ transcoded_request["query_params"],
+ including_default_value_fields=False,
+ use_integers_for_enums=True,
+ )
+ )
+ query_params.update(self._get_unset_required_fields(query_params))
+
+ query_params["$alt"] = "json;enum-encoding=int"
+
+ # Send the request
+ headers = dict(metadata)
+ headers["Content-Type"] = "application/json"
+ response = getattr(self._session, method)(
+ "{host}{uri}".format(host=self._host, uri=uri),
+ timeout=timeout,
+ headers=headers,
+ params=rest_helpers.flatten_query_params(query_params, strict=True),
+ data=body,
+ )
+
+ # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception
+ # subclass.
+ if response.status_code >= 400:
+ raise core_exceptions.from_http_response(response)
+
+ # Return the response
+ resp = gcw_scan_config.ScanConfig()
+ pb_resp = gcw_scan_config.ScanConfig.pb(resp)
+
+ json_format.Parse(response.content, pb_resp, ignore_unknown_fields=True)
+ resp = self._interceptor.post_update_scan_config(resp)
+ return resp
+
+ @property
+ def create_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.CreateScanConfigRequest], gcw_scan_config.ScanConfig
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._CreateScanConfig(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def delete_scan_config(
+ self,
+ ) -> Callable[[web_security_scanner.DeleteScanConfigRequest], empty_pb2.Empty]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._DeleteScanConfig(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_finding(
+ self,
+ ) -> Callable[[web_security_scanner.GetFindingRequest], finding.Finding]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetFinding(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_scan_config(
+ self,
+ ) -> Callable[[web_security_scanner.GetScanConfigRequest], scan_config.ScanConfig]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetScanConfig(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def get_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.GetScanRunRequest], scan_run.ScanRun]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._GetScanRun(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_crawled_urls(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListCrawledUrlsRequest],
+ web_security_scanner.ListCrawledUrlsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListCrawledUrls(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_findings(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingsRequest],
+ web_security_scanner.ListFindingsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListFindings(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_finding_type_stats(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListFindingTypeStatsRequest],
+ web_security_scanner.ListFindingTypeStatsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListFindingTypeStats(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_scan_configs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanConfigsRequest],
+ web_security_scanner.ListScanConfigsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListScanConfigs(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def list_scan_runs(
+ self,
+ ) -> Callable[
+ [web_security_scanner.ListScanRunsRequest],
+ web_security_scanner.ListScanRunsResponse,
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._ListScanRuns(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def start_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.StartScanRunRequest], scan_run.ScanRun]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._StartScanRun(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def stop_scan_run(
+ self,
+ ) -> Callable[[web_security_scanner.StopScanRunRequest], scan_run.ScanRun]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._StopScanRun(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def update_scan_config(
+ self,
+ ) -> Callable[
+ [web_security_scanner.UpdateScanConfigRequest], gcw_scan_config.ScanConfig
+ ]:
+ # The return type is fine, but mypy isn't sophisticated enough to determine what's going on here.
+ # In C++ this would require a dynamic_cast
+ return self._UpdateScanConfig(self._session, self._host, self._interceptor) # type: ignore
+
+ @property
+ def kind(self) -> str:
+ return "rest"
+
+ def close(self):
+ self._session.close()
+
+
+__all__ = ("WebSecurityScannerRestTransport",)
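For context, the request path shared by all of these REST stubs — expand the HTTP rule's URI template against the request, then backfill any required query parameters the caller left unset — can be sketched in plain Python. `transcode` and `merge_unset_required_fields` below are illustrative stand-ins, not the real `google.api_core.path_template.transcode` or the generated `_get_unset_required_fields`:

```python
import re


def transcode(uri_template, **request_fields):
    """Expand a template like '/v1beta/{name=projects/*/scanConfigs/*}'.

    Each '{var=pattern}' segment binds a request field into the path after
    validating it against the glob-style pattern ('*' matches one segment).
    """

    def replace(match):
        var, pattern = match.group(1), match.group(2)
        value = request_fields[var]
        # Translate the glob pattern into a regex and validate the value.
        if not re.fullmatch(pattern.replace("*", "[^/]+"), value):
            raise ValueError(f"{value!r} does not match {pattern!r}")
        return value

    return re.sub(r"\{([\w.]+)=([^}]+)\}", replace, uri_template)


def merge_unset_required_fields(query_params, defaults):
    """Backfill required query params (e.g. 'updateMask') left unset."""
    merged = dict(query_params)
    merged.update({k: v for k, v in defaults.items() if k not in merged})
    return merged
```

With the StopScanRun rule above, `transcode("/v1beta/{name=projects/*/scanConfigs/*/scanRuns/*}:stop", name="projects/p/scanConfigs/c/scanRuns/r")` yields the concrete URI, and the literal `:stop` verb suffix passes through unchanged.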
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/__init__.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/__init__.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/__init__.py
@@ -0,0 +1,86 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from .crawled_url import CrawledUrl
+from .finding import Finding
+from .finding_addon import (
+ Form,
+ OutdatedLibrary,
+ ViolatingResource,
+ VulnerableHeaders,
+ VulnerableParameters,
+ Xss,
+)
+from .finding_type_stats import FindingTypeStats
+from .scan_config import ScanConfig
+from .scan_config_error import ScanConfigError
+from .scan_run import ScanRun
+from .scan_run_error_trace import ScanRunErrorTrace
+from .scan_run_warning_trace import ScanRunWarningTrace
+from .web_security_scanner import (
+ CreateScanConfigRequest,
+ DeleteScanConfigRequest,
+ GetFindingRequest,
+ GetScanConfigRequest,
+ GetScanRunRequest,
+ ListCrawledUrlsRequest,
+ ListCrawledUrlsResponse,
+ ListFindingsRequest,
+ ListFindingsResponse,
+ ListFindingTypeStatsRequest,
+ ListFindingTypeStatsResponse,
+ ListScanConfigsRequest,
+ ListScanConfigsResponse,
+ ListScanRunsRequest,
+ ListScanRunsResponse,
+ StartScanRunRequest,
+ StopScanRunRequest,
+ UpdateScanConfigRequest,
+)
+
+__all__ = (
+ "CrawledUrl",
+ "Finding",
+ "Form",
+ "OutdatedLibrary",
+ "ViolatingResource",
+ "VulnerableHeaders",
+ "VulnerableParameters",
+ "Xss",
+ "FindingTypeStats",
+ "ScanConfig",
+ "ScanConfigError",
+ "ScanRun",
+ "ScanRunErrorTrace",
+ "ScanRunWarningTrace",
+ "CreateScanConfigRequest",
+ "DeleteScanConfigRequest",
+ "GetFindingRequest",
+ "GetScanConfigRequest",
+ "GetScanRunRequest",
+ "ListCrawledUrlsRequest",
+ "ListCrawledUrlsResponse",
+ "ListFindingsRequest",
+ "ListFindingsResponse",
+ "ListFindingTypeStatsRequest",
+ "ListFindingTypeStatsResponse",
+ "ListScanConfigsRequest",
+ "ListScanConfigsResponse",
+ "ListScanRunsRequest",
+ "ListScanRunsResponse",
+ "StartScanRunRequest",
+ "StopScanRunRequest",
+ "UpdateScanConfigRequest",
+)
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/crawled_url.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/crawled_url.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/crawled_url.py
@@ -0,0 +1,61 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1beta",
+ manifest={
+ "CrawledUrl",
+ },
+)
+
+
+class CrawledUrl(proto.Message):
+ r"""A CrawledUrl resource represents a URL that was crawled
+ during a ScanRun. Web Security Scanner Service crawls the web
+ applications, following all links within the scope of sites, to
+ find the URLs to test against.
+
+ Attributes:
+ http_method (str):
+ The http method of the request that was used
+ to visit the URL, in uppercase.
+ url (str):
+ The URL that was crawled.
+ body (str):
+ The body of the request that was used to
+ visit the URL.
+ """
+
+ http_method: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ url: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ body: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
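As an aside, the proto-plus field declarations above pair each attribute with a stable proto field number. A hypothetical dataclass analog (illustrative only, not wire-compatible with protobuf) makes that name-to-number mapping explicit:

```python
from dataclasses import dataclass, field, fields


@dataclass
class CrawledUrlSketch:
    """Plain-Python stand-in for the CrawledUrl message above."""

    http_method: str = field(default="", metadata={"number": 1})
    url: str = field(default="", metadata={"number": 2})
    body: str = field(default="", metadata={"number": 3})


def field_numbers(cls):
    """Return the declared proto field numbers, keyed by attribute name."""
    return {f.name: f.metadata["number"] for f in fields(cls)}
```

Keeping the numbers alongside the attributes mirrors why proto field numbers, not names, are the stable identifiers on the wire.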
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/finding.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/finding.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/finding.py
@@ -0,0 +1,167 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+from google.cloud.websecurityscanner_v1beta.types import finding_addon
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1beta",
+ manifest={
+ "Finding",
+ },
+)
+
+
+class Finding(proto.Message):
+ r"""A Finding resource represents a vulnerability instance
+ identified during a ScanRun.
+
+ Attributes:
+ name (str):
+ The resource name of the Finding. The name
+ follows the format of
+            'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}/findings/{findingId}'.
+ The finding IDs are generated by the system.
+ finding_type (str):
+ The type of the Finding.
+ Detailed and up-to-date information on findings
+ can be found here:
+ https://cloud.google.com/security-scanner/docs/scan-result-details
+ http_method (str):
+ The http method of the request that triggered
+ the vulnerability, in uppercase.
+ fuzzed_url (str):
+ The URL produced by the server-side fuzzer
+ and used in the request that triggered the
+ vulnerability.
+ body (str):
+ The body of the request that triggered the
+ vulnerability.
+ description (str):
+ The description of the vulnerability.
+ reproduction_url (str):
+ The URL containing human-readable payload
+ that user can leverage to reproduce the
+ vulnerability.
+ frame_url (str):
+ If the vulnerability was originated from
+ nested IFrame, the immediate parent IFrame is
+ reported.
+ final_url (str):
+ The URL where the browser lands when the
+ vulnerability is detected.
+ tracking_id (str):
+ The tracking ID uniquely identifies a
+ vulnerability instance across multiple ScanRuns.
+ form (google.cloud.websecurityscanner_v1beta.types.Form):
+ An addon containing information reported for
+ a vulnerability with an HTML form, if any.
+ outdated_library (google.cloud.websecurityscanner_v1beta.types.OutdatedLibrary):
+ An addon containing information about
+ outdated libraries.
+ violating_resource (google.cloud.websecurityscanner_v1beta.types.ViolatingResource):
+ An addon containing detailed information
+ regarding any resource causing the vulnerability
+ such as JavaScript sources, image, audio files,
+ etc.
+ vulnerable_headers (google.cloud.websecurityscanner_v1beta.types.VulnerableHeaders):
+ An addon containing information about
+ vulnerable or missing HTTP headers.
+ vulnerable_parameters (google.cloud.websecurityscanner_v1beta.types.VulnerableParameters):
+ An addon containing information about request
+ parameters which were found to be vulnerable.
+ xss (google.cloud.websecurityscanner_v1beta.types.Xss):
+ An addon containing information reported for
+ an XSS, if any.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ finding_type: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ http_method: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ fuzzed_url: str = proto.Field(
+ proto.STRING,
+ number=4,
+ )
+ body: str = proto.Field(
+ proto.STRING,
+ number=5,
+ )
+ description: str = proto.Field(
+ proto.STRING,
+ number=6,
+ )
+ reproduction_url: str = proto.Field(
+ proto.STRING,
+ number=7,
+ )
+ frame_url: str = proto.Field(
+ proto.STRING,
+ number=8,
+ )
+ final_url: str = proto.Field(
+ proto.STRING,
+ number=9,
+ )
+ tracking_id: str = proto.Field(
+ proto.STRING,
+ number=10,
+ )
+ form: finding_addon.Form = proto.Field(
+ proto.MESSAGE,
+ number=16,
+ message=finding_addon.Form,
+ )
+ outdated_library: finding_addon.OutdatedLibrary = proto.Field(
+ proto.MESSAGE,
+ number=11,
+ message=finding_addon.OutdatedLibrary,
+ )
+ violating_resource: finding_addon.ViolatingResource = proto.Field(
+ proto.MESSAGE,
+ number=12,
+ message=finding_addon.ViolatingResource,
+ )
+ vulnerable_headers: finding_addon.VulnerableHeaders = proto.Field(
+ proto.MESSAGE,
+ number=15,
+ message=finding_addon.VulnerableHeaders,
+ )
+ vulnerable_parameters: finding_addon.VulnerableParameters = proto.Field(
+ proto.MESSAGE,
+ number=13,
+ message=finding_addon.VulnerableParameters,
+ )
+ xss: finding_addon.Xss = proto.Field(
+ proto.MESSAGE,
+ number=14,
+ message=finding_addon.Xss,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/finding_addon.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/finding_addon.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/finding_addon.py
@@ -0,0 +1,182 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1beta",
+ manifest={
+ "Form",
+ "OutdatedLibrary",
+ "ViolatingResource",
+ "VulnerableParameters",
+ "VulnerableHeaders",
+ "Xss",
+ },
+)
+
+
+class Form(proto.Message):
+    r"""Information about a vulnerability with an HTML form.
+
+    Attributes:
+        action_uri (str):
+            The URI where to send the form when it's
+            submitted.
+        fields (MutableSequence[str]):
+            The names of form fields related to the
+            vulnerability.
+ """
+
+ action_uri: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ fields: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=2,
+ )
+
+
+class OutdatedLibrary(proto.Message):
+ r"""Information reported for an outdated library.
+
+ Attributes:
+ library_name (str):
+ The name of the outdated library.
+ version (str):
+ The version number.
+ learn_more_urls (MutableSequence[str]):
+ URLs to learn more information about the
+ vulnerabilities in the library.
+ """
+
+ library_name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ version: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ learn_more_urls: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=3,
+ )
+
+
+class ViolatingResource(proto.Message):
+ r"""Information regarding any resource causing the vulnerability
+ such as JavaScript sources, image, audio files, etc.
+
+ Attributes:
+ content_type (str):
+ The MIME type of this resource.
+ resource_url (str):
+ URL of this violating resource.
+ """
+
+ content_type: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ resource_url: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class VulnerableParameters(proto.Message):
+ r"""Information about vulnerable request parameters.
+
+ Attributes:
+ parameter_names (MutableSequence[str]):
+ The vulnerable parameter names.
+ """
+
+ parameter_names: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=1,
+ )
+
+
+class VulnerableHeaders(proto.Message):
+ r"""Information about vulnerable or missing HTTP Headers.
+
+ Attributes:
+ headers (MutableSequence[google.cloud.websecurityscanner_v1beta.types.VulnerableHeaders.Header]):
+ List of vulnerable headers.
+ missing_headers (MutableSequence[google.cloud.websecurityscanner_v1beta.types.VulnerableHeaders.Header]):
+ List of missing headers.
+ """
+
+ class Header(proto.Message):
+        r"""Describes an HTTP Header.
+
+ Attributes:
+ name (str):
+ Header name.
+ value (str):
+ Header value.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ value: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+ headers: MutableSequence[Header] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=Header,
+ )
+ missing_headers: MutableSequence[Header] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=2,
+ message=Header,
+ )
+
+
+class Xss(proto.Message):
+ r"""Information reported for an XSS.
+
+ Attributes:
+ stack_traces (MutableSequence[str]):
+ Stack traces leading to the point where the
+ XSS occurred.
+ error_message (str):
+            An error message generated by a JavaScript
+ breakage.
+ """
+
+ stack_traces: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=1,
+ )
+ error_message: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
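The nested `Header` message above backs two repeated fields of `VulnerableHeaders`. A minimal dataclass sketch of that shape (hypothetical, not the real proto-plus classes) shows how vulnerable and missing headers are kept in separate lists:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Header:
    """One HTTP header observed (or absent) during the scan."""

    name: str = ""
    value: str = ""


@dataclass
class VulnerableHeadersSketch:
    """Stand-in for VulnerableHeaders: two repeated Header fields."""

    headers: List[Header] = field(default_factory=list)
    missing_headers: List[Header] = field(default_factory=list)
```

For example, a finding flagging an absent Content-Security-Policy header would populate only `missing_headers`, leaving `headers` empty.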
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/finding_type_stats.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/finding_type_stats.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/finding_type_stats.py
@@ -0,0 +1,52 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1beta",
+ manifest={
+ "FindingTypeStats",
+ },
+)
+
+
+class FindingTypeStats(proto.Message):
+ r"""A FindingTypeStats resource represents stats regarding a
+ specific FindingType of Findings under a given ScanRun.
+
+ Attributes:
+ finding_type (str):
+ The finding type associated with the stats.
+ finding_count (int):
+ The count of findings belonging to this
+ finding type.
+ """
+
+ finding_type: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ finding_count: int = proto.Field(
+ proto.INT32,
+ number=2,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/scan_config.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/scan_config.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/scan_config.py
@@ -0,0 +1,319 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import timestamp_pb2 # type: ignore
+import proto # type: ignore
+
+from google.cloud.websecurityscanner_v1beta.types import scan_run
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1beta",
+ manifest={
+ "ScanConfig",
+ },
+)
+
+
+class ScanConfig(proto.Message):
+ r"""A ScanConfig resource contains the configurations to launch a
+ scan.
+
+ Attributes:
+ name (str):
+ The resource name of the ScanConfig. The name
+ follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+ The ScanConfig IDs are generated by the system.
+ display_name (str):
+ Required. The user provided display name of
+ the ScanConfig.
+ max_qps (int):
+ The maximum QPS during scanning. A valid value ranges from 5
+            to 20 inclusively. If the field is unspecified or its value
+            is set to 0, the server will default to 15. Other values
+            outside of the [5, 20] range will be rejected with an
+            INVALID_ARGUMENT error.
+ starting_urls (MutableSequence[str]):
+ Required. The starting URLs from which the
+ scanner finds site pages.
+ authentication (google.cloud.websecurityscanner_v1beta.types.ScanConfig.Authentication):
+ The authentication configuration. If
+ specified, service will use the authentication
+ configuration during scanning.
+ user_agent (google.cloud.websecurityscanner_v1beta.types.ScanConfig.UserAgent):
+ The user agent used during scanning.
+ blacklist_patterns (MutableSequence[str]):
+ The blacklist URL patterns as described in
+ https://cloud.google.com/security-scanner/docs/excluded-urls
+ schedule (google.cloud.websecurityscanner_v1beta.types.ScanConfig.Schedule):
+ The schedule of the ScanConfig.
+ target_platforms (MutableSequence[google.cloud.websecurityscanner_v1beta.types.ScanConfig.TargetPlatform]):
+ Set of Cloud Platforms targeted by the scan. If empty,
+ APP_ENGINE will be used as a default.
+ export_to_security_command_center (google.cloud.websecurityscanner_v1beta.types.ScanConfig.ExportToSecurityCommandCenter):
+ Controls export of scan configurations and
+ results to Cloud Security Command Center.
+ latest_run (google.cloud.websecurityscanner_v1beta.types.ScanRun):
+ Latest ScanRun if available.
+ risk_level (google.cloud.websecurityscanner_v1beta.types.ScanConfig.RiskLevel):
+ The risk level selected for the scan
+            The risk level selected for the scan.
+
+ class UserAgent(proto.Enum):
+ r"""Type of user agents used for scanning.
+
+ Values:
+ USER_AGENT_UNSPECIFIED (0):
+ The user agent is unknown. Service will default to
+ CHROME_LINUX.
+ CHROME_LINUX (1):
+ Chrome on Linux. This is the service default
+ if unspecified.
+ CHROME_ANDROID (2):
+ Chrome on Android.
+ SAFARI_IPHONE (3):
+                Safari on iPhone.
+ """
+ USER_AGENT_UNSPECIFIED = 0
+ CHROME_LINUX = 1
+ CHROME_ANDROID = 2
+ SAFARI_IPHONE = 3
+
+ class TargetPlatform(proto.Enum):
+ r"""Cloud platforms supported by Cloud Web Security Scanner.
+
+ Values:
+ TARGET_PLATFORM_UNSPECIFIED (0):
+ The target platform is unknown. Requests with this enum
+ value will be rejected with INVALID_ARGUMENT error.
+ APP_ENGINE (1):
+ Google App Engine service.
+ COMPUTE (2):
+ Google Compute Engine service.
+ """
+ TARGET_PLATFORM_UNSPECIFIED = 0
+ APP_ENGINE = 1
+ COMPUTE = 2
+
+ class RiskLevel(proto.Enum):
+ r"""Scan risk levels supported by Cloud Web Security Scanner. LOW
+ impact scanning will minimize requests with the potential to
+ modify data. To achieve the maximum scan coverage, NORMAL risk
+ level is recommended.
+
+ Values:
+ RISK_LEVEL_UNSPECIFIED (0):
+ Use default, which is NORMAL.
+ NORMAL (1):
+ Normal scanning (Recommended)
+ LOW (2):
+ Lower impact scanning
+ """
+ RISK_LEVEL_UNSPECIFIED = 0
+ NORMAL = 1
+ LOW = 2
+
+ class ExportToSecurityCommandCenter(proto.Enum):
+ r"""Controls export of scan configurations and results to Cloud
+ Security Command Center.
+
+ Values:
+ EXPORT_TO_SECURITY_COMMAND_CENTER_UNSPECIFIED (0):
+ Use default, which is ENABLED.
+ ENABLED (1):
+ Export results of this scan to Cloud Security
+ Command Center.
+ DISABLED (2):
+ Do not export results of this scan to Cloud
+ Security Command Center.
+ """
+ EXPORT_TO_SECURITY_COMMAND_CENTER_UNSPECIFIED = 0
+ ENABLED = 1
+ DISABLED = 2
+
+ class Authentication(proto.Message):
+ r"""Scan authentication configuration.
+
+ This message has `oneof`_ fields (mutually exclusive fields).
+ For each oneof, at most one member field can be set at the same time.
+ Setting any member of the oneof automatically clears all other
+ members.
+
+ .. _oneof: https://proto-plus-python.readthedocs.io/en/stable/fields.html#oneofs-mutually-exclusive-fields
+
+ Attributes:
+ google_account (google.cloud.websecurityscanner_v1beta.types.ScanConfig.Authentication.GoogleAccount):
+ Authentication using a Google account.
+
+ This field is a member of `oneof`_ ``authentication``.
+ custom_account (google.cloud.websecurityscanner_v1beta.types.ScanConfig.Authentication.CustomAccount):
+ Authentication using a custom account.
+
+ This field is a member of `oneof`_ ``authentication``.
+ """
+
+ class GoogleAccount(proto.Message):
+ r"""Describes authentication configuration that uses a Google
+ account.
+
+ Attributes:
+ username (str):
+ Required. The user name of the Google
+ account.
+ password (str):
+ Required. Input only. The password of the
+ Google account. The credential is stored
+ encrypted and not returned in any response nor
+ included in audit logs.
+ """
+
+ username: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ password: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+ class CustomAccount(proto.Message):
+ r"""Describes authentication configuration that uses a custom
+ account.
+
+ Attributes:
+ username (str):
+ Required. The user name of the custom
+ account.
+ password (str):
+ Required. Input only. The password of the
+ custom account. The credential is stored
+ encrypted and not returned in any response nor
+ included in audit logs.
+ login_url (str):
+ Required. The login form URL of the website.
+ """
+
+ username: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ password: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ login_url: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+
+ google_account: "ScanConfig.Authentication.GoogleAccount" = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ oneof="authentication",
+ message="ScanConfig.Authentication.GoogleAccount",
+ )
+ custom_account: "ScanConfig.Authentication.CustomAccount" = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ oneof="authentication",
+ message="ScanConfig.Authentication.CustomAccount",
+ )
+
+ class Schedule(proto.Message):
+ r"""Scan schedule configuration.
+
+ Attributes:
+ schedule_time (google.protobuf.timestamp_pb2.Timestamp):
+                A timestamp indicating when the next run will
+ be scheduled. The value is refreshed by the
+ server after each run. If unspecified, it will
+ default to current server time, which means the
+ scan will be scheduled to start immediately.
+ interval_duration_days (int):
+ Required. The duration of time between
+ executions in days.
+ """
+
+ schedule_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=1,
+ message=timestamp_pb2.Timestamp,
+ )
+ interval_duration_days: int = proto.Field(
+ proto.INT32,
+ number=2,
+ )
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ display_name: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ max_qps: int = proto.Field(
+ proto.INT32,
+ number=3,
+ )
+ starting_urls: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=4,
+ )
+ authentication: Authentication = proto.Field(
+ proto.MESSAGE,
+ number=5,
+ message=Authentication,
+ )
+ user_agent: UserAgent = proto.Field(
+ proto.ENUM,
+ number=6,
+ enum=UserAgent,
+ )
+ blacklist_patterns: MutableSequence[str] = proto.RepeatedField(
+ proto.STRING,
+ number=7,
+ )
+ schedule: Schedule = proto.Field(
+ proto.MESSAGE,
+ number=8,
+ message=Schedule,
+ )
+ target_platforms: MutableSequence[TargetPlatform] = proto.RepeatedField(
+ proto.ENUM,
+ number=9,
+ enum=TargetPlatform,
+ )
+ export_to_security_command_center: ExportToSecurityCommandCenter = proto.Field(
+ proto.ENUM,
+ number=10,
+ enum=ExportToSecurityCommandCenter,
+ )
+ latest_run: scan_run.ScanRun = proto.Field(
+ proto.MESSAGE,
+ number=11,
+ message=scan_run.ScanRun,
+ )
+ risk_level: RiskLevel = proto.Field(
+ proto.ENUM,
+ number=12,
+ enum=RiskLevel,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/scan_config_error.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/scan_config_error.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/scan_config_error.py
@@ -0,0 +1,241 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1beta",
+ manifest={
+ "ScanConfigError",
+ },
+)
+
+
+class ScanConfigError(proto.Message):
+ r"""Defines a custom error message used by CreateScanConfig and
+ UpdateScanConfig APIs when scan configuration validation fails.
+ It is also reported as part of a ScanRunErrorTrace message if
+ scan validation fails due to a scan configuration error.
+
+ Attributes:
+ code (google.cloud.websecurityscanner_v1beta.types.ScanConfigError.Code):
+ Indicates the reason code for a configuration
+ failure.
+ field_name (str):
+ Indicates the full name of the ScanConfig field that
+ triggers this error, for example "scan_config.max_qps". This
+ field is provided for troubleshooting purposes only and its
+ actual value can change in the future.
+ """
+
+ class Code(proto.Enum):
+ r"""Output only.
+ Defines an error reason code.
+ Next id: 44
+
+ Values:
+ CODE_UNSPECIFIED (0):
+ There is no error.
+ OK (0):
+ There is no error.
+ INTERNAL_ERROR (1):
+ Indicates an internal server error.
+ Please DO NOT USE THIS ERROR CODE unless the
+ root cause is truly unknown.
+ APPENGINE_API_BACKEND_ERROR (2):
+ One of the seed URLs is an App Engine URL but
+ we cannot validate the scan settings due to an
+ App Engine API backend error.
+ APPENGINE_API_NOT_ACCESSIBLE (3):
+ One of the seed URLs is an App Engine URL but
+ we cannot access the App Engine API to validate
+ scan settings.
+ APPENGINE_DEFAULT_HOST_MISSING (4):
+ One of the seed URLs is an App Engine URL but
+ the Default Host of the App Engine is not set.
+ CANNOT_USE_GOOGLE_COM_ACCOUNT (6):
+ Google corporate accounts can not be used for
+ scanning.
+ CANNOT_USE_OWNER_ACCOUNT (7):
+ The account of the scan creator can not be
+ used for scanning.
+ COMPUTE_API_BACKEND_ERROR (8):
+ This scan targets Compute Engine, but we
+ cannot validate scan settings due to a Compute
+ Engine API backend error.
+ COMPUTE_API_NOT_ACCESSIBLE (9):
+ This scan targets Compute Engine, but we
+ cannot access the Compute Engine API to validate
+ the scan settings.
+ CUSTOM_LOGIN_URL_DOES_NOT_BELONG_TO_CURRENT_PROJECT (10):
+ The Custom Login URL does not belong to the
+ current project.
+ CUSTOM_LOGIN_URL_MALFORMED (11):
+ The Custom Login URL is malformed (can not be
+ parsed).
+ CUSTOM_LOGIN_URL_MAPPED_TO_NON_ROUTABLE_ADDRESS (12):
+ The Custom Login URL is mapped to a
+ non-routable IP address in DNS.
+ CUSTOM_LOGIN_URL_MAPPED_TO_UNRESERVED_ADDRESS (13):
+ The Custom Login URL is mapped to an IP
+ address which is not reserved for the current
+ project.
+ CUSTOM_LOGIN_URL_HAS_NON_ROUTABLE_IP_ADDRESS (14):
+ The Custom Login URL has a non-routable IP
+ address.
+ CUSTOM_LOGIN_URL_HAS_UNRESERVED_IP_ADDRESS (15):
+ The Custom Login URL has an IP address which
+ is not reserved for the current project.
+ DUPLICATE_SCAN_NAME (16):
+ Another scan with the same name
+ (case-sensitive) already exists.
+ INVALID_FIELD_VALUE (18):
+ A field is set to an invalid value.
+ FAILED_TO_AUTHENTICATE_TO_TARGET (19):
+ There was an error trying to authenticate to
+ the scan target.
+ FINDING_TYPE_UNSPECIFIED (20):
+ Finding type value is not specified in the
+ list findings request.
+ FORBIDDEN_TO_SCAN_COMPUTE (21):
+ Scan targets Compute Engine, yet current
+ project was not whitelisted for Google Compute
+ Engine Scanning Alpha access.
+ FORBIDDEN_UPDATE_TO_MANAGED_SCAN (43):
+                User tries to update a managed scan.
+ MALFORMED_FILTER (22):
+ The supplied filter is malformed. For
+ example, it can not be parsed, does not have a
+ filter type in expression, or the same filter
+ type appears more than once.
+ MALFORMED_RESOURCE_NAME (23):
+ The supplied resource name is malformed (can
+ not be parsed).
+ PROJECT_INACTIVE (24):
+ The current project is not in an active
+ state.
+ REQUIRED_FIELD (25):
+ A required field is not set.
+ RESOURCE_NAME_INCONSISTENT (26):
+ Project id, scanconfig id, scanrun id, or
+ finding id are not consistent with each other in
+ resource name.
+ SCAN_ALREADY_RUNNING (27):
+ The scan being requested to start is already
+ running.
+ SCAN_NOT_RUNNING (28):
+ The scan that was requested to be stopped is
+ not running.
+ SEED_URL_DOES_NOT_BELONG_TO_CURRENT_PROJECT (29):
+ One of the seed URLs does not belong to the
+ current project.
+ SEED_URL_MALFORMED (30):
+ One of the seed URLs is malformed (can not be
+ parsed).
+ SEED_URL_MAPPED_TO_NON_ROUTABLE_ADDRESS (31):
+ One of the seed URLs is mapped to a
+ non-routable IP address in DNS.
+ SEED_URL_MAPPED_TO_UNRESERVED_ADDRESS (32):
+ One of the seed URLs is mapped to an IP
+ address which is not reserved for the current
+ project.
+ SEED_URL_HAS_NON_ROUTABLE_IP_ADDRESS (33):
+            One of the seed URLs has a non-routable
+            IP address.
+ SEED_URL_HAS_UNRESERVED_IP_ADDRESS (35):
+ One of the seed URLs has an IP address that
+ is not reserved for the current project.
+ SERVICE_ACCOUNT_NOT_CONFIGURED (36):
+ The Cloud Security Scanner service account is
+ not configured under the project.
+ TOO_MANY_SCANS (37):
+ A project has reached the maximum number of
+ scans.
+ UNABLE_TO_RESOLVE_PROJECT_INFO (38):
+ Resolving the details of the current project
+ fails.
+ UNSUPPORTED_BLACKLIST_PATTERN_FORMAT (39):
+ One or more blacklist patterns were in the
+ wrong format.
+ UNSUPPORTED_FILTER (40):
+ The supplied filter is not supported.
+ UNSUPPORTED_FINDING_TYPE (41):
+ The supplied finding type is not supported.
+ For example, we do not provide findings of the
+ given finding type.
+ UNSUPPORTED_URL_SCHEME (42):
+ The URL scheme of one or more of the supplied
+ URLs is not supported.
+ """
+ _pb_options = {"allow_alias": True}
+ CODE_UNSPECIFIED = 0
+ OK = 0
+ INTERNAL_ERROR = 1
+ APPENGINE_API_BACKEND_ERROR = 2
+ APPENGINE_API_NOT_ACCESSIBLE = 3
+ APPENGINE_DEFAULT_HOST_MISSING = 4
+ CANNOT_USE_GOOGLE_COM_ACCOUNT = 6
+ CANNOT_USE_OWNER_ACCOUNT = 7
+ COMPUTE_API_BACKEND_ERROR = 8
+ COMPUTE_API_NOT_ACCESSIBLE = 9
+ CUSTOM_LOGIN_URL_DOES_NOT_BELONG_TO_CURRENT_PROJECT = 10
+ CUSTOM_LOGIN_URL_MALFORMED = 11
+ CUSTOM_LOGIN_URL_MAPPED_TO_NON_ROUTABLE_ADDRESS = 12
+ CUSTOM_LOGIN_URL_MAPPED_TO_UNRESERVED_ADDRESS = 13
+ CUSTOM_LOGIN_URL_HAS_NON_ROUTABLE_IP_ADDRESS = 14
+ CUSTOM_LOGIN_URL_HAS_UNRESERVED_IP_ADDRESS = 15
+ DUPLICATE_SCAN_NAME = 16
+ INVALID_FIELD_VALUE = 18
+ FAILED_TO_AUTHENTICATE_TO_TARGET = 19
+ FINDING_TYPE_UNSPECIFIED = 20
+ FORBIDDEN_TO_SCAN_COMPUTE = 21
+ FORBIDDEN_UPDATE_TO_MANAGED_SCAN = 43
+ MALFORMED_FILTER = 22
+ MALFORMED_RESOURCE_NAME = 23
+ PROJECT_INACTIVE = 24
+ REQUIRED_FIELD = 25
+ RESOURCE_NAME_INCONSISTENT = 26
+ SCAN_ALREADY_RUNNING = 27
+ SCAN_NOT_RUNNING = 28
+ SEED_URL_DOES_NOT_BELONG_TO_CURRENT_PROJECT = 29
+ SEED_URL_MALFORMED = 30
+ SEED_URL_MAPPED_TO_NON_ROUTABLE_ADDRESS = 31
+ SEED_URL_MAPPED_TO_UNRESERVED_ADDRESS = 32
+ SEED_URL_HAS_NON_ROUTABLE_IP_ADDRESS = 33
+ SEED_URL_HAS_UNRESERVED_IP_ADDRESS = 35
+ SERVICE_ACCOUNT_NOT_CONFIGURED = 36
+ TOO_MANY_SCANS = 37
+ UNABLE_TO_RESOLVE_PROJECT_INFO = 38
+ UNSUPPORTED_BLACKLIST_PATTERN_FORMAT = 39
+ UNSUPPORTED_FILTER = 40
+ UNSUPPORTED_FINDING_TYPE = 41
+ UNSUPPORTED_URL_SCHEME = 42
+
+ code: Code = proto.Field(
+ proto.ENUM,
+ number=1,
+ enum=Code,
+ )
+ field_name: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/scan_run.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/scan_run.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/scan_run.py
@@ -0,0 +1,182 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import timestamp_pb2 # type: ignore
+import proto # type: ignore
+
+from google.cloud.websecurityscanner_v1beta.types import (
+ scan_run_error_trace,
+ scan_run_warning_trace,
+)
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1beta",
+ manifest={
+ "ScanRun",
+ },
+)
+
+
+class ScanRun(proto.Message):
+    r"""A ScanRun is an output-only resource representing an actual
+ run of the scan. Next id: 12
+
+ Attributes:
+ name (str):
+ The resource name of the ScanRun. The name
+ follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ The ScanRun IDs are generated by the system.
+ execution_state (google.cloud.websecurityscanner_v1beta.types.ScanRun.ExecutionState):
+ The execution state of the ScanRun.
+ result_state (google.cloud.websecurityscanner_v1beta.types.ScanRun.ResultState):
+ The result state of the ScanRun. This field
+ is only available after the execution state
+ reaches "FINISHED".
+ start_time (google.protobuf.timestamp_pb2.Timestamp):
+ The time at which the ScanRun started.
+ end_time (google.protobuf.timestamp_pb2.Timestamp):
+ The time at which the ScanRun reached
+ termination state - that the ScanRun is either
+ finished or stopped by user.
+ urls_crawled_count (int):
+ The number of URLs crawled during this
+ ScanRun. If the scan is in progress, the value
+ represents the number of URLs crawled up to now.
+ urls_tested_count (int):
+ The number of URLs tested during this
+ ScanRun. If the scan is in progress, the value
+ represents the number of URLs tested up to now.
+ The number of URLs tested is usually larger than
+            the number of URLs crawled because typically a
+ crawled URL is tested with multiple test
+ payloads.
+ has_vulnerabilities (bool):
+ Whether the scan run has found any
+ vulnerabilities.
+ progress_percent (int):
+ The percentage of total completion ranging
+ from 0 to 100. If the scan is in queue, the
+ value is 0. If the scan is running, the value
+ ranges from 0 to 100. If the scan is finished,
+ the value is 100.
+ error_trace (google.cloud.websecurityscanner_v1beta.types.ScanRunErrorTrace):
+ If result_state is an ERROR, this field provides the primary
+ reason for scan's termination and more details, if such are
+ available.
+ warning_traces (MutableSequence[google.cloud.websecurityscanner_v1beta.types.ScanRunWarningTrace]):
+ A list of warnings, if such are encountered
+ during this scan run.
+ """
+
+ class ExecutionState(proto.Enum):
+ r"""Types of ScanRun execution state.
+
+ Values:
+ EXECUTION_STATE_UNSPECIFIED (0):
+ Represents an invalid state caused by
+ internal server error. This value should never
+ be returned.
+ QUEUED (1):
+ The scan is waiting in the queue.
+ SCANNING (2):
+ The scan is in progress.
+ FINISHED (3):
+ The scan is either finished or stopped by
+ user.
+ """
+ EXECUTION_STATE_UNSPECIFIED = 0
+ QUEUED = 1
+ SCANNING = 2
+ FINISHED = 3
+
+ class ResultState(proto.Enum):
+ r"""Types of ScanRun result state.
+
+ Values:
+ RESULT_STATE_UNSPECIFIED (0):
+ Default value. This value is returned when
+ the ScanRun is not yet finished.
+ SUCCESS (1):
+ The scan finished without errors.
+ ERROR (2):
+ The scan finished with errors.
+ KILLED (3):
+ The scan was terminated by user.
+ """
+ RESULT_STATE_UNSPECIFIED = 0
+ SUCCESS = 1
+ ERROR = 2
+ KILLED = 3
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ execution_state: ExecutionState = proto.Field(
+ proto.ENUM,
+ number=2,
+ enum=ExecutionState,
+ )
+ result_state: ResultState = proto.Field(
+ proto.ENUM,
+ number=3,
+ enum=ResultState,
+ )
+ start_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=4,
+ message=timestamp_pb2.Timestamp,
+ )
+ end_time: timestamp_pb2.Timestamp = proto.Field(
+ proto.MESSAGE,
+ number=5,
+ message=timestamp_pb2.Timestamp,
+ )
+ urls_crawled_count: int = proto.Field(
+ proto.INT64,
+ number=6,
+ )
+ urls_tested_count: int = proto.Field(
+ proto.INT64,
+ number=7,
+ )
+ has_vulnerabilities: bool = proto.Field(
+ proto.BOOL,
+ number=8,
+ )
+ progress_percent: int = proto.Field(
+ proto.INT32,
+ number=9,
+ )
+ error_trace: scan_run_error_trace.ScanRunErrorTrace = proto.Field(
+ proto.MESSAGE,
+ number=10,
+ message=scan_run_error_trace.ScanRunErrorTrace,
+ )
+ warning_traces: MutableSequence[
+ scan_run_warning_trace.ScanRunWarningTrace
+ ] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=11,
+ message=scan_run_warning_trace.ScanRunWarningTrace,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/scan_run_error_trace.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/scan_run_error_trace.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/scan_run_error_trace.py
@@ -0,0 +1,108 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+from google.cloud.websecurityscanner_v1beta.types import (
+ scan_config_error as gcw_scan_config_error,
+)
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1beta",
+ manifest={
+ "ScanRunErrorTrace",
+ },
+)
+
+
+class ScanRunErrorTrace(proto.Message):
+ r"""Output only.
+ Defines an error trace message for a ScanRun.
+
+ Attributes:
+ code (google.cloud.websecurityscanner_v1beta.types.ScanRunErrorTrace.Code):
+ Indicates the error reason code.
+ scan_config_error (google.cloud.websecurityscanner_v1beta.types.ScanConfigError):
+ If the scan encounters SCAN_CONFIG_ISSUE error, this field
+ has the error message encountered during scan configuration
+ validation that is performed before each scan run.
+ most_common_http_error_code (int):
+ If the scan encounters TOO_MANY_HTTP_ERRORS, this field
+ indicates the most common HTTP error code, if such is
+ available. For example, if this code is 404, the scan has
+ encountered too many NOT_FOUND responses.
+ """
+
+ class Code(proto.Enum):
+ r"""Output only.
+ Defines an error reason code.
+ Next id: 7
+
+ Values:
+ CODE_UNSPECIFIED (0):
+ Default value is never used.
+ INTERNAL_ERROR (1):
+ Indicates that the scan run failed due to an
+ internal server error.
+ SCAN_CONFIG_ISSUE (2):
+ Indicates a scan configuration error, usually due to
+ outdated ScanConfig settings, such as starting_urls or the
+ DNS configuration.
+ AUTHENTICATION_CONFIG_ISSUE (3):
+ Indicates an authentication error, usually
+ due to outdated ScanConfig authentication
+ settings.
+ TIMED_OUT_WHILE_SCANNING (4):
+ Indicates a scan operation timeout, usually
+ caused by a very large site.
+ TOO_MANY_REDIRECTS (5):
+ Indicates that a scan encountered excessive
+ redirects, either to authentication or some
+ other page outside of the scan scope.
+ TOO_MANY_HTTP_ERRORS (6):
+ Indicates that a scan encountered numerous errors from the
+ web site pages. When available, most_common_http_error_code
+ field indicates the most common HTTP error code encountered
+ during the scan.
+ """
+ CODE_UNSPECIFIED = 0
+ INTERNAL_ERROR = 1
+ SCAN_CONFIG_ISSUE = 2
+ AUTHENTICATION_CONFIG_ISSUE = 3
+ TIMED_OUT_WHILE_SCANNING = 4
+ TOO_MANY_REDIRECTS = 5
+ TOO_MANY_HTTP_ERRORS = 6
+
+ code: Code = proto.Field(
+ proto.ENUM,
+ number=1,
+ enum=Code,
+ )
+ scan_config_error: gcw_scan_config_error.ScanConfigError = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=gcw_scan_config_error.ScanConfigError,
+ )
+ most_common_http_error_code: int = proto.Field(
+ proto.INT32,
+ number=3,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/scan_run_warning_trace.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/scan_run_warning_trace.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/scan_run_warning_trace.py
@@ -0,0 +1,79 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+import proto # type: ignore
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1beta",
+ manifest={
+ "ScanRunWarningTrace",
+ },
+)
+
+
+class ScanRunWarningTrace(proto.Message):
+ r"""Output only.
+ Defines a warning trace message for ScanRun. Warning traces
+ provide customers with useful information that helps make the
+ scanning process more effective.
+
+ Attributes:
+ code (google.cloud.websecurityscanner_v1beta.types.ScanRunWarningTrace.Code):
+ Indicates the warning code.
+ """
+
+ class Code(proto.Enum):
+ r"""Output only.
+ Defines a warning message code.
+ Next id: 6
+
+ Values:
+ CODE_UNSPECIFIED (0):
+ Default value is never used.
+ INSUFFICIENT_CRAWL_RESULTS (1):
+ Indicates that a scan discovered an
+ unexpectedly low number of URLs. This is
+ sometimes caused by complex navigation features
+ or by using a single URL for numerous pages.
+ TOO_MANY_CRAWL_RESULTS (2):
+ Indicates that a scan discovered too many
+ URLs to test, or excessive redundant URLs.
+ TOO_MANY_FUZZ_TASKS (3):
+ Indicates that too many tests have been
+ generated for the scan. Customer should try
+ reducing the number of starting URLs, increasing
+ the QPS rate, or narrowing down the scope of the
+ scan using the excluded patterns.
+ BLOCKED_BY_IAP (4):
+ Indicates that a scan is blocked by IAP.
+ """
+ CODE_UNSPECIFIED = 0
+ INSUFFICIENT_CRAWL_RESULTS = 1
+ TOO_MANY_CRAWL_RESULTS = 2
+ TOO_MANY_FUZZ_TASKS = 3
+ BLOCKED_BY_IAP = 4
+
+ code: Code = proto.Field(
+ proto.ENUM,
+ number=1,
+ enum=Code,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
diff --git a/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/web_security_scanner.py b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/web_security_scanner.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/google/cloud/websecurityscanner_v1beta/types/web_security_scanner.py
@@ -0,0 +1,487 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from __future__ import annotations
+
+from typing import MutableMapping, MutableSequence
+
+from google.protobuf import field_mask_pb2 # type: ignore
+import proto # type: ignore
+
+from google.cloud.websecurityscanner_v1beta.types import (
+ finding_type_stats as gcw_finding_type_stats,
+)
+from google.cloud.websecurityscanner_v1beta.types import scan_config as gcw_scan_config
+from google.cloud.websecurityscanner_v1beta.types import crawled_url, finding
+from google.cloud.websecurityscanner_v1beta.types import scan_run
+
+__protobuf__ = proto.module(
+ package="google.cloud.websecurityscanner.v1beta",
+ manifest={
+ "CreateScanConfigRequest",
+ "DeleteScanConfigRequest",
+ "GetScanConfigRequest",
+ "ListScanConfigsRequest",
+ "UpdateScanConfigRequest",
+ "ListScanConfigsResponse",
+ "StartScanRunRequest",
+ "GetScanRunRequest",
+ "ListScanRunsRequest",
+ "ListScanRunsResponse",
+ "StopScanRunRequest",
+ "ListCrawledUrlsRequest",
+ "ListCrawledUrlsResponse",
+ "GetFindingRequest",
+ "ListFindingsRequest",
+ "ListFindingsResponse",
+ "ListFindingTypeStatsRequest",
+ "ListFindingTypeStatsResponse",
+ },
+)
+
+
+class CreateScanConfigRequest(proto.Message):
+ r"""Request for the ``CreateScanConfig`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name where the
+ scan is created, which should be a project
+ resource name in the format
+ 'projects/{projectId}'.
+ scan_config (google.cloud.websecurityscanner_v1beta.types.ScanConfig):
+ Required. The ScanConfig to be created.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ scan_config: gcw_scan_config.ScanConfig = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=gcw_scan_config.ScanConfig,
+ )
+
+
+class DeleteScanConfigRequest(proto.Message):
+ r"""Request for the ``DeleteScanConfig`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the ScanConfig
+ to be deleted. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class GetScanConfigRequest(proto.Message):
+ r"""Request for the ``GetScanConfig`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the ScanConfig
+ to be returned. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListScanConfigsRequest(proto.Message):
+ r"""Request for the ``ListScanConfigs`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name, which
+ should be a project resource name in the format
+ 'projects/{projectId}'.
+ page_token (str):
+ A token identifying a page of results to be returned. This
+ should be a ``next_page_token`` value returned from a
+ previous List request. If unspecified, the first page of
+ results is returned.
+ page_size (int):
+ The maximum number of ScanConfigs to return,
+ can be limited by server. If not specified or
+ not positive, the implementation will select a
+ reasonable value.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=3,
+ )
+
+
+class UpdateScanConfigRequest(proto.Message):
+ r"""Request for the ``UpdateScanConfigRequest`` method.
+
+ Attributes:
+ scan_config (google.cloud.websecurityscanner_v1beta.types.ScanConfig):
+ Required. The ScanConfig to be updated. The
+ name field must be set to identify the resource
+ to be updated. The values of fields not covered
+ by the mask will be ignored.
+ update_mask (google.protobuf.field_mask_pb2.FieldMask):
+ Required. The update mask applies to the resource. For the
+ ``FieldMask`` definition, see
+ https://developers.google.com/protocol-buffers/docs/reference/google.protobuf#fieldmask
+ """
+
+ scan_config: gcw_scan_config.ScanConfig = proto.Field(
+ proto.MESSAGE,
+ number=2,
+ message=gcw_scan_config.ScanConfig,
+ )
+ update_mask: field_mask_pb2.FieldMask = proto.Field(
+ proto.MESSAGE,
+ number=3,
+ message=field_mask_pb2.FieldMask,
+ )
+
+
+class ListScanConfigsResponse(proto.Message):
+ r"""Response for the ``ListScanConfigs`` method.
+
+ Attributes:
+ scan_configs (MutableSequence[google.cloud.websecurityscanner_v1beta.types.ScanConfig]):
+ The list of ScanConfigs returned.
+ next_page_token (str):
+ Token to retrieve the next page of results,
+ or empty if there are no more results in the
+ list.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ scan_configs: MutableSequence[gcw_scan_config.ScanConfig] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=gcw_scan_config.ScanConfig,
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class StartScanRunRequest(proto.Message):
+ r"""Request for the ``StartScanRun`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the ScanConfig
+ to be used. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class GetScanRunRequest(proto.Message):
+ r"""Request for the ``GetScanRun`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the ScanRun to
+ be returned. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListScanRunsRequest(proto.Message):
+ r"""Request for the ``ListScanRuns`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name, which
+ should be a scan resource name in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}'.
+ page_token (str):
+ A token identifying a page of results to be returned. This
+ should be a ``next_page_token`` value returned from a
+ previous List request. If unspecified, the first page of
+ results is returned.
+ page_size (int):
+ The maximum number of ScanRuns to return, can
+ be limited by server. If not specified or not
+ positive, the implementation will select a
+ reasonable value.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=3,
+ )
+
+
+class ListScanRunsResponse(proto.Message):
+ r"""Response for the ``ListScanRuns`` method.
+
+ Attributes:
+ scan_runs (MutableSequence[google.cloud.websecurityscanner_v1beta.types.ScanRun]):
+ The list of ScanRuns returned.
+ next_page_token (str):
+ Token to retrieve the next page of results,
+ or empty if there are no more results in the
+ list.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ scan_runs: MutableSequence[scan_run.ScanRun] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=scan_run.ScanRun,
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class StopScanRunRequest(proto.Message):
+ r"""Request for the ``StopScanRun`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the ScanRun to
+ be stopped. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListCrawledUrlsRequest(proto.Message):
+ r"""Request for the ``ListCrawledUrls`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name, which
+ should be a scan run resource name in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ page_token (str):
+ A token identifying a page of results to be returned. This
+ should be a ``next_page_token`` value returned from a
+ previous List request. If unspecified, the first page of
+ results is returned.
+ page_size (int):
+ The maximum number of CrawledUrls to return,
+ can be limited by server. If not specified or
+ not positive, the implementation will select a
+ reasonable value.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=3,
+ )
+
+
+class ListCrawledUrlsResponse(proto.Message):
+ r"""Response for the ``ListCrawledUrls`` method.
+
+ Attributes:
+ crawled_urls (MutableSequence[google.cloud.websecurityscanner_v1beta.types.CrawledUrl]):
+ The list of CrawledUrls returned.
+ next_page_token (str):
+ Token to retrieve the next page of results,
+ or empty if there are no more results in the
+ list.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ crawled_urls: MutableSequence[crawled_url.CrawledUrl] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=crawled_url.CrawledUrl,
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class GetFindingRequest(proto.Message):
+ r"""Request for the ``GetFinding`` method.
+
+ Attributes:
+ name (str):
+ Required. The resource name of the Finding to
+ be returned. The name follows the format of
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}/findings/{findingId}'.
+ """
+
+ name: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListFindingsRequest(proto.Message):
+ r"""Request for the ``ListFindings`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name, which
+ should be a scan run resource name in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ filter (str):
+ Required. The filter expression. The expression must be in
+ the format: . Supported field: 'finding_type'. Supported
+ operator: '='.
+ page_token (str):
+ A token identifying a page of results to be returned. This
+ should be a ``next_page_token`` value returned from a
+ previous List request. If unspecified, the first page of
+ results is returned.
+ page_size (int):
+            The maximum number of Findings to return;
+            this can be limited by the server. If not
+            specified or not positive, the implementation
+            will select a reasonable value.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+ filter: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+ page_token: str = proto.Field(
+ proto.STRING,
+ number=3,
+ )
+ page_size: int = proto.Field(
+ proto.INT32,
+ number=4,
+ )
+
+
+class ListFindingsResponse(proto.Message):
+ r"""Response for the ``ListFindings`` method.
+
+ Attributes:
+ findings (MutableSequence[google.cloud.websecurityscanner_v1beta.types.Finding]):
+ The list of Findings returned.
+ next_page_token (str):
+ Token to retrieve the next page of results,
+ or empty if there are no more results in the
+ list.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ findings: MutableSequence[finding.Finding] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=finding.Finding,
+ )
+ next_page_token: str = proto.Field(
+ proto.STRING,
+ number=2,
+ )
+
+
+class ListFindingTypeStatsRequest(proto.Message):
+ r"""Request for the ``ListFindingTypeStats`` method.
+
+ Attributes:
+ parent (str):
+ Required. The parent resource name, which
+ should be a scan run resource name in the format
+ 'projects/{projectId}/scanConfigs/{scanConfigId}/scanRuns/{scanRunId}'.
+ """
+
+ parent: str = proto.Field(
+ proto.STRING,
+ number=1,
+ )
+
+
+class ListFindingTypeStatsResponse(proto.Message):
+ r"""Response for the ``ListFindingTypeStats`` method.
+
+ Attributes:
+ finding_type_stats (MutableSequence[google.cloud.websecurityscanner_v1beta.types.FindingTypeStats]):
+ The list of FindingTypeStats returned.
+ """
+
+ finding_type_stats: MutableSequence[
+ gcw_finding_type_stats.FindingTypeStats
+ ] = proto.RepeatedField(
+ proto.MESSAGE,
+ number=1,
+ message=gcw_finding_type_stats.FindingTypeStats,
+ )
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
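The `List*Request`/`List*Response` messages above all follow the standard list-pagination pattern: the caller sends `page_size` and `page_token`, and the server answers with a `next_page_token` that is empty on the last page. A minimal sketch of that loop, using plain dataclasses as hypothetical stand-ins for the generated messages (no real API call is made, and `fake_server` is invented for illustration):

```python
from dataclasses import dataclass, field
from typing import List


# Hypothetical stand-ins for the generated request/response messages.
@dataclass
class ListCrawledUrlsRequest:
    parent: str = ""
    page_token: str = ""
    page_size: int = 0


@dataclass
class ListCrawledUrlsResponse:
    crawled_urls: List[str] = field(default_factory=list)
    next_page_token: str = ""


def fake_server(request: ListCrawledUrlsRequest) -> ListCrawledUrlsResponse:
    """Toy backend returning two pages of results."""
    pages = {
        "": ListCrawledUrlsResponse(["url-a", "url-b"], next_page_token="p2"),
        "p2": ListCrawledUrlsResponse(["url-c"], next_page_token=""),
    }
    return pages[request.page_token]


def list_all(parent: str) -> List[str]:
    """Walk every page until next_page_token comes back empty."""
    request = ListCrawledUrlsRequest(parent=parent, page_size=2)
    results: List[str] = []
    while True:
        response = fake_server(request)
        results.extend(response.crawled_urls)
        if not response.next_page_token:
            return results
        request.page_token = response.next_page_token
```

In practice the generated client's pager (built on the `raw_page` property above) does this loop for you; the sketch only shows the protocol the messages encode.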
diff --git a/packages/google-cloud-websecurityscanner/noxfile.py b/packages/google-cloud-websecurityscanner/noxfile.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/noxfile.py
@@ -0,0 +1,428 @@
+# -*- coding: utf-8 -*-
+#
+# Copyright 2018 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# Generated by synthtool. DO NOT EDIT!
+
+from __future__ import absolute_import
+
+import os
+import pathlib
+import re
+import shutil
+import warnings
+
+import nox
+
+BLACK_VERSION = "black==22.3.0"
+ISORT_VERSION = "isort==5.10.1"
+LINT_PATHS = ["docs", "google", "tests", "noxfile.py", "setup.py"]
+
+DEFAULT_PYTHON_VERSION = "3.9"
+
+UNIT_TEST_PYTHON_VERSIONS = ["3.7", "3.8", "3.9", "3.10", "3.11"]
+UNIT_TEST_STANDARD_DEPENDENCIES = [
+ "mock",
+ "asyncmock",
+ "pytest",
+ "pytest-cov",
+ "pytest-asyncio",
+]
+UNIT_TEST_EXTERNAL_DEPENDENCIES = []
+UNIT_TEST_LOCAL_DEPENDENCIES = []
+UNIT_TEST_DEPENDENCIES = []
+UNIT_TEST_EXTRAS = []
+UNIT_TEST_EXTRAS_BY_PYTHON = {}
+
+SYSTEM_TEST_PYTHON_VERSIONS = []
+SYSTEM_TEST_STANDARD_DEPENDENCIES = [
+ "mock",
+ "pytest",
+ "google-cloud-testutils",
+]
+SYSTEM_TEST_EXTERNAL_DEPENDENCIES = []
+SYSTEM_TEST_LOCAL_DEPENDENCIES = []
+SYSTEM_TEST_DEPENDENCIES = []
+SYSTEM_TEST_EXTRAS = []
+SYSTEM_TEST_EXTRAS_BY_PYTHON = {}
+
+CURRENT_DIRECTORY = pathlib.Path(__file__).parent.absolute()
+
+# 'docfx' is excluded since it only needs to run in 'docs-presubmit'
+nox.options.sessions = [
+ "unit",
+ "system",
+ "cover",
+ "lint",
+ "lint_setup_py",
+ "blacken",
+ "docs",
+]
+
+# Error if a python version is missing
+nox.options.error_on_missing_interpreters = True
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def lint(session):
+ """Run linters.
+
+ Returns a failure if the linters find linting errors or sufficiently
+ serious code quality issues.
+ """
+ session.install("flake8", BLACK_VERSION)
+ session.run(
+ "black",
+ "--check",
+ *LINT_PATHS,
+ )
+ session.run("flake8", "google", "tests")
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def blacken(session):
+ """Run black. Format code to uniform standard."""
+ session.install(BLACK_VERSION)
+ session.run(
+ "black",
+ *LINT_PATHS,
+ )
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def format(session):
+ """
+ Run isort to sort imports. Then run black
+ to format code to uniform standard.
+ """
+ session.install(BLACK_VERSION, ISORT_VERSION)
+ # Use the --fss option to sort imports using strict alphabetical order.
+ # See https://pycqa.github.io/isort/docs/configuration/options.html#force-sort-within-sections
+ session.run(
+ "isort",
+ "--fss",
+ *LINT_PATHS,
+ )
+ session.run(
+ "black",
+ *LINT_PATHS,
+ )
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def lint_setup_py(session):
+ """Verify that setup.py is valid (including RST check)."""
+ session.install("docutils", "pygments")
+ session.run("python", "setup.py", "check", "--restructuredtext", "--strict")
+
+
+def install_unittest_dependencies(session, *constraints):
+ standard_deps = UNIT_TEST_STANDARD_DEPENDENCIES + UNIT_TEST_DEPENDENCIES
+ session.install(*standard_deps, *constraints)
+
+ if UNIT_TEST_EXTERNAL_DEPENDENCIES:
+ warnings.warn(
+ "'unit_test_external_dependencies' is deprecated. Instead, please "
+ "use 'unit_test_dependencies' or 'unit_test_local_dependencies'.",
+ DeprecationWarning,
+ )
+ session.install(*UNIT_TEST_EXTERNAL_DEPENDENCIES, *constraints)
+
+ if UNIT_TEST_LOCAL_DEPENDENCIES:
+ session.install(*UNIT_TEST_LOCAL_DEPENDENCIES, *constraints)
+
+ if UNIT_TEST_EXTRAS_BY_PYTHON:
+ extras = UNIT_TEST_EXTRAS_BY_PYTHON.get(session.python, [])
+ elif UNIT_TEST_EXTRAS:
+ extras = UNIT_TEST_EXTRAS
+ else:
+ extras = []
+
+ if extras:
+ session.install("-e", f".[{','.join(extras)}]", *constraints)
+ else:
+ session.install("-e", ".", *constraints)
+
+
+def default(session):
+ # Install all test dependencies, then install this package in-place.
+
+ constraints_path = str(
+ CURRENT_DIRECTORY / "testing" / f"constraints-{session.python}.txt"
+ )
+ install_unittest_dependencies(session, "-c", constraints_path)
+
+ # Run py.test against the unit tests.
+ session.run(
+ "py.test",
+ "--quiet",
+ f"--junitxml=unit_{session.python}_sponge_log.xml",
+ "--cov=google",
+ "--cov=tests/unit",
+ "--cov-append",
+ "--cov-config=.coveragerc",
+ "--cov-report=",
+ "--cov-fail-under=0",
+ os.path.join("tests", "unit"),
+ *session.posargs,
+ )
+
+
+@nox.session(python=UNIT_TEST_PYTHON_VERSIONS)
+def unit(session):
+ """Run the unit test suite."""
+ default(session)
+
+
+def install_systemtest_dependencies(session, *constraints):
+
+ # Use pre-release gRPC for system tests.
+ # Exclude version 1.52.0rc1 which has a known issue.
+ # See https://github.com/grpc/grpc/issues/32163
+ session.install("--pre", "grpcio!=1.52.0rc1")
+
+ session.install(*SYSTEM_TEST_STANDARD_DEPENDENCIES, *constraints)
+
+ if SYSTEM_TEST_EXTERNAL_DEPENDENCIES:
+ session.install(*SYSTEM_TEST_EXTERNAL_DEPENDENCIES, *constraints)
+
+ if SYSTEM_TEST_LOCAL_DEPENDENCIES:
+ session.install("-e", *SYSTEM_TEST_LOCAL_DEPENDENCIES, *constraints)
+
+ if SYSTEM_TEST_DEPENDENCIES:
+ session.install("-e", *SYSTEM_TEST_DEPENDENCIES, *constraints)
+
+ if SYSTEM_TEST_EXTRAS_BY_PYTHON:
+ extras = SYSTEM_TEST_EXTRAS_BY_PYTHON.get(session.python, [])
+ elif SYSTEM_TEST_EXTRAS:
+ extras = SYSTEM_TEST_EXTRAS
+ else:
+ extras = []
+
+ if extras:
+ session.install("-e", f".[{','.join(extras)}]", *constraints)
+ else:
+ session.install("-e", ".", *constraints)
+
+
+@nox.session(python=SYSTEM_TEST_PYTHON_VERSIONS)
+def system(session):
+ """Run the system test suite."""
+ constraints_path = str(
+ CURRENT_DIRECTORY / "testing" / f"constraints-{session.python}.txt"
+ )
+ system_test_path = os.path.join("tests", "system.py")
+ system_test_folder_path = os.path.join("tests", "system")
+
+ # Check the value of `RUN_SYSTEM_TESTS` env var. It defaults to true.
+ if os.environ.get("RUN_SYSTEM_TESTS", "true") == "false":
+ session.skip("RUN_SYSTEM_TESTS is set to false, skipping")
+ # Install pyopenssl for mTLS testing.
+ if os.environ.get("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false") == "true":
+ session.install("pyopenssl")
+
+ system_test_exists = os.path.exists(system_test_path)
+ system_test_folder_exists = os.path.exists(system_test_folder_path)
+ # Sanity check: only run tests if found.
+ if not system_test_exists and not system_test_folder_exists:
+ session.skip("System tests were not found")
+
+ install_systemtest_dependencies(session, "-c", constraints_path)
+
+ # Run py.test against the system tests.
+ if system_test_exists:
+ session.run(
+ "py.test",
+ "--quiet",
+ f"--junitxml=system_{session.python}_sponge_log.xml",
+ system_test_path,
+ *session.posargs,
+ )
+ if system_test_folder_exists:
+ session.run(
+ "py.test",
+ "--quiet",
+ f"--junitxml=system_{session.python}_sponge_log.xml",
+ system_test_folder_path,
+ *session.posargs,
+ )
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def cover(session):
+ """Run the final coverage report.
+
+ This outputs the coverage report aggregating coverage from the unit
+ test runs (not system test runs), and then erases coverage data.
+ """
+ session.install("coverage", "pytest-cov")
+ session.run("coverage", "report", "--show-missing", "--fail-under=100")
+
+ session.run("coverage", "erase")
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def docs(session):
+ """Build the docs for this library."""
+
+ session.install("-e", ".")
+ session.install(
+ "sphinx==4.0.1",
+ "alabaster",
+ "recommonmark",
+ )
+
+ shutil.rmtree(os.path.join("docs", "_build"), ignore_errors=True)
+ session.run(
+ "sphinx-build",
+ "-W", # warnings as errors
+ "-T", # show full traceback on exception
+ "-N", # no colors
+ "-b",
+ "html",
+ "-d",
+ os.path.join("docs", "_build", "doctrees", ""),
+ os.path.join("docs", ""),
+ os.path.join("docs", "_build", "html", ""),
+ )
+
+
+@nox.session(python=DEFAULT_PYTHON_VERSION)
+def docfx(session):
+ """Build the docfx yaml files for this library."""
+
+ session.install("-e", ".")
+ session.install(
+ "sphinx==4.0.1",
+ "alabaster",
+ "recommonmark",
+ "gcp-sphinx-docfx-yaml",
+ )
+
+ shutil.rmtree(os.path.join("docs", "_build"), ignore_errors=True)
+ session.run(
+ "sphinx-build",
+ "-T", # show full traceback on exception
+ "-N", # no colors
+ "-D",
+ (
+ "extensions=sphinx.ext.autodoc,"
+ "sphinx.ext.autosummary,"
+ "docfx_yaml.extension,"
+ "sphinx.ext.intersphinx,"
+ "sphinx.ext.coverage,"
+ "sphinx.ext.napoleon,"
+ "sphinx.ext.todo,"
+ "sphinx.ext.viewcode,"
+ "recommonmark"
+ ),
+ "-b",
+ "html",
+ "-d",
+ os.path.join("docs", "_build", "doctrees", ""),
+ os.path.join("docs", ""),
+ os.path.join("docs", "_build", "html", ""),
+ )
+
+
+@nox.session(python="3.11")
+def prerelease_deps(session):
+ """Run all tests with prerelease versions of dependencies installed."""
+
+ # Install all dependencies
+ session.install("-e", ".[all, tests, tracing]")
+ unit_deps_all = UNIT_TEST_STANDARD_DEPENDENCIES + UNIT_TEST_EXTERNAL_DEPENDENCIES
+ session.install(*unit_deps_all)
+ system_deps_all = (
+ SYSTEM_TEST_STANDARD_DEPENDENCIES
+ + SYSTEM_TEST_EXTERNAL_DEPENDENCIES
+ + SYSTEM_TEST_EXTRAS
+ )
+ session.install(*system_deps_all)
+
+ # Because we test minimum dependency versions on the minimum Python
+ # version, the first version we test with in the unit tests sessions has a
+ # constraints file containing all dependencies and extras.
+ with open(
+ CURRENT_DIRECTORY
+ / "testing"
+ / f"constraints-{UNIT_TEST_PYTHON_VERSIONS[0]}.txt",
+ encoding="utf-8",
+ ) as constraints_file:
+ constraints_text = constraints_file.read()
+
+ # Ignore leading whitespace and comment lines.
+ constraints_deps = [
+ match.group(1)
+ for match in re.finditer(
+ r"^\s*(\S+)(?===\S+)", constraints_text, flags=re.MULTILINE
+ )
+ ]
+
+ session.install(*constraints_deps)
+
+ prerel_deps = [
+ "protobuf",
+ # dependency of grpc
+ "six",
+ "googleapis-common-protos",
+ # Exclude version 1.52.0rc1 which has a known issue. See https://github.com/grpc/grpc/issues/32163
+ "grpcio!=1.52.0rc1",
+ "grpcio-status",
+ "google-api-core",
+ "proto-plus",
+ "google-cloud-testutils",
+    # dependencies of google-cloud-testutils
+ "click",
+ ]
+
+ for dep in prerel_deps:
+ session.install("--pre", "--no-deps", "--upgrade", dep)
+
+ # Remaining dependencies
+ other_deps = [
+ "requests",
+ "google-auth",
+ ]
+ session.install(*other_deps)
+
+ # Print out prerelease package versions
+ session.run(
+ "python", "-c", "import google.protobuf; print(google.protobuf.__version__)"
+ )
+ session.run("python", "-c", "import grpc; print(grpc.__version__)")
+
+ session.run("py.test", "tests/unit")
+
+ system_test_path = os.path.join("tests", "system.py")
+ system_test_folder_path = os.path.join("tests", "system")
+
+ # Only run system tests if found.
+ if os.path.exists(system_test_path):
+ session.run(
+ "py.test",
+ "--verbose",
+ f"--junitxml=system_{session.python}_sponge_log.xml",
+ system_test_path,
+ *session.posargs,
+ )
+ if os.path.exists(system_test_folder_path):
+ session.run(
+ "py.test",
+ "--verbose",
+ f"--junitxml=system_{session.python}_sponge_log.xml",
+ system_test_folder_path,
+ *session.posargs,
+ )
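The `prerelease_deps` session above extracts bare package names from a pip constraints file with the regex `^\s*(\S+)(?===\S+)`. A small self-contained check of what that pattern captures (the sample pins are made up; only `name==version` lines match, comments do not):

```python
import re

# Hypothetical constraints file content.
constraints_text = """\
# pinned minimums
google-api-core==1.34.0
proto-plus==1.22.0
  grpcio==1.51.1
"""

# Same pattern as the noxfile: the lookahead requires `==<version>`,
# so the captured group stops at the bare package name.
constraints_deps = [
    match.group(1)
    for match in re.finditer(
        r"^\s*(\S+)(?===\S+)", constraints_text, flags=re.MULTILINE
    )
]
# constraints_deps == ["google-api-core", "proto-plus", "grpcio"]
```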
diff --git a/packages/google-cloud-websecurityscanner/scripts/fixup_websecurityscanner_v1_keywords.py b/packages/google-cloud-websecurityscanner/scripts/fixup_websecurityscanner_v1_keywords.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/scripts/fixup_websecurityscanner_v1_keywords.py
@@ -0,0 +1,188 @@
+#! /usr/bin/env python3
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import argparse
+import os
+import libcst as cst
+import pathlib
+import sys
+from typing import (Any, Callable, Dict, List, Sequence, Tuple)
+
+
+def partition(
+ predicate: Callable[[Any], bool],
+ iterator: Sequence[Any]
+) -> Tuple[List[Any], List[Any]]:
+ """A stable, out-of-place partition."""
+ results = ([], [])
+
+ for i in iterator:
+ results[int(predicate(i))].append(i)
+
+ # Returns trueList, falseList
+ return results[1], results[0]
+
+
+class websecurityscannerCallTransformer(cst.CSTTransformer):
+    CTRL_PARAMS: Tuple[str, ...] = ('retry', 'timeout', 'metadata')
+    METHOD_TO_PARAMS: Dict[str, Tuple[str, ...]] = {
+ 'create_scan_config': ('parent', 'scan_config', ),
+ 'delete_scan_config': ('name', ),
+ 'get_finding': ('name', ),
+ 'get_scan_config': ('name', ),
+ 'get_scan_run': ('name', ),
+ 'list_crawled_urls': ('parent', 'page_token', 'page_size', ),
+ 'list_findings': ('parent', 'filter', 'page_token', 'page_size', ),
+ 'list_finding_type_stats': ('parent', ),
+ 'list_scan_configs': ('parent', 'page_token', 'page_size', ),
+ 'list_scan_runs': ('parent', 'page_token', 'page_size', ),
+ 'start_scan_run': ('name', ),
+ 'stop_scan_run': ('name', ),
+ 'update_scan_config': ('scan_config', 'update_mask', ),
+ }
+
+ def leave_Call(self, original: cst.Call, updated: cst.Call) -> cst.CSTNode:
+ try:
+ key = original.func.attr.value
+ kword_params = self.METHOD_TO_PARAMS[key]
+ except (AttributeError, KeyError):
+ # Either not a method from the API or too convoluted to be sure.
+ return updated
+
+ # If the existing code is valid, keyword args come after positional args.
+ # Therefore, all positional args must map to the first parameters.
+ args, kwargs = partition(lambda a: not bool(a.keyword), updated.args)
+ if any(k.keyword.value == "request" for k in kwargs):
+ # We've already fixed this file, don't fix it again.
+ return updated
+
+ kwargs, ctrl_kwargs = partition(
+ lambda a: a.keyword.value not in self.CTRL_PARAMS,
+ kwargs
+ )
+
+ args, ctrl_args = args[:len(kword_params)], args[len(kword_params):]
+ ctrl_kwargs.extend(cst.Arg(value=a.value, keyword=cst.Name(value=ctrl))
+ for a, ctrl in zip(ctrl_args, self.CTRL_PARAMS))
+
+ request_arg = cst.Arg(
+ value=cst.Dict([
+ cst.DictElement(
+ cst.SimpleString("'{}'".format(name)),
+                    cst.Element(value=arg.value)
+ )
+ # Note: the args + kwargs looks silly, but keep in mind that
+ # the control parameters had to be stripped out, and that
+ # those could have been passed positionally or by keyword.
+ for name, arg in zip(kword_params, args + kwargs)]),
+ keyword=cst.Name("request")
+ )
+
+ return updated.with_changes(
+ args=[request_arg] + ctrl_kwargs
+ )
+
+
+def fix_files(
+ in_dir: pathlib.Path,
+ out_dir: pathlib.Path,
+ *,
+ transformer=websecurityscannerCallTransformer(),
+):
+ """Duplicate the input dir to the output dir, fixing file method calls.
+
+ Preconditions:
+ * in_dir is a real directory
+ * out_dir is a real, empty directory
+ """
+ pyfile_gen = (
+ pathlib.Path(os.path.join(root, f))
+ for root, _, files in os.walk(in_dir)
+ for f in files if os.path.splitext(f)[1] == ".py"
+ )
+
+ for fpath in pyfile_gen:
+ with open(fpath, 'r') as f:
+ src = f.read()
+
+ # Parse the code and insert method call fixes.
+ tree = cst.parse_module(src)
+ updated = tree.visit(transformer)
+
+ # Create the path and directory structure for the new file.
+ updated_path = out_dir.joinpath(fpath.relative_to(in_dir))
+ updated_path.parent.mkdir(parents=True, exist_ok=True)
+
+ # Generate the updated source file at the corresponding path.
+ with open(updated_path, 'w') as f:
+ f.write(updated.code)
+
+
+if __name__ == '__main__':
+ parser = argparse.ArgumentParser(
+ description="""Fix up source that uses the websecurityscanner client library.
+
+The existing sources are NOT overwritten but are copied to output_dir with changes made.
+
+Note: This tool operates on a best-effort basis when converting positional
+      parameters in client method calls to keyword-based parameters.
+      Cases where it WILL FAIL include
+      A) * or ** expansion in a method call.
+      B) Calls via a function or method alias (including free function calls).
+      C) Indirect or dispatched calls (e.g. the method is looked up dynamically).
+
+      These all constitute false negatives. The tool may also produce false
+      positives when an API method shares a name with another method.
+""")
+ parser.add_argument(
+ '-d',
+ '--input-directory',
+ required=True,
+ dest='input_dir',
+ help='the input directory to walk for python files to fix up',
+ )
+ parser.add_argument(
+ '-o',
+ '--output-directory',
+ required=True,
+ dest='output_dir',
+ help='the directory to output files fixed via un-flattening',
+ )
+ args = parser.parse_args()
+ input_dir = pathlib.Path(args.input_dir)
+ output_dir = pathlib.Path(args.output_dir)
+ if not input_dir.is_dir():
+ print(
+ f"input directory '{input_dir}' does not exist or is not a directory",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ if not output_dir.is_dir():
+ print(
+ f"output directory '{output_dir}' does not exist or is not a directory",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ if os.listdir(output_dir):
+ print(
+ f"output directory '{output_dir}' is not empty",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ fix_files(input_dir, output_dir)
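The `partition` helper in the script above is a stable, out-of-place two-way split: the truthy bucket comes back first, and relative order within each bucket is preserved. A standalone illustration of how `leave_Call` uses it to separate control keywords (`retry`/`timeout`/`metadata`) from API keywords (the sample keyword list is invented):

```python
from typing import Any, Callable, List, Sequence, Tuple


def partition(
    predicate: Callable[[Any], bool],
    iterator: Sequence[Any],
) -> Tuple[List[Any], List[Any]]:
    """A stable, out-of-place partition (same logic as the fixup script)."""
    results = ([], [])
    for i in iterator:
        results[int(predicate(i))].append(i)
    # Returns trueList, falseList
    return results[1], results[0]


CTRL_PARAMS = ("retry", "timeout", "metadata")
keywords = ["parent", "retry", "page_size", "timeout"]

# API keywords come back first, control keywords second; input order is kept.
api_kwargs, ctrl_kwargs = partition(lambda k: k not in CTRL_PARAMS, keywords)
# api_kwargs == ["parent", "page_size"]
# ctrl_kwargs == ["retry", "timeout"]
```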
diff --git a/packages/google-cloud-websecurityscanner/scripts/fixup_websecurityscanner_v1alpha_keywords.py b/packages/google-cloud-websecurityscanner/scripts/fixup_websecurityscanner_v1alpha_keywords.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/scripts/fixup_websecurityscanner_v1alpha_keywords.py
@@ -0,0 +1,188 @@
+#! /usr/bin/env python3
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import argparse
+import os
+import libcst as cst
+import pathlib
+import sys
+from typing import (Any, Callable, Dict, List, Sequence, Tuple)
+
+
+def partition(
+ predicate: Callable[[Any], bool],
+ iterator: Sequence[Any]
+) -> Tuple[List[Any], List[Any]]:
+ """A stable, out-of-place partition."""
+ results = ([], [])
+
+ for i in iterator:
+ results[int(predicate(i))].append(i)
+
+ # Returns trueList, falseList
+ return results[1], results[0]
+
+
+class websecurityscannerCallTransformer(cst.CSTTransformer):
+    CTRL_PARAMS: Tuple[str, ...] = ('retry', 'timeout', 'metadata')
+    METHOD_TO_PARAMS: Dict[str, Tuple[str, ...]] = {
+ 'create_scan_config': ('parent', 'scan_config', ),
+ 'delete_scan_config': ('name', ),
+ 'get_finding': ('name', ),
+ 'get_scan_config': ('name', ),
+ 'get_scan_run': ('name', ),
+ 'list_crawled_urls': ('parent', 'page_token', 'page_size', ),
+ 'list_findings': ('parent', 'filter', 'page_token', 'page_size', ),
+ 'list_finding_type_stats': ('parent', ),
+ 'list_scan_configs': ('parent', 'page_token', 'page_size', ),
+ 'list_scan_runs': ('parent', 'page_token', 'page_size', ),
+ 'start_scan_run': ('name', ),
+ 'stop_scan_run': ('name', ),
+ 'update_scan_config': ('scan_config', 'update_mask', ),
+ }
+
+ def leave_Call(self, original: cst.Call, updated: cst.Call) -> cst.CSTNode:
+ try:
+ key = original.func.attr.value
+ kword_params = self.METHOD_TO_PARAMS[key]
+ except (AttributeError, KeyError):
+ # Either not a method from the API or too convoluted to be sure.
+ return updated
+
+ # If the existing code is valid, keyword args come after positional args.
+ # Therefore, all positional args must map to the first parameters.
+ args, kwargs = partition(lambda a: not bool(a.keyword), updated.args)
+ if any(k.keyword.value == "request" for k in kwargs):
+ # We've already fixed this file, don't fix it again.
+ return updated
+
+ kwargs, ctrl_kwargs = partition(
+ lambda a: a.keyword.value not in self.CTRL_PARAMS,
+ kwargs
+ )
+
+ args, ctrl_args = args[:len(kword_params)], args[len(kword_params):]
+ ctrl_kwargs.extend(cst.Arg(value=a.value, keyword=cst.Name(value=ctrl))
+ for a, ctrl in zip(ctrl_args, self.CTRL_PARAMS))
+
+ request_arg = cst.Arg(
+ value=cst.Dict([
+ cst.DictElement(
+ cst.SimpleString("'{}'".format(name)),
+                    cst.Element(value=arg.value)
+ )
+ # Note: the args + kwargs looks silly, but keep in mind that
+ # the control parameters had to be stripped out, and that
+ # those could have been passed positionally or by keyword.
+ for name, arg in zip(kword_params, args + kwargs)]),
+ keyword=cst.Name("request")
+ )
+
+ return updated.with_changes(
+ args=[request_arg] + ctrl_kwargs
+ )
+
+
+def fix_files(
+ in_dir: pathlib.Path,
+ out_dir: pathlib.Path,
+ *,
+ transformer=websecurityscannerCallTransformer(),
+):
+ """Duplicate the input dir to the output dir, fixing file method calls.
+
+ Preconditions:
+ * in_dir is a real directory
+ * out_dir is a real, empty directory
+ """
+ pyfile_gen = (
+ pathlib.Path(os.path.join(root, f))
+ for root, _, files in os.walk(in_dir)
+ for f in files if os.path.splitext(f)[1] == ".py"
+ )
+
+ for fpath in pyfile_gen:
+ with open(fpath, 'r') as f:
+ src = f.read()
+
+ # Parse the code and insert method call fixes.
+ tree = cst.parse_module(src)
+ updated = tree.visit(transformer)
+
+ # Create the path and directory structure for the new file.
+ updated_path = out_dir.joinpath(fpath.relative_to(in_dir))
+ updated_path.parent.mkdir(parents=True, exist_ok=True)
+
+ # Generate the updated source file at the corresponding path.
+ with open(updated_path, 'w') as f:
+ f.write(updated.code)
+
+
+if __name__ == '__main__':
+ parser = argparse.ArgumentParser(
+ description="""Fix up source that uses the websecurityscanner client library.
+
+The existing sources are NOT overwritten but are copied to output_dir with changes made.
+
+Note: This tool operates on a best-effort basis when converting positional
+      parameters in client method calls to keyword-based parameters.
+      Cases where it WILL FAIL include
+      A) * or ** expansion in a method call.
+      B) Calls via a function or method alias (including free function calls).
+      C) Indirect or dispatched calls (e.g. the method is looked up dynamically).
+
+      These all constitute false negatives. The tool may also produce false
+      positives when an API method shares a name with another method.
+""")
+ parser.add_argument(
+ '-d',
+ '--input-directory',
+ required=True,
+ dest='input_dir',
+ help='the input directory to walk for python files to fix up',
+ )
+ parser.add_argument(
+ '-o',
+ '--output-directory',
+ required=True,
+ dest='output_dir',
+ help='the directory to output files fixed via un-flattening',
+ )
+ args = parser.parse_args()
+ input_dir = pathlib.Path(args.input_dir)
+ output_dir = pathlib.Path(args.output_dir)
+ if not input_dir.is_dir():
+ print(
+ f"input directory '{input_dir}' does not exist or is not a directory",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ if not output_dir.is_dir():
+ print(
+ f"output directory '{output_dir}' does not exist or is not a directory",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ if os.listdir(output_dir):
+ print(
+ f"output directory '{output_dir}' is not empty",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ fix_files(input_dir, output_dir)
diff --git a/packages/google-cloud-websecurityscanner/scripts/fixup_websecurityscanner_v1beta_keywords.py b/packages/google-cloud-websecurityscanner/scripts/fixup_websecurityscanner_v1beta_keywords.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/scripts/fixup_websecurityscanner_v1beta_keywords.py
@@ -0,0 +1,188 @@
+#! /usr/bin/env python3
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import argparse
+import os
+import libcst as cst
+import pathlib
+import sys
+from typing import (Any, Callable, Dict, List, Sequence, Tuple)
+
+
+def partition(
+ predicate: Callable[[Any], bool],
+ iterator: Sequence[Any]
+) -> Tuple[List[Any], List[Any]]:
+ """A stable, out-of-place partition."""
+ results = ([], [])
+
+ for i in iterator:
+ results[int(predicate(i))].append(i)
+
+ # Returns trueList, falseList
+ return results[1], results[0]
+
+
+class websecurityscannerCallTransformer(cst.CSTTransformer):
+    CTRL_PARAMS: Tuple[str, ...] = ('retry', 'timeout', 'metadata')
+    METHOD_TO_PARAMS: Dict[str, Tuple[str, ...]] = {
+ 'create_scan_config': ('parent', 'scan_config', ),
+ 'delete_scan_config': ('name', ),
+ 'get_finding': ('name', ),
+ 'get_scan_config': ('name', ),
+ 'get_scan_run': ('name', ),
+ 'list_crawled_urls': ('parent', 'page_token', 'page_size', ),
+ 'list_findings': ('parent', 'filter', 'page_token', 'page_size', ),
+ 'list_finding_type_stats': ('parent', ),
+ 'list_scan_configs': ('parent', 'page_token', 'page_size', ),
+ 'list_scan_runs': ('parent', 'page_token', 'page_size', ),
+ 'start_scan_run': ('name', ),
+ 'stop_scan_run': ('name', ),
+ 'update_scan_config': ('scan_config', 'update_mask', ),
+ }
+
+ def leave_Call(self, original: cst.Call, updated: cst.Call) -> cst.CSTNode:
+ try:
+ key = original.func.attr.value
+ kword_params = self.METHOD_TO_PARAMS[key]
+ except (AttributeError, KeyError):
+ # Either not a method from the API or too convoluted to be sure.
+ return updated
+
+ # If the existing code is valid, keyword args come after positional args.
+ # Therefore, all positional args must map to the first parameters.
+ args, kwargs = partition(lambda a: not bool(a.keyword), updated.args)
+ if any(k.keyword.value == "request" for k in kwargs):
+ # We've already fixed this file, don't fix it again.
+ return updated
+
+ kwargs, ctrl_kwargs = partition(
+ lambda a: a.keyword.value not in self.CTRL_PARAMS,
+ kwargs
+ )
+
+ args, ctrl_args = args[:len(kword_params)], args[len(kword_params):]
+ ctrl_kwargs.extend(cst.Arg(value=a.value, keyword=cst.Name(value=ctrl))
+ for a, ctrl in zip(ctrl_args, self.CTRL_PARAMS))
+
+ request_arg = cst.Arg(
+ value=cst.Dict([
+ cst.DictElement(
+ cst.SimpleString("'{}'".format(name)),
+                    cst.Element(value=arg.value)
+ )
+ # Note: the args + kwargs looks silly, but keep in mind that
+ # the control parameters had to be stripped out, and that
+ # those could have been passed positionally or by keyword.
+ for name, arg in zip(kword_params, args + kwargs)]),
+ keyword=cst.Name("request")
+ )
+
+ return updated.with_changes(
+ args=[request_arg] + ctrl_kwargs
+ )
+
+
+def fix_files(
+ in_dir: pathlib.Path,
+ out_dir: pathlib.Path,
+ *,
+ transformer=websecurityscannerCallTransformer(),
+):
+ """Duplicate the input dir to the output dir, fixing file method calls.
+
+ Preconditions:
+ * in_dir is a real directory
+ * out_dir is a real, empty directory
+ """
+ pyfile_gen = (
+ pathlib.Path(os.path.join(root, f))
+ for root, _, files in os.walk(in_dir)
+ for f in files if os.path.splitext(f)[1] == ".py"
+ )
+
+ for fpath in pyfile_gen:
+ with open(fpath, 'r') as f:
+ src = f.read()
+
+ # Parse the code and insert method call fixes.
+ tree = cst.parse_module(src)
+ updated = tree.visit(transformer)
+
+ # Create the path and directory structure for the new file.
+ updated_path = out_dir.joinpath(fpath.relative_to(in_dir))
+ updated_path.parent.mkdir(parents=True, exist_ok=True)
+
+ # Generate the updated source file at the corresponding path.
+ with open(updated_path, 'w') as f:
+ f.write(updated.code)
+
+
+if __name__ == '__main__':
+ parser = argparse.ArgumentParser(
+ description="""Fix up source that uses the websecurityscanner client library.
+
+The existing sources are NOT overwritten but are copied to output_dir with changes made.
+
+Note: This tool operates at a best-effort level at converting positional
+ parameters in client method calls to keyword based parameters.
+ Cases where it WILL FAIL include
+ A) * or ** expansion in a method call.
+ B) Calls via function or method alias (includes free function calls)
+ C) Indirect or dispatched calls (e.g. the method is looked up dynamically)
+
+ These all constitute false negatives. The tool will also detect false
+ positives when an API method shares a name with another method.
+""")
+ parser.add_argument(
+ '-d',
+ '--input-directory',
+ required=True,
+ dest='input_dir',
+ help='the input directory to walk for python files to fix up',
+ )
+ parser.add_argument(
+ '-o',
+ '--output-directory',
+ required=True,
+ dest='output_dir',
+ help='the directory to output files fixed via un-flattening',
+ )
+ args = parser.parse_args()
+ input_dir = pathlib.Path(args.input_dir)
+ output_dir = pathlib.Path(args.output_dir)
+ if not input_dir.is_dir():
+ print(
+ f"input directory '{input_dir}' does not exist or is not a directory",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ if not output_dir.is_dir():
+ print(
+ f"output directory '{output_dir}' does not exist or is not a directory",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ if os.listdir(output_dir):
+ print(
+ f"output directory '{output_dir}' is not empty",
+ file=sys.stderr,
+ )
+ sys.exit(-1)
+
+ fix_files(input_dir, output_dir)
diff --git a/packages/google-cloud-websecurityscanner/scripts/readme-gen/readme_gen.py b/packages/google-cloud-websecurityscanner/scripts/readme-gen/readme_gen.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/scripts/readme-gen/readme_gen.py
@@ -0,0 +1,69 @@
+#!/usr/bin/env python
+
+# Copyright 2016 Google Inc
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Generates READMEs using configuration defined in yaml."""
+
+import argparse
+import io
+import os
+import subprocess
+
+import jinja2
+import yaml
+
+
+jinja_env = jinja2.Environment(
+ trim_blocks=True,
+ loader=jinja2.FileSystemLoader(
+ os.path.abspath(os.path.join(os.path.dirname(__file__), "templates"))
+ ),
+ autoescape=True,
+)
+
+README_TMPL = jinja_env.get_template('README.tmpl.rst')
+
+
+def get_help(file):
+ return subprocess.check_output(['python', file, '--help']).decode()
+
+
+def main():
+ parser = argparse.ArgumentParser()
+ parser.add_argument('source')
+ parser.add_argument('--destination', default='README.rst')
+
+ args = parser.parse_args()
+
+ source = os.path.abspath(args.source)
+ root = os.path.dirname(source)
+ destination = os.path.join(root, args.destination)
+
+ jinja_env.globals['get_help'] = get_help
+
+ with io.open(source, 'r') as f:
+ config = yaml.load(f)
+
+ # This allows get_help to execute in the right directory.
+ os.chdir(root)
+
+ output = README_TMPL.render(config)
+
+ with io.open(destination, 'w') as f:
+ f.write(output)
+
+
+if __name__ == '__main__':
+ main()
diff --git a/packages/google-cloud-websecurityscanner/setup.py b/packages/google-cloud-websecurityscanner/setup.py
new file mode 100644
--- /dev/null
+++ b/packages/google-cloud-websecurityscanner/setup.py
@@ -0,0 +1,92 @@
+# -*- coding: utf-8 -*-
+# Copyright 2022 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import io
+import os
+
+import setuptools # type: ignore
+
+package_root = os.path.abspath(os.path.dirname(__file__))
+
+name = "google-cloud-websecurityscanner"
+
+
+description = "Google Cloud Websecurityscanner API client library"
+
+version = {}
+with open(
+ os.path.join(package_root, "google/cloud/websecurityscanner/gapic_version.py")
+) as fp:
+ exec(fp.read(), version)
+version = version["__version__"]
+
+if version[0] == "0":
+ release_status = "Development Status :: 4 - Beta"
+else:
+ release_status = "Development Status :: 5 - Production/Stable"
+
+dependencies = [
+ "google-api-core[grpc] >= 1.34.0, <3.0.0dev,!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,!=2.10.*",
+ "proto-plus >= 1.22.0, <2.0.0dev",
+ "proto-plus >= 1.22.2, <2.0.0dev; python_version>='3.11'",
+ "protobuf>=3.19.5,<5.0.0dev,!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5",
+]
+url = "https://github.com/googleapis/google-cloud-python"
+
+package_root = os.path.abspath(os.path.dirname(__file__))
+
+readme_filename = os.path.join(package_root, "README.rst")
+with io.open(readme_filename, encoding="utf-8") as readme_file:
+ readme = readme_file.read()
+
+packages = [
+ package
+ for package in setuptools.PEP420PackageFinder.find()
+ if package.startswith("google")
+]
+
+namespaces = ["google", "google.cloud"]
+
+setuptools.setup(
+ name=name,
+ version=version,
+ description=description,
+ long_description=readme,
+ author="Google LLC",
+ author_email="googleapis-packages@google.com",
+ license="Apache 2.0",
+ url=url,
+ classifiers=[
+ release_status,
+ "Intended Audience :: Developers",
+ "License :: OSI Approved :: Apache Software License",
+ "Programming Language :: Python",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3.7",
+ "Programming Language :: Python :: 3.8",
+ "Programming Language :: Python :: 3.9",
+ "Programming Language :: Python :: 3.10",
+ "Programming Language :: Python :: 3.11",
+ "Operating System :: OS Independent",
+ "Topic :: Internet",
+ ],
+ platforms="Posix; MacOS X; Windows",
+ packages=packages,
+ python_requires=">=3.7",
+ namespace_packages=namespaces,
+ install_requires=dependencies,
+ include_package_data=True,
+ zip_safe=False,
+)
| Scheduler needs to shutdown in a way compatible with SimpleQueue and Queue types for worker queue.
Currently one test is failing for the scheduler. The issue is that Python 3.7 changed the executor's `_work_queue` to be a `SimpleQueue`.
Traceback (most recent call last):
File "/Users/crwilcox/workspace/google-cloud-python/pubsub/tests/unit/pubsub_v1/subscriber/test_scheduler.py", line 51, in test_schedule
scheduler_.shutdown()
File "/Users/crwilcox/workspace/google-cloud-python/pubsub/google/cloud/pubsub_v1/subscriber/scheduler.py", line 121, in shutdown
self._executor._work_queue.
AttributeError: '_queue.SimpleQueue' object has no attribute 'queue'
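One portable way out is to empty the work queue using only the public `get_nowait()`/`queue.Empty` protocol, which both `queue.Queue` and `queue.SimpleQueue` implement, instead of reaching for the private `.queue` attribute that `SimpleQueue` lacks. This is a minimal sketch of that idea, not necessarily the fix that eventually landed:

```python
import queue


def drain_work_queue(work_queue):
    # Drain queued-but-unstarted work items without touching the private
    # ``.queue`` attribute. Both queue.Queue (older Pythons) and
    # queue.SimpleQueue (used for ThreadPoolExecutor._work_queue on newer
    # Pythons) expose get_nowait() and raise queue.Empty when exhausted.
    items = []
    while True:
        try:
            items.append(work_queue.get_nowait())
        except queue.Empty:
            return items
```

A shutdown routine can then call this on `executor._work_queue` regardless of which queue type the running interpreter uses.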
Big query TIMESTAMP field off by factor of 1000
Using the master branch. In BigQuery, TIMESTAMP fields are off by a factor of 1000 when using `fetch_data`.
See my in-line comment - https://github.com/GoogleCloudPlatform/gcloud-python/commit/41e64395661a31af8352fb2358e5c95ff5437830#diff-c56e349a3e511bfabb4e8dfcfe686bc8R739
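For context, the REST API returns TIMESTAMP cells as a stringified float of *seconds* since the Unix epoch, so converting to microseconds needs a factor of 10^6; scaling by 10^3 (as if the value were milliseconds) produces exactly the factor-of-1000 skew reported. A minimal sketch of a correct conversion:

```python
import datetime


def timestamp_from_cell(value):
    # BigQuery returns TIMESTAMP cells as a stringified float of seconds
    # since the Unix epoch (e.g. "1.438089984E9"). Interpreting that value
    # as milliseconds is the off-by-1000 bug described above.
    return datetime.datetime.fromtimestamp(
        float(value), tz=datetime.timezone.utc)
```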
Add documentation for storage.Blob's special methods (contains, next, etc)
Right now it's not really clear how to check for a key's existence (do I need to do `Key('name') in bucket`, or just `'name' in bucket`?).
We need to make sure these special methods are documented with examples in the API docs for the Key class.
Upgrading version to 0.6.0.
Using last release PR (#813) as a template.
|
Sorry just saw this. Closing in favor of #1125 (a bit more detail there)
The implementation of `bucket.__contains__` "wants" a key, but is actually willing to take a string and convert it to a key. I would be in favor of dropping that myself, and asking the user to create the key, either via `bucket.new_key(name)` or by instantiating `Key` directly.
I think it'd be annoying to need to create the key to check for existence -- I kind of like the magical "we know what to do if you pass us a string".
So:
``` python
print 'my-key-name' in bucket
print Key('my-key-name') in bucket
print bucket.new_key('my-key-name') in bucket
```
should all return a boolean IMO.
You're in luck then: we already DTRT.
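For reference, a minimal self-contained sketch of the string-or-`Key` coercion being discussed (hypothetical class names, not the real `gcloud` API):

```python
class Key:
    """Stand-in for a storage key; only the name matters here."""

    def __init__(self, name):
        self.name = name


class Bucket:
    """Hypothetical bucket whose __contains__ accepts a Key or a string."""

    def __init__(self, key_names):
        self._key_names = set(key_names)

    def __contains__(self, key_or_name):
        # Normalize either form to a bare key name before membership test.
        name = (key_or_name.name if isinstance(key_or_name, Key)
                else key_or_name)
        return name in self._key_names
```

With this shape, `'my-key-name' in bucket` and `Key('my-key-name') in bucket` both return a boolean, which is the behavior requested above.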
/cc @mwitkow-io
Running
``` bash
$ git log -1 --pretty=%H
d8bd10bb045ec1d24b854bacf16bcee7a1135185
$ git log 0.5.0..HEAD \
> | grep 'Merge pull request ' \
> | awk -F ' ' '{print $4}' \
> | sort
```
yields
#809
#810
#812
#814
#816
#820
#822
#823
#824
#826
#834
#835
#840
#842
#843
#844
#849
#852
#853
#854
#855
#856
#857
#859
#862
#863
#864
#873
#875
#876
#877
#878
#879
#880
#881
#882
#888
#891
#894
#897
#898
#901
#904
[![Coverage Status](https://coveralls.io/builds/2731090/badge)](https://coveralls.io/builds/2731090)
Coverage remained the same at 100.0% when pulling **d8bd10bb045ec1d24b854bacf16bcee7a1135185 on dhermes:cut-release-0.5.1** into **9491053699708cef29cdbefeba65572e8481f067 on GoogleCloudPlatform:master**.
Given that there are a couple of new features, as well as some backward-incompatible changes, I think this release needs to be `0.6`, rather than `0.5.1`.
This is my plan for the release notes on the tag:
``````
### General
- Adding `from_service_account_json`, `from_service_account_p12`
and `from_environment` factories to `Connection` base class.
This allows creating a connection without first creating a
set of credentials.
### Datastore
- Fixing serious bug where the transaction ID was not passed
along to transactions, effectively making a transaction act as
a batch.
- Adding scope to un-scoped credentials passed to connection
constructor.
- Enabling connection as an optional argument to all methods
and functions which use a connection.
- Requiring `exclude_from_indexes` to be a tuple or list in
`Entity` constructor.
- Requiring `projection`, `order` and `group_by` to be a
tuple or list in `Query` constructor.
- Re-purposing `datastore.get` to take a single key and moving
previous functionality to `datastore.get_multi`. Similar
changes were made for
```
datastore.put --> datastore.put_multi
```
and
```
datastore.delete --> datastore.delete_multi
```
### Storage
- Adding futures in to allow GET/PATCH to
work correctly in a batch.
- Remove default chunk size in upload/download.
- Adding lazy loaded default connection and project.
- Decoupling connection property from `Blob`/`ACL`/`Bucket`
classes.
- Enabling connection as an optional argument to all methods
and functions which use a connection.
- Adding scope to un-scoped credentials passed to connection
constructor.
- Updating public blob URL to use `storage.googleapis.com`.
- Remove `Bucket.__iter__` and `Bucket.__contains__`.
- Limiting `Bucket.make_public(recursive=True)` to cowardly refuse
if more than 256 objects are in the bucket. This is similar to
what is done in `Bucket.delete(force=True)`.
### Pub/Sub
- Auto-adding timestamp to messages.
- Adding an optional message timestamp so that
messages can be sorted.
- Adding scope to un-scoped credentials passed to connection
constructor.
- Enabling connection as an optional argument to all methods
and functions which use a connection.
- Fixing bug in `Subscription.pull` for cases when `receivedMessages`
is not returned.
``````
---
@tseaver Please give this some extra scrutiny, I'm a bit unsure of some parts.
@tseaver I upgraded to `0.6.0` but didn't change the branch name, so that we didn't have to open a new PR.
[![Coverage Status](https://coveralls.io/builds/2731280/badge)](https://coveralls.io/builds/2731280)
Coverage remained the same at 100.0% when pulling **ca4888fc4e341d45ece90e01032045230ef84c07 on dhermes:cut-release-0.5.1** into **9491053699708cef29cdbefeba65572e8481f067 on GoogleCloudPlatform:master**.
I think I would normalize the changelog into imperative mode. E.g.:
### General
- Update required version of `protobuf` to `3.0.0-alpha-1`.
- Add `from_service_account_json`, `from_service_account_p12` and `from_environment` factories to `Connection` base class, allowing creation of a connection without first creating a set of credentials.
### Datastore
- Fix serious bug where the transaction ID was not passed along to transactions, effectively making a transaction act as a batch.
- Add appropriate scope to un-scoped credentials passed to connection constructor.
- Enable `connection` as an optional argument to all methods and functions which use a connection.
- Require `exclude_from_indexes` to be a tuple or list in `Entity` constructor.
- Require `projection`, `order` and `group_by` to be a tuple or list in `Query` constructor.
- Re-purpose `datastore.get` to take a single key; move previous functionality to `datastore.get_multi`. Similar
changes were made for
```
datastore.put --> datastore.put_multi
```
and
```
datastore.delete --> datastore.delete_multi
```
### Storage
- Add support for "futures", to allow GET/PATCH to work correctly in a batch.
- Remove default chunk size in upload/download.
- Add lazily-loaded default connection and project.
- Remove connection property from `Blob`/`ACL`/`Bucket` classes.
- Enable `connection` as an optional argument to all methods and functions which use a connection.
- Add appropriate scope to un-scoped credentials passed to connection constructor.
- Update public blob URL to use `storage.googleapis.com`.
- Remove `Bucket.__iter__` and `Bucket.__contains__`.
- Limit `Bucket.make_public(recursive=True)` to abort if more than 256 objects are in the bucket. This is similar to what is done in `Bucket.delete(force=True)`.
### Pub/Sub
- Add an option to auto-add a timestamp to published messages, allowing them to be sorted / logged.
- Add appropriate scope to un-scoped credentials passed to connection constructor.
- Enable `connection` as an optional argument to all methods and functions which use a connection.
- Fix a bug in `Subscription.pull` for cases when `receivedMessages` is not returned.
Cool. Are we all good to merge then?
FWIW I didn't include "Update required version of protobuf to 3.0.0-alpha-1." because that isn't a feature to users. But am happy to include if you think it is important.
LGTM. I figured we should expose the dependency pin because it might affect how existing users upgrade (or even whether they could).
Release created https://github.com/GoogleCloudPlatform/gcloud-python/releases/tag/0.6.0
Now to wait for Travis to run and publish to PyPI (and our docs)
**UPDATE**: Like whoa it started immediately https://travis-ci.org/GoogleCloudPlatform/gcloud-python/builds/65477730
Awesome!
| 2023-06-05T17:01:06Z | [] | [] |
Traceback (most recent call last):
File "/Users/crwilcox/workspace/google-cloud-python/pubsub/tests/unit/pubsub_v1/subscriber/test_scheduler.py", line 51, in test_schedule
scheduler_.shutdown()
File "/Users/crwilcox/workspace/google-cloud-python/pubsub/google/cloud/pubsub_v1/subscriber/scheduler.py", line 121, in shutdown
self._executor._work_queue.
AttributeError: '_queue.SimpleQueue' object has no attribute 'queue'
| 5,740 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-1368 | 1ec6d636ebf2c4d618aca6b2485fbbfa5f0fde29 | diff --git a/gcloud/dns/zone.py b/gcloud/dns/zone.py
--- a/gcloud/dns/zone.py
+++ b/gcloud/dns/zone.py
@@ -15,7 +15,7 @@
"""Define API ManagedZones."""
import six
-from gcloud._helpers import _datetime_from_microseconds
+from gcloud._helpers import _rfc3339_to_datetime
from gcloud.exceptions import NotFound
from gcloud.dns.changes import Changes
from gcloud.dns.resource_record_set import ResourceRecordSet
@@ -92,10 +92,7 @@ def created(self):
:rtype: ``datetime.datetime``, or ``NoneType``
:returns: the creation time (None until set from the server).
"""
- creation_time = self._properties.get('creationTime')
- if creation_time is not None:
- # creation_time will be in milliseconds.
- return _datetime_from_microseconds(1000.0 * creation_time)
+ return self._properties.get('creationTime')
@property
def name_servers(self):
@@ -215,7 +212,8 @@ def _set_properties(self, api_response):
self._properties.clear()
cleaned = api_response.copy()
if 'creationTime' in cleaned:
- cleaned['creationTime'] = float(cleaned['creationTime'])
+ cleaned['creationTime'] = _rfc3339_to_datetime(
+ cleaned['creationTime'])
self._properties.update(cleaned)
def _build_resource(self):
| DNS doesn't parse creationTime field
I know DNS is not officially supported yet, but I'm reporting it anyway. I'm using the master `gcloud-python` repo with this code:
``` python
client = dns.Client.from_service_account_json(
'service-account.json', project='xyz-eu')
zones = client.list_zones()
```
This is the error I get:
``` python
Traceback (most recent call last):
..
zones = client.list_zones()
File "/app/src/gcloud/gcloud/dns/client.py", line 94, in list_zones
for resource in resp['managedZones']]
File "/app/src/gcloud/gcloud/dns/zone.py", line 67, in from_api_repr
zone._set_properties(resource)
File "/app/src/gcloud/gcloud/dns/zone.py", line 218, in _set_properties
cleaned['creationTime'] = float(cleaned['creationTime'])
ValueError: invalid literal for float(): 2015-04-02T13:21:24.943Z
```
| Thanks for the report!
Will be "easy" to do once #1336 is in.
Verified from the docs: https://cloud.google.com/dns/api/v1/managedZones#resource
> `creationTime`: The time that this resource was created on the server. This is in RFC3339 text format.
| 2016-01-08T00:57:50Z | [] | [] |
Traceback (most recent call last):
..
zones = client.list_zones()
File "/app/src/gcloud/gcloud/dns/client.py", line 94, in list_zones
for resource in resp['managedZones']]
File "/app/src/gcloud/gcloud/dns/zone.py", line 67, in from_api_repr
zone._set_properties(resource)
File "/app/src/gcloud/gcloud/dns/zone.py", line 218, in _set_properties
cleaned['creationTime'] = float(cleaned['creationTime'])
ValueError: invalid literal for float(): 2015-04-02T13:21:24.943Z
| 5,766 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-1440 | d56ea3d08a29969fe560ba2fed80f313a2228b23 | diff --git a/gcloud/pubsub/client.py b/gcloud/pubsub/client.py
--- a/gcloud/pubsub/client.py
+++ b/gcloud/pubsub/client.py
@@ -77,7 +77,7 @@ def list_topics(self, page_size=None, page_token=None):
resp = self.connection.api_request(method='GET', path=path,
query_params=params)
topics = [Topic.from_api_repr(resource, self)
- for resource in resp['topics']]
+ for resource in resp.get('topics', ())]
return topics, resp.get('nextPageToken')
def list_subscriptions(self, page_size=None, page_token=None,
@@ -128,7 +128,7 @@ def list_subscriptions(self, page_size=None, page_token=None,
topics = {}
subscriptions = [Subscription.from_api_repr(resource, self,
topics=topics)
- for resource in resp['subscriptions']]
+ for resource in resp.get('subscriptions', ())]
return subscriptions, resp.get('nextPageToken')
def topic(self, name, timestamp_messages=False):
| Pubsub.list_topics fails when there are no topics
[Offending line](https://github.com/GoogleCloudPlatform/gcloud-python/blob/0910f9979a45af8cc2826dd4c6ff38d9efa5ccec/gcloud/pubsub/client.py#L80). Reproduce via:
``` python
client = pubsub.Client()
>>> client.list_topics()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "gcloud/pubsub/client.py", line 80, in list_topics
for resource in resp['topics']]
KeyError: 'topics'
```
@tseaver ISTM we should locate all instances where we assume a key is present and just protect against this. The time between releases behooves us to be "protective" of users. (I realize that we've usually done it this way based on documented outputs.)
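The defensive pattern adopted in the patch above is to never index an optional response key directly; a self-contained sketch:

```python
def topics_from_response(resp):
    # The v1 API omits the 'topics' key entirely when a project has no
    # topics, so fall back to an empty sequence instead of resp['topics'].
    return list(resp.get('topics', ())), resp.get('nextPageToken')
```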
| Found while reproducing #1438
| 2016-02-02T23:07:36Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "gcloud/pubsub/client.py", line 80, in list_topics
for resource in resp['topics']]
KeyError: 'topics'
| 5,773 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-1481 | 75a395f8b7bdfc64aa69d94240e7e60d7e39b69e | diff --git a/gcloud/pubsub/message.py b/gcloud/pubsub/message.py
--- a/gcloud/pubsub/message.py
+++ b/gcloud/pubsub/message.py
@@ -71,6 +71,6 @@ def from_api_repr(cls, api_repr):
:type api_repr: dict or None
:param api_repr: The API representation of the message
"""
- data = base64.b64decode(api_repr['data'])
+ data = base64.b64decode(api_repr.get('data', b''))
return cls(data=data, message_id=api_repr['messageId'],
attributes=api_repr.get('attributes'))
| pubsub fails if data key is not present
If a message is published with a zero-length string (`topic.publish('', url=url, title=title)`), the received message has no `data` field, and a `KeyError` is raised when transforming the message from the Pub/Sub API representation.
https://github.com/GoogleCloudPlatform/gcloud-python/blob/master/gcloud/pubsub/message.py#L74
```
Traceback (most recent call last):
File "/en_notifications/en_notifications.py", line 51, in <module>
received = PS_SUBSCRIPTION.pull(max_messages=PULL_COUNT)
File "/usr/local/lib/python2.7/dist-packages/gcloud/pubsub/subscription.py", line 212, in pull
File "/usr/local/lib/python2.7/dist-packages/gcloud/pubsub/message.py", line 74, in from_api_repr
for info in response.get('receivedMessages', ())]
data = base64.b64decode(api_repr['data'])
KeyError: 'data'
```
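The patch above substitutes an empty value when `data` is absent; a self-contained sketch of that behavior:

```python
import base64


def message_data(api_repr):
    # A zero-length payload is omitted from the API representation, so
    # fall back to an empty (base64) value before decoding.
    return base64.b64decode(api_repr.get('data', b''))
```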
| Thank you for the report!
| 2016-02-17T13:41:11Z | [] | [] |
Traceback (most recent call last):
File "/en_notifications/en_notifications.py", line 51, in <module>
received = PS_SUBSCRIPTION.pull(max_messages=PULL_COUNT)
| 5,777 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-1580 | c18e96ecf33f5f7e4877480648d31e97099a8a00 | diff --git a/gcloud/_helpers.py b/gcloud/_helpers.py
--- a/gcloud/_helpers.py
+++ b/gcloud/_helpers.py
@@ -19,9 +19,10 @@
import calendar
import datetime
import os
-from threading import local as Local
+import re
import socket
import sys
+from threading import local as Local
from google.protobuf import timestamp_pb2
import six
@@ -388,6 +389,43 @@ def _datetime_to_pb_timestamp(when):
return timestamp_pb2.Timestamp(seconds=seconds, nanos=nanos)
+def _name_from_project_path(path, project, template):
+ """Validate a URI path and get the leaf object's name.
+
+ :type path: string
+ :param path: URI path containing the name.
+
+ :type project: string
+ :param project: The project associated with the request. It is
+ included for validation purposes.
+
+ :type template: string
+ :param template: Template regex describing the expected form of the path.
+ The regex must have two named groups, 'project' and
+ 'name'.
+
+ :rtype: string
+ :returns: Name parsed from ``path``.
+ :raises: :class:`ValueError` if the ``path`` is ill-formed or if
+ the project from the ``path`` does not agree with the
+ ``project`` passed in.
+ """
+ if isinstance(template, str):
+ template = re.compile(template)
+
+ match = template.match(path)
+
+ if not match:
+ raise ValueError('path did not match: %s' % (template.pattern,))
+
+ found_project = match.group('project')
+ if found_project != project:
+ raise ValueError('Project from client should agree with '
+ 'project from resource.')
+
+ return match.group('name')
+
+
try:
from pytz import UTC # pylint: disable=unused-import,wrong-import-order
except ImportError:
diff --git a/gcloud/pubsub/_helpers.py b/gcloud/pubsub/_helpers.py
--- a/gcloud/pubsub/_helpers.py
+++ b/gcloud/pubsub/_helpers.py
@@ -14,6 +14,8 @@
"""Helper functions for shared behavior."""
+from gcloud._helpers import _name_from_project_path
+
def topic_name_from_path(path, project):
"""Validate a topic URI path and get the topic name.
@@ -31,15 +33,25 @@ def topic_name_from_path(path, project):
the project from the ``path`` does not agree with the
``project`` passed in.
"""
- # PATH = 'projects/%s/topics/%s' % (PROJECT, TOPIC_NAME)
- path_parts = path.split('/')
- if (len(path_parts) != 4 or path_parts[0] != 'projects' or
- path_parts[2] != 'topics'):
- raise ValueError('Expected path to be of the form '
- 'projects/{project}/topics/{topic_name}')
- if (len(path_parts) != 4 or path_parts[0] != 'projects' or
- path_parts[2] != 'topics' or path_parts[1] != project):
- raise ValueError('Project from client should agree with '
- 'project from resource.')
-
- return path_parts[3]
+ template = r'projects/(?P<project>\w+)/topics/(?P<name>\w+)'
+ return _name_from_project_path(path, project, template)
+
+
+def subscription_name_from_path(path, project):
+ """Validate a subscription URI path and get the subscription name.
+
+ :type path: string
+ :param path: URI path for a subscription API request.
+
+ :type project: string
+ :param project: The project associated with the request. It is
+ included for validation purposes.
+
+ :rtype: string
+ :returns: subscription name parsed from ``path``.
+ :raises: :class:`ValueError` if the ``path`` is ill-formed or if
+ the project from the ``path`` does not agree with the
+ ``project`` passed in.
+ """
+ template = r'projects/(?P<project>\w+)/subscriptions/(?P<name>\w+)'
+ return _name_from_project_path(path, project, template)
diff --git a/gcloud/pubsub/client.py b/gcloud/pubsub/client.py
--- a/gcloud/pubsub/client.py
+++ b/gcloud/pubsub/client.py
@@ -80,8 +80,7 @@ def list_topics(self, page_size=None, page_token=None):
for resource in resp.get('topics', ())]
return topics, resp.get('nextPageToken')
- def list_subscriptions(self, page_size=None, page_token=None,
- topic_name=None):
+ def list_subscriptions(self, page_size=None, page_token=None):
"""List subscriptions for the project associated with this client.
See:
@@ -99,10 +98,6 @@ def list_subscriptions(self, page_size=None, page_token=None,
passed, the API will return the first page of
topics.
- :type topic_name: string
- :param topic_name: limit results to subscriptions bound to the given
- topic.
-
:rtype: tuple, (list, str)
:returns: list of :class:`gcloud.pubsub.subscription.Subscription`,
plus a "next page token" string: if not None, indicates that
@@ -117,11 +112,7 @@ def list_subscriptions(self, page_size=None, page_token=None,
if page_token is not None:
params['pageToken'] = page_token
- if topic_name is None:
- path = '/projects/%s/subscriptions' % (self.project,)
- else:
- path = '/projects/%s/topics/%s/subscriptions' % (self.project,
- topic_name)
+ path = '/projects/%s/subscriptions' % (self.project,)
resp = self.connection.api_request(method='GET', path=path,
query_params=params)
diff --git a/gcloud/pubsub/topic.py b/gcloud/pubsub/topic.py
--- a/gcloud/pubsub/topic.py
+++ b/gcloud/pubsub/topic.py
@@ -19,6 +19,7 @@
from gcloud._helpers import _datetime_to_rfc3339
from gcloud._helpers import _NOW
from gcloud.exceptions import NotFound
+from gcloud.pubsub._helpers import subscription_name_from_path
from gcloud.pubsub._helpers import topic_name_from_path
from gcloud.pubsub.subscription import Subscription
@@ -212,6 +213,51 @@ def delete(self, client=None):
client = self._require_client(client)
client.connection.api_request(method='DELETE', path=self.path)
+ def list_subscriptions(self, page_size=None, page_token=None, client=None):
+ """List subscriptions for the project associated with this client.
+
+ See:
+ https://cloud.google.com/pubsub/reference/rest/v1/projects.topics.subscriptions/list
+
+ :type page_size: int
+ :param page_size: maximum number of topics to return, If not passed,
+ defaults to a value set by the API.
+
+ :type page_token: string
+ :param page_token: opaque marker for the next "page" of topics. If not
+ passed, the API will return the first page of
+ topics.
+
+ :type client: :class:`gcloud.pubsub.client.Client` or ``NoneType``
+ :param client: the client to use. If not passed, falls back to the
+ ``client`` stored on the current topic.
+
+ :rtype: tuple, (list, str)
+ :returns: list of :class:`gcloud.pubsub.subscription.Subscription`,
+ plus a "next page token" string: if not None, indicates that
+ more topics can be retrieved with another call (pass that
+ value as ``page_token``).
+ """
+ client = self._require_client(client)
+ params = {}
+
+ if page_size is not None:
+ params['pageSize'] = page_size
+
+ if page_token is not None:
+ params['pageToken'] = page_token
+
+ path = '/projects/%s/topics/%s/subscriptions' % (
+ self.project, self.name)
+
+ resp = client.connection.api_request(method='GET', path=path,
+ query_params=params)
+ subscriptions = []
+ for sub_path in resp.get('subscriptions', ()):
+ sub_name = subscription_name_from_path(sub_path, self.project)
+ subscriptions.append(Subscription(sub_name, self))
+ return subscriptions, resp.get('nextPageToken')
+
class Batch(object):
"""Context manager: collect messages to publish via a single API call.
| list_subscriptions does not work properly
`list_subscriptions()` works but `list_subscriptions(topic_name = 'topic_name')` does not work properly
``` python
>>> from gcloud import pubsub
>>> client = pubsub.Client(project = 'my_project')
>>> client.list_subscriptions(topic_name = 'my_topic')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/gcloud/pubsub/client.py", line 131, in list_subscriptions
for resource in resp['subscriptions']]
File "/usr/local/lib/python2.7/dist-packages/gcloud/pubsub/subscription.py", line 68, in from_api_repr
topic_path = resource['topic']
TypeError: string indices must be integers
```
| ATM, `topic_name` is the third positional argument: you need to call it as a named parameter:
``` python
client.list_subscriptions(topic_name='topic_name')
```
@jgeewax, @dhermes I'm reading this issue as a request to make `topic_name` the first positional argument. Opinions?
:+1:
@tseaver
Sorry; what I really meant to say is that I wrote the call as:
``` python
subscription, next_page_tokens = client.list_subscriptions(topic_name = 'topic_name')
```
It does not work. And the issue is this line:
``` python
subscriptions = [Subscription.from_api_repr(resource, self,
topics=topics)
for resource in resp.get('subscriptions', ())]
```
The code passes an unexpected value of `resource` (maybe) when building `subscriptions`.
The output is like:
``` python
>>> from gcloud import pubsub
>>> client = pubsub.Client(project = 'my_project'')
>>> client.list_subscriptions(topic_name = 'my_topic')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/gcloud/pubsub/client.py", line 131, in list_subscriptions
for resource in resp['subscriptions']]
File "/usr/local/lib/python2.7/dist-packages/gcloud/pubsub/subscription.py", line 68, in from_api_repr
topic_path = resource['topic']
TypeError: string indices must be integers
```
@tseaver Have you reproduced?
@tseaver I reproduced:
``` python
>>> import gcloud.pubsub
>>> from gcloud import _helpers
>>> from gcloud.environment_vars import TESTS_PROJECT
>>> _helpers.PROJECT = TESTS_PROJECT
>>> C = gcloud.pubsub.Client()
>>> C.project
'FOOOBAR'
>>> C.list_subscriptions()
([], None)
>>> C.list_topics()
([], None)
>>> T = C.topic('hi-mom')
>>> T.create()
>>> S = T.subscription('this-is-mine')
>>> S.create()
>>> C.list_topics()
([<gcloud.pubsub.topic.Topic object at 0x7f7c21812310>], None)
>>> C.list_subscriptions()
([<gcloud.pubsub.subscription.Subscription object at 0x7f7c21812390>], None)
>>> C.list_subscriptions(topic_name=T.name)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "gcloud/pubsub/client.py", line 131, in list_subscriptions
for resource in resp.get('subscriptions', ())]
File "gcloud/pubsub/subscription.py", line 71, in from_api_repr
topic_path = resource['topic']
TypeError: string indices must be integers
```
``` python
In [16]: %debug
> .../gcloud-python/gcloud/pubsub/subscription.py(71)from_api_repr()
69 if topics is None:
70 topics = {}
---> 71 topic_path = resource['topic']
72 topic = topics.get(topic_path)
73 if topic is None:
ipdb> resource
u'projects/omega-moonlight-697/subscriptions/this-is-mine'
ipdb> u
> .../gcloud-python/gcloud/pubsub/client.py(131)list_subscriptions()
129 subscriptions = [Subscription.from_api_repr(resource, self,
130 topics=topics)
--> 131 for resource in resp.get('subscriptions', ())]
132 return subscriptions, resp.get('nextPageToken')
133
ipdb> resp
{u'subscriptions': [u'projects/omega-moonlight-697/subscriptions/this-is-mine']}
```
This is in contrast to what the output looks like without the topic name:
``` json
{
"subscriptions": [
{
"topic": "projects/omega-moonlight-697/topics/hi-mom",
"ackDeadlineSeconds": 10,
"pushConfig": {},
"name": "projects/omega-moonlight-697/subscriptions/this-is-mine"
}
]
}
```
OK, I see the issue: [`projects.subscriptions/list`](https://cloud.google.com/pubsub/reference/rest/v1/projects.subscriptions/list) is documented to return the full `Subscriptions` resource, whereas [`projects.topics.subscriptions/list`](https://cloud.google.com/pubsub/reference/rest/v1/projects.topics.subscriptions/list) is documented to return only a list of subscription URLs.
The API design choice here is utterly bogus, but we have to work around it.
Given the disparity, I would prefer to handle topic-based subscriptions by adding a `list_subscriptions` method to the `Topic` class.
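Given the two documented response shapes, a client that must cope with both can normalize them before building `Subscription` objects. A minimal, hypothetical sketch in plain Python (not the actual gcloud code; the helper name is made up):

```python
def subscription_name_from_resource(resource):
    """Return a subscription's short name from either response shape.

    ``projects.subscriptions/list`` returns full resource dicts, while
    ``projects.topics.subscriptions/list`` returns bare path strings.
    """
    if isinstance(resource, str):
        path = resource          # e.g. 'projects/p/subscriptions/name'
    else:
        path = resource['name']  # full resource dict
    return path.rsplit('/', 1)[-1]
```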
| 2016-03-07T18:44:49Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/gcloud/pubsub/client.py", line 131, in list_subscriptions
for resource in resp['subscriptions']]
File "/usr/local/lib/python2.7/dist-packages/gcloud/pubsub/subscription.py", line 68, in from_api_repr
topic_path = resource['topic']
TypeError: string indices must be integers
| 5,786 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-1785 | 22634abd342268f5f77b2637e9c13fb0a9492309 | diff --git a/gcloud/_helpers.py b/gcloud/_helpers.py
--- a/gcloud/_helpers.py
+++ b/gcloud/_helpers.py
@@ -45,7 +45,7 @@
\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2} # YYYY-MM-DDTHH:MM:SS
)
\. # decimal point
- (?P<nanos>\d{9}) # nanoseconds
+ (?P<nanos>\d{1,9}) # nanoseconds, maybe truncated
Z # Zulu
""", re.VERBOSE)
@@ -344,7 +344,9 @@ def _rfc3339_nanos_to_datetime(dt_str):
dt_str, _RFC3339_NANOS.pattern))
bare_seconds = datetime.datetime.strptime(
with_nanos.group('no_fraction'), _RFC3339_NO_FRACTION)
- nanos = int(with_nanos.group('nanos'))
+ fraction = with_nanos.group('nanos')
+ scale = 9 - len(fraction)
+ nanos = int(fraction) * (10 ** scale)
micros = nanos // 1000
return bare_seconds.replace(microsecond=micros, tzinfo=UTC)
| Timestamp nanos parsing broken for '2016-05-09T18:49:00.655319Z'
```
ERROR: test_log_text (logging_.TestLogging)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/system_tests/logging_.py", line 79, in test_log_text
entries, _ = logger.list_entries()
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests3/lib/python3.4/site-packages/gcloud/logging/logger.py", line 317, in list_entries
page_size=page_size, page_token=page_token)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests3/lib/python3.4/site-packages/gcloud/logging/client.py", line 141, in list_entries
for resource in resp.get('entries', ())]
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests3/lib/python3.4/site-packages/gcloud/logging/client.py", line 141, in <listcomp>
for resource in resp.get('entries', ())]
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests3/lib/python3.4/site-packages/gcloud/logging/client.py", line 79, in _entry_from_resource
return TextEntry.from_api_repr(resource, self, loggers)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests3/lib/python3.4/site-packages/gcloud/logging/entries.py", line 92, in from_api_repr
timestamp = _rfc3339_nanos_to_datetime(timestamp)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests3/lib/python3.4/site-packages/gcloud/_helpers.py", line 344, in _rfc3339_nanos_to_datetime
dt_str, _RFC3339_NANOS.pattern))
ValueError: Timestamp: '2016-05-09T18:49:00.655319Z', does not match pattern: '\n (?P<no_fraction>\n \\d{4}-\\d{2}-\\d{2}T\\d{2}:\\d{2}:\\d{2} # YYYY-MM-DDTHH:MM:SS\n )\n \\. # decimal point\n (?P<nanos>\\d{9}) # nanoseconds\n Z # Zulu\n'
```
From https://travis-ci.org/GoogleCloudPlatform/gcloud-python/builds/128927443
I haven't traced down the codepath just yet.
| Rotten tomatoes for the back-end team. "Nanosecond precision" strongly requires that the trailing zeroes be present: that is the very _definition_ of precision.
It's likely that they are optimizing for payload size; if that's the case they get an A in that category.
WDYT of changing the [nanos check](https://github.com/GoogleCloudPlatform/gcloud-python/blob/22634abd342268f5f77b2637e9c13fb0a9492309/gcloud/_helpers.py#L48) to `(?P<nanos>\d{6,9})` and then using `zfill(9)`?
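Whatever regex is chosen, the fix has to accept a truncated fraction and pad it back out to nanoseconds — and the padding must go on the *right* (`zfill` pads on the left, which would give the wrong value). A self-contained sketch of the parsing, using an assumed helper name rather than the real `gcloud._helpers` internals:

```python
import datetime
import re

_RFC3339_NANOS = re.compile(r"""
    (?P<no_fraction>
        \d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}   # YYYY-MM-DDTHH:MM:SS
    )
    \.                                        # decimal point
    (?P<nanos>\d{1,9})                        # fraction, maybe truncated
    Z                                         # Zulu
""", re.VERBOSE)


def rfc3339_nanos_to_datetime(dt_str):
    """Parse an RFC 3339 timestamp whose trailing zeros may be dropped."""
    match = _RFC3339_NANOS.match(dt_str)
    if match is None:
        raise ValueError('Bad timestamp: %r' % (dt_str,))
    bare = datetime.datetime.strptime(
        match.group('no_fraction'), '%Y-%m-%dT%H:%M:%S')
    # Pad on the *right*: '655319' means 655319000 nanoseconds.
    nanos = int(match.group('nanos').ljust(9, '0'))
    return bare.replace(microsecond=nanos // 1000)
```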
| 2016-05-10T17:21:22Z | [] | [] |
Traceback (most recent call last):
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/system_tests/logging_.py", line 79, in test_log_text
entries, _ = logger.list_entries()
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests3/lib/python3.4/site-packages/gcloud/logging/logger.py", line 317, in list_entries
page_size=page_size, page_token=page_token)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests3/lib/python3.4/site-packages/gcloud/logging/client.py", line 141, in list_entries
for resource in resp.get('entries', ())]
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests3/lib/python3.4/site-packages/gcloud/logging/client.py", line 141, in <listcomp>
for resource in resp.get('entries', ())]
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests3/lib/python3.4/site-packages/gcloud/logging/client.py", line 79, in _entry_from_resource
return TextEntry.from_api_repr(resource, self, loggers)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests3/lib/python3.4/site-packages/gcloud/logging/entries.py", line 92, in from_api_repr
timestamp = _rfc3339_nanos_to_datetime(timestamp)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests3/lib/python3.4/site-packages/gcloud/_helpers.py", line 344, in _rfc3339_nanos_to_datetime
dt_str, _RFC3339_NANOS.pattern))
ValueError: Timestamp: '2016-05-09T18:49:00.655319Z', does not match pattern: '\n (?P<no_fraction>\n \\d{4}-\\d{2}-\\d{2}T\\d{2}:\\d{2}:\\d{2} # YYYY-MM-DDTHH:MM:SS\n )\n \\. # decimal point\n (?P<nanos>\\d{9}) # nanoseconds\n Z # Zulu\n'
| 5,806 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-1825 | d4ff022d09e08b21fc6632aae461375eaccd9cb4 | diff --git a/gcloud/logging/metric.py b/gcloud/logging/metric.py
--- a/gcloud/logging/metric.py
+++ b/gcloud/logging/metric.py
@@ -14,39 +14,9 @@
"""Define Logging API Metrics."""
-import re
-
-from gcloud._helpers import _name_from_project_path
from gcloud.exceptions import NotFound
-_METRIC_TEMPLATE = re.compile(r"""
- projects/ # static prefix
- (?P<project>[^/]+) # initial letter, wordchars + hyphen
- /metrics/ # static midfix
- (?P<name>[^/]+) # initial letter, wordchars + allowed punc
-""", re.VERBOSE)
-
-
-def _metric_name_from_path(path, project):
- """Validate a metric URI path and get the metric name.
-
- :type path: string
- :param path: URI path for a metric API request.
-
- :type project: string
- :param project: The project associated with the request. It is
- included for validation purposes.
-
- :rtype: string
- :returns: Metric name parsed from ``path``.
- :raises: :class:`ValueError` if the ``path`` is ill-formed or if
- the project from the ``path`` does not agree with the
- ``project`` passed in.
- """
- return _name_from_project_path(path, project, _METRIC_TEMPLATE)
-
-
class Metric(object):
"""Metrics represent named filters for log entries.
diff --git a/gcloud/logging/sink.py b/gcloud/logging/sink.py
--- a/gcloud/logging/sink.py
+++ b/gcloud/logging/sink.py
@@ -14,36 +14,9 @@
"""Define Logging API Sinks."""
-import re
-
-from gcloud._helpers import _name_from_project_path
from gcloud.exceptions import NotFound
-_SINK_TEMPLATE = re.compile(r"""
- projects/ # static prefix
- (?P<project>[^/]+) # initial letter, wordchars + hyphen
- /sinks/ # static midfix
- (?P<name>[^/]+) # initial letter, wordchars + allowed punc
-""", re.VERBOSE)
-
-
-def _sink_name_from_path(path, project):
- """Validate a sink URI path and get the sink name.
- :type path: string
- :param path: URI path for a sink API request.
- :type project: string
- :param project: The project associated with the request. It is
- included for validation purposes.
- :rtype: string
- :returns: Metric name parsed from ``path``.
- :raises: :class:`ValueError` if the ``path`` is ill-formed or if
- the project from the ``path`` does not agree with the
- ``project`` passed in.
- """
- return _name_from_project_path(path, project, _SINK_TEMPLATE)
-
-
class Sink(object):
"""Sinks represent filtered exports for log entries.
@@ -107,7 +80,7 @@ def from_api_repr(cls, resource, client):
project from the resource does not agree with the project
from the client.
"""
- sink_name = _sink_name_from_path(resource['name'], client.project)
+ sink_name = resource['name']
filter_ = resource['filter']
destination = resource['destination']
return cls(sink_name, filter_, destination, client=client)
| path "mysink" did not match expected pattern
Looks like the Sink resource name is not the fully qualified version. Docs agree: https://cloud.google.com/logging/docs/api/ref_v2beta1/rest/v2beta1/projects.sinks#resource-logsink. I don't see a good reason why it wouldn't be the full name. Once again, I will bug people but for now I think we should disable the validation.
```
{u'filter': u'logName=projects/bill-stackdriver-experiment/logs/syslog AND severity>=ERROR', u'destination': u'storage.googleapis.com/bill-new-stackdriver', u'name': u'mysink', u'outputVersionFormat': u'V2'}
Validating sink name mysink bill-stackdriver-experiment
Traceback (most recent call last):
File "export.py", line 102, in <module>
main(args.project_id, args.destination_bucket)
File "export.py", line 84, in main
list_sinks(client)
File "export.py", line 54, in list_sinks
sinks = client.list_sinks()[0]
File "/Users/waprin/.virtualenvs/docsamples10/lib/python2.7/site-packages/gcloud/logging/client.py", line 196, in list_sinks
for resource in resp.get('sinks', ())]
File "/Users/waprin/.virtualenvs/docsamples10/lib/python2.7/site-packages/gcloud/logging/sink.py", line 112, in from_api_repr
sink_name = _sink_name_from_path(resource['name'], client.project)
File "/Users/waprin/.virtualenvs/docsamples10/lib/python2.7/site-packages/gcloud/logging/sink.py", line 45, in _sink_name_from_path
return _name_from_project_path(path, project, _SINK_TEMPLATE)
File "/Users/waprin/.virtualenvs/docsamples10/lib/python2.7/site-packages/gcloud/_helpers.py", line 458, in _name_from_project_path
path, template.pattern,))
ValueError: path "mysink" did not match expected pattern "
projects/ # static prefix
(?P<project>[^/]+) # initial letter, wordchars + hyphen
/sinks/ # static midfix
(?P<name>[^/]+) # initial letter, wordchars + allowed punc
```
| Same idea as #1808?
Yep, except it will never get fixed because they don't consider it important enough to change the API for.
@waprin Thanks for the report! 61004b72 made a similar change for metrics: we are missing the system test for `list_sinks` which would have surfaced this issue.
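Since the v2 API already returns the short sink name, `from_api_repr` can use it verbatim instead of parsing a `projects/<project>/sinks/<name>` path. A stripped-down stand-in (hypothetical, not the real `gcloud.logging.sink.Sink`) illustrating the idea:

```python
class Sink(object):
    """Stripped-down stand-in for ``gcloud.logging.sink.Sink``."""

    def __init__(self, name, filter_, destination):
        self.name = name
        self.filter_ = filter_
        self.destination = destination

    @classmethod
    def from_api_repr(cls, resource):
        # The v2 API returns the short sink name (e.g. 'mysink'), not a
        # full 'projects/<p>/sinks/<name>' path, so no parsing is needed.
        return cls(resource['name'], resource['filter'],
                   resource['destination'])
```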
| 2016-05-26T19:12:48Z | [] | [] |
Traceback (most recent call last):
File "export.py", line 102, in <module>
main(args.project_id, args.destination_bucket)
File "export.py", line 84, in main
list_sinks(client)
File "export.py", line 54, in list_sinks
sinks = client.list_sinks()[0]
File "/Users/waprin/.virtualenvs/docsamples10/lib/python2.7/site-packages/gcloud/logging/client.py", line 196, in list_sinks
for resource in resp.get('sinks', ())]
File "/Users/waprin/.virtualenvs/docsamples10/lib/python2.7/site-packages/gcloud/logging/sink.py", line 112, in from_api_repr
sink_name = _sink_name_from_path(resource['name'], client.project)
File "/Users/waprin/.virtualenvs/docsamples10/lib/python2.7/site-packages/gcloud/logging/sink.py", line 45, in _sink_name_from_path
return _name_from_project_path(path, project, _SINK_TEMPLATE)
File "/Users/waprin/.virtualenvs/docsamples10/lib/python2.7/site-packages/gcloud/_helpers.py", line 458, in _name_from_project_path
path, template.pattern,))
ValueError: path "mysink" did not match expected pattern "
| 5,812 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-1843 | 92e9362f5246ecea1a9f96ea104353d3caf7623c | diff --git a/gcloud/bigtable/happybase/connection.py b/gcloud/bigtable/happybase/connection.py
--- a/gcloud/bigtable/happybase/connection.py
+++ b/gcloud/bigtable/happybase/connection.py
@@ -20,6 +20,14 @@
import six
+from grpc.beta import interfaces
+from grpc.framework.interfaces.face import face
+
+try:
+ from happybase.hbase.ttypes import AlreadyExists
+except ImportError:
+ from gcloud.exceptions import Conflict as AlreadyExists
+
from gcloud.bigtable.client import Client
from gcloud.bigtable.column_family import GCRuleIntersection
from gcloud.bigtable.column_family import MaxAgeGCRule
@@ -338,7 +346,13 @@ def create_table(self, name, families):
# Create table instance and then make API calls.
name = self._table_name(name)
low_level_table = _LowLevelTable(name, self._cluster)
- low_level_table.create()
+ try:
+ low_level_table.create()
+ except face.NetworkError as network_err:
+ if network_err.code == interfaces.StatusCode.ALREADY_EXISTS:
+ raise AlreadyExists(name)
+ else:
+ raise
for column_family_name, gc_rule in gc_rule_dict.items():
column_family = low_level_table.column_family(
| Bigtable happybase create_table should throw AlreadyExists when table already exists.
HappyBase `happybase.Connection.create_table` throws `happybase.hbase.ttypes.AlreadyExists` when a table already exists.
https://github.com/wbolster/happybase/blob/3565b2b43b377b3e539315570d3e51d053d7a8a1/happybase/Hbase.thrift#L274
In the gcloud-python implementation, this throws an internal gRPC error.
```
Create table Hello-Bigtable
Traceback (most recent call last):
File "hello.py", line 119, in <module>
main()
File "hello.py", line 95, in main
table = create_table(connection, TABLE_NAME, COLUMN_FAMILY_NAME)
File "hello.py", line 76, in create_table
column_family_name: dict() # Use default options.
File "/usr/local/google/home/swast/venv/bigtable_helloworld/local/lib/python2.7/site-packages/gcloud/bigtable/happybase/connection.py", line 341, in create_table
low_level_table.create()
File "/usr/local/google/home/swast/venv/bigtable_helloworld/local/lib/python2.7/site-packages/gcloud/bigtable/table.py", line 177, in create
client._table_stub.CreateTable(request_pb, client.timeout_seconds)
File "/usr/local/google/home/swast/venv/bigtable_helloworld/local/lib/python2.7/site-packages/grpc/framework/crust/implementations.py", line 75, in __call__
protocol_options, metadata, request)
File "/usr/local/google/home/swast/venv/bigtable_helloworld/local/lib/python2.7/site-packages/grpc/framework/crust/_calls.py", line 109, in blocking_unary_unary
return next(rendezvous)
File "/usr/local/google/home/swast/venv/bigtable_helloworld/local/lib/python2.7/site-packages/grpc/framework/crust/_control.py", line 415, in next
raise self._termination.abortion_error
grpc.framework.interfaces.face.face.NetworkError: NetworkError(code=StatusCode.ALREADY_EXISTS, details="Error while creating table Hello-Bigtable in cluster projects/swast-bigtable-examples/zones/us-central1-c/clusters/my-cluster : Transaction Failed : Cannot re-create existing table: Hello-Bigtable")
```
I can catch this with the following, but it seems weird to differ from vanilla HappyBase in this way.
```
from gcloud import bigtable
from gcloud.bigtable import happybase
from grpc.beta import interfaces
from grpc.framework.interfaces.face import face as grpcface
TABLE_NAME = 'Hello-Bigtable'
COLUMN_FAMILY_NAME = 'cf1'
def create_table(connection, table_name, column_family_name):
try:
connection.create_table(
table_name,
{
column_family_name: dict() # Use default options.
})
except grpcface.NetworkError as exp:
if exp.code != interfaces.StatusCode.ALREADY_EXISTS:
raise
print("Table already exists.")
return connection.table(table_name)
client = bigtable.Client(project=args.project, admin=True)
with client:
cluster = client.cluster('us-central1-c', 'my-cluster')
cluster.reload()
connection = happybase.Connection(cluster=cluster)
print connection.tables()
print('Create table {0}'.format(TABLE_NAME))
table = create_table(connection, TABLE_NAME, COLUMN_FAMILY_NAME)
```
| Example of catching AlreadyExists in the wild. https://github.com/openstack/aodh/blob/dd72badd5b460d2d0c969414bb1c8c2de20be727/aodh/storage/hbase/utils.py#L221
Sorry for the delay, 15 days eek! Digging in now
BTW I really appreciate having someone familiar with vanilla HappyBase!
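The translation the workaround performs — catching the gRPC status code and re-raising HappyBase's exception — can be sketched with plain-Python stand-ins for the gRPC and Thrift types (every name below is a hypothetical substitute, not the real class):

```python
class AlreadyExists(Exception):
    """Stand-in for ``happybase.hbase.ttypes.AlreadyExists``."""


class NetworkError(Exception):
    """Stand-in for gRPC's ``face.NetworkError`` (carries a status code)."""

    def __init__(self, code):
        super(NetworkError, self).__init__(code)
        self.code = code


ALREADY_EXISTS = 'ALREADY_EXISTS'  # stand-in for StatusCode.ALREADY_EXISTS


def create_table(low_level_create, name):
    """Call the low-level create, translating the gRPC status code."""
    try:
        low_level_create(name)
    except NetworkError as err:
        if err.code == ALREADY_EXISTS:
            raise AlreadyExists(name)
        raise
```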
| 2016-06-07T17:39:36Z | [] | [] |
Traceback (most recent call last):
File "hello.py", line 119, in <module>
main()
File "hello.py", line 95, in main
table = create_table(connection, TABLE_NAME, COLUMN_FAMILY_NAME)
File "hello.py", line 76, in create_table
column_family_name: dict() # Use default options.
File "/usr/local/google/home/swast/venv/bigtable_helloworld/local/lib/python2.7/site-packages/gcloud/bigtable/happybase/connection.py", line 341, in create_table
low_level_table.create()
File "/usr/local/google/home/swast/venv/bigtable_helloworld/local/lib/python2.7/site-packages/gcloud/bigtable/table.py", line 177, in create
client._table_stub.CreateTable(request_pb, client.timeout_seconds)
File "/usr/local/google/home/swast/venv/bigtable_helloworld/local/lib/python2.7/site-packages/grpc/framework/crust/implementations.py", line 75, in __call__
protocol_options, metadata, request)
File "/usr/local/google/home/swast/venv/bigtable_helloworld/local/lib/python2.7/site-packages/grpc/framework/crust/_calls.py", line 109, in blocking_unary_unary
return next(rendezvous)
File "/usr/local/google/home/swast/venv/bigtable_helloworld/local/lib/python2.7/site-packages/grpc/framework/crust/_control.py", line 415, in next
raise self._termination.abortion_error
grpc.framework.interfaces.face.face.NetworkError: NetworkError(code=StatusCode.ALREADY_EXISTS, details="Error while creating table Hello-Bigtable in cluster projects/swast-bigtable-examples/zones/us-central1-c/clusters/my-cluster : Transaction Failed : Cannot re-create existing table: Hello-Bigtable")
| 5,814 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-1854 | 6fa89d868bf70df6db28a5c8fd58c43add40edfb | diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -26,7 +26,7 @@
setup(
name='gcloud',
- version='0.15.0',
+ version='0.16.0',
description='API Client library for Google Cloud',
author='Google Cloud Platform',
author_email='jjg+gcloud-python@google.com',
| GAX API hangs during (second) 'publish'
/cc @bjwatson
``` python
======================================================================
ERROR: test_message_pull_mode_e2e (pubsub.TestPubsub)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/tseaver/projects/agendaless/Google/src/gcloud-python/system_tests/pubsub.py", line 157, in test_message_pull_mode_e2e
topic.publish(MESSAGE_1, extra=EXTRA_1)
File "/home/tseaver/projects/agendaless/Google/src/gcloud-python/gcloud/pubsub/topic.py", line 246, in publish
message_ids = api.topic_publish(self.full_name, [message_data])
File "/home/tseaver/projects/agendaless/Google/src/gcloud-python/gcloud/pubsub/_gax.py", line 171, in topic_publish
event.wait()
AttributeError: 'exceptions.AttributeError' object has no attribute 'message_ids'
```
| 2016-06-13T19:29:45Z | [] | [] |
Traceback (most recent call last):
File "/home/tseaver/projects/agendaless/Google/src/gcloud-python/system_tests/pubsub.py", line 157, in test_message_pull_mode_e2e
topic.publish(MESSAGE_1, extra=EXTRA_1)
File "/home/tseaver/projects/agendaless/Google/src/gcloud-python/gcloud/pubsub/topic.py", line 246, in publish
message_ids = api.topic_publish(self.full_name, [message_data])
File "/home/tseaver/projects/agendaless/Google/src/gcloud-python/gcloud/pubsub/_gax.py", line 171, in topic_publish
event.wait()
AttributeError: 'exceptions.AttributeError' object has no attribute 'message_ids'
| 5,817 |
||||
googleapis/google-cloud-python | googleapis__google-cloud-python-1910 | f2fae7b881dacf17096b0319b221a98782c6df9b | diff --git a/gcloud/pubsub/_gax.py b/gcloud/pubsub/_gax.py
--- a/gcloud/pubsub/_gax.py
+++ b/gcloud/pubsub/_gax.py
@@ -162,17 +162,17 @@ def topic_publish(self, topic_path, messages):
:raises: :exc:`gcloud.exceptions.NotFound` if the topic does not
exist
"""
+ options = CallOptions(is_bundling=False)
message_pbs = [_message_pb_from_dict(message)
for message in messages]
try:
- event = self._gax_api.publish(topic_path, message_pbs)
- if not event.is_set():
- event.wait()
+ result = self._gax_api.publish(topic_path, message_pbs,
+ options=options)
except GaxError as exc:
if exc_to_code(exc.cause) == StatusCode.NOT_FOUND:
raise NotFound(topic_path)
raise
- return event.result.message_ids
+ return result.message_ids
def topic_list_subscriptions(self, topic_path, page_size=0,
page_token=None):
| GAX API hangs during (second) 'publish'
/cc @bjwatson
``` python
======================================================================
ERROR: test_message_pull_mode_e2e (pubsub.TestPubsub)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/tseaver/projects/agendaless/Google/src/gcloud-python/system_tests/pubsub.py", line 157, in test_message_pull_mode_e2e
topic.publish(MESSAGE_1, extra=EXTRA_1)
File "/home/tseaver/projects/agendaless/Google/src/gcloud-python/gcloud/pubsub/topic.py", line 246, in publish
message_ids = api.topic_publish(self.full_name, [message_data])
File "/home/tseaver/projects/agendaless/Google/src/gcloud-python/gcloud/pubsub/_gax.py", line 171, in topic_publish
event.wait()
AttributeError: 'exceptions.AttributeError' object has no attribute 'message_ids'
```
| To reproduce:
``` bash
$ git clone git@github.com:tseaver/gcloud
$ cd gcloud
$ git checkout 1855-moar_gax_paging_fixes
$ tox -e system-tests --notest
$ PYTHONPATH= .tox/system-tests/bin/python system_tests/run_system_test.py --package=pubsub
```
Seems like the [PublishResponse](http://pythonhosted.org/gax-google-pubsub-v1/google.pubsub.v1.pubsub_pb2.html#google.pubsub.v1.pubsub_pb2.PublishResponse) object is being returned without `message_ids` nor an exception.
Have you gotten the API `publish(...)` call working in any context? IOW, is this a problem is general, or a problem in this specific context?
@bjwatson I had an earlier error in my system test masking this problem, so I have never actually seen `publish()` succeed with GAX.
@bjwatson One thing different from the other Pubsub APIs (which are working AFAICS, except `pull()` which is masked by this one) is that `push()` gets bundled: the `AttributeError` gets set by the code which unpacks the response during `event.wait()`.
@bjwatson could it be an issue that the event unpacking code isn't handling repeated elements (e.g., `PublishResponse.message_ids`) properly?
@bjwatson To reproduce (updated):
``` bash
$ git clone git@github.com:GoogleCloudPlatform/gcloud
$ cd gcloud
$ tox -e system-tests --notest
$ GCLOUD_ENABLE_GAX=1 PYTHONPATH= .tox/system-tests/bin/python system_tests/run_system_test.py --package=pubsub
```
cc @tbetbetbe
Hi @tseaver. Sorry for the delay; I was looking into a few other things. I'll try what you suggested and let you know what I find.
I have successfully reproduced this issue on my workstation and am looking into it.
So what I've discovered so far is that `event.result` is assigned to `AttributeError: Assignment not allowed to repeated field "message_ids" in protocol message object.`
Based on code analysis, I believe this is being thrown by https://github.com/googleapis/gax-python/blob/master/google/gax/bundling.py#L193. I'm not sure why we haven't discovered this before. Is this the first time the `bundle_descriptor.subresponse_field` is a repeated field, or should that always be the case? (I don't understand bundling that well, yet)
The bundling configuration for the `publish(...)` call is:
```
bundling:
thresholds:
element_count_threshold: 10
element_count_limit: 1000 # TO BE REMOVED LATER
request_byte_threshold: 1024 # 1 Kb
request_byte_limit: 10485760 # TO BE REMOVED LATER
delay_threshold_millis: 10
bundle_descriptor:
bundled_field: messages
discriminator_fields:
- topic
subresponse_field: message_ids
```
I'm about to sign off for the night. @tbetbetbe Do you have some time to look into this during your morning shift?
@tseaver This is a bug in gax-python. I created issue https://github.com/googleapis/gax-python/issues/110 to track it, and plan to fix it today.
@bjwatson Thanks very much!
@tseaver No problem. I have a fix in code review now: https://github.com/googleapis/gax-python/pull/111
Once this is merged, I will publish new versions of gax-python, pubsub, and logging. And then I will verify that this issue is closed. This should be finished tomorrow.
@tseaver I have fixed this issue with https://pypi.python.org/pypi/google-gax/0.12.1 and https://pypi.python.org/pypi/gax-google-pubsub-v1/0.7.10.
Now the behavior I'm seeing is that the `test_message_pull_mode_e2e` test hangs indefinitely on the [_second_ call to `publish(...)`](https://github.com/GoogleCloudPlatform/gcloud-python/blob/master/system_tests/pubsub.py#L158). I'm not sure where in the stack the problem is occurring.
I'm about to wrap up for the day. Can you take a look and see what you think?
I also updated logging, although I have not yet looked into https://github.com/GoogleCloudPlatform/gcloud-python/issues/1889
@tseaver I got caught up in some other things today. Can you or @daspecster look into the indefinite hang on the _second_ call to `publish(...)`, and let me know whether you think it's at the gcloud, gapic, or service layer?
This seems to be a [gax-python](https://github.com/googleapis/gax-python) issue. The hang is happening at https://github.com/GoogleCloudPlatform/gcloud-python/blob/master/gcloud/pubsub/_gax.py#L170, but only happens for the second call.
I'll look at it some more on Monday, unless @tbetbetbe or @geigerj get to it first.
- This is indeed a gax-python issue. The problem is that the delay threshold for bundling is handled using a timer object, but we forget to reset the timer object after it fires the first time... Therefore, [this condition](https://github.com/googleapis/gax-python/blob/master/google/gax/bundling.py#L335) will only ever succeed once. I've filed googleapis/gax-python#113 for this.
- https://github.com/GoogleCloudPlatform/gcloud-python/blob/master/gcloud/pubsub/_gax.py#L170 is somewhat roundabout... The intended way of achieving the same effect in GAX is just
```
try:
result = self._gax_api.publish(topic_path, message_pbs,
options=CallOptions(is_bundling=False))
return result.message_ids
except:
...
```
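The one-shot-timer bug described in the first bullet can be demonstrated with the standard library alone: `threading.Timer` fires exactly once, so a bundler must construct a fresh timer for each bundle rather than reuse the fired one. A toy sketch (not gax code):

```python
import threading


class Bundler(object):
    """Toy bundler: flushes queued items ``delay`` seconds after the
    first item of each bundle arrives."""

    def __init__(self, delay, flush):
        self._delay = delay
        self._flush = flush
        self._items = []

    def add(self, item):
        self._items.append(item)
        if len(self._items) == 1:
            # threading.Timer is one-shot: start a *fresh* timer per
            # bundle.  Reusing the fired timer is the bug described
            # above -- nothing would ever flush again.
            threading.Timer(self._delay, self._drain).start()

    def _drain(self):
        items, self._items = self._items, []
        self._flush(items)
```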
| 2016-06-26T20:36:05Z | [] | [] |
Traceback (most recent call last):
File "/home/tseaver/projects/agendaless/Google/src/gcloud-python/system_tests/pubsub.py", line 157, in test_message_pull_mode_e2e
topic.publish(MESSAGE_1, extra=EXTRA_1)
File "/home/tseaver/projects/agendaless/Google/src/gcloud-python/gcloud/pubsub/topic.py", line 246, in publish
message_ids = api.topic_publish(self.full_name, [message_data])
File "/home/tseaver/projects/agendaless/Google/src/gcloud-python/gcloud/pubsub/_gax.py", line 171, in topic_publish
event.wait()
AttributeError: 'exceptions.AttributeError' object has no attribute 'message_ids'
| 5,823 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-1956 | 628ee9a1b9a1b6a3741d486dc1c4f61eed9cc171 | diff --git a/gcloud/pubsub/topic.py b/gcloud/pubsub/topic.py
--- a/gcloud/pubsub/topic.py
+++ b/gcloud/pubsub/topic.py
@@ -443,6 +443,9 @@ def commit(self, client=None):
:param client: the client to use. If not passed, falls back to the
``client`` stored on the current batch.
"""
+ if not self.messages:
+ return
+
if client is None:
client = self.client
api = client.publisher_api
| Pub/Sub empty topic batch fails
``` python
with topic.batch() as batch:
for obj in rows:
batch.publish(b'bla')
```
fails if rows is empty:
Traceback (most recent call last):
…
File "/usr/local/lib/python2.7/dist-packages/gcloud/pubsub/topic.py", line 420, in __exit__
self.commit()
File "/usr/local/lib/python2.7/dist-packages/gcloud/pubsub/topic.py", line 449, in commit
message_ids = api.topic_publish(self.topic.full_name, self.messages[:])
File "/usr/local/lib/python2.7/dist-packages/gcloud/pubsub/connection.py", line 205, in topic_publish
method='POST', path='/%s:publish' % (topic_path,), data=data)
File "/usr/local/lib/python2.7/dist-packages/gcloud/connection.py", line 344, in api_request
error_info=method + ' ' + url)
BadRequest: 400 The value for message_count is too small. You passed 0 in the request, but the minimum value is 1. (POST https://pubsub.googleapis.com/v1/projects/karos-api/topics/api-prod:publish)
| Thanks for the report! `gcloud.pubsub.topic.Batch.commit` should indeed be a no-op if there are no messages accumulated.
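The no-op guard is straightforward: return before touching the API when no messages were accumulated. A simplified stand-in for the batch (hypothetical, not the real `gcloud.pubsub.topic.Batch`):

```python
class Batch(object):
    """Simplified stand-in for ``gcloud.pubsub.topic.Batch``."""

    def __init__(self, publish_api):
        self.messages = []
        self._publish_api = publish_api

    def publish(self, data):
        self.messages.append(data)

    def commit(self):
        if not self.messages:   # nothing queued: skip the API call
            return
        self._publish_api(self.messages[:])
        del self.messages[:]
```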
| 2016-07-04T15:23:41Z | [] | [] |
Traceback (most recent call last):
…
File "/usr/local/lib/python2.7/dist-packages/gcloud/pubsub/topic.py", line 420, in __exit__
| 5,831 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-1976 | a1620a986800fce26796732759a53a2406ee4e67 | diff --git a/gcloud/_helpers.py b/gcloud/_helpers.py
--- a/gcloud/_helpers.py
+++ b/gcloud/_helpers.py
@@ -194,10 +194,11 @@ def _default_service_project_id():
search_paths.append(os.path.expanduser(DEFAULT_CONFIGURATION_PATH))
except ImportError:
pass
- win32_config_path = os.path.join(os.getenv('APPDATA', ''),
- 'gcloud', 'configurations',
- 'config_default')
- search_paths.append(win32_config_path)
+
+ windows_config_path = os.path.join(os.getenv('APPDATA', ''),
+ 'gcloud', 'configurations',
+ 'config_default')
+ search_paths.append(windows_config_path)
config = configparser.RawConfigParser()
config.read(search_paths)
| _get_default_service_project_id() fails in appveyor
``` python
======================================================================
FAIL: test_read_from_cli_info (gcloud.test__helpers.Test__get_default_service_project_id)
----------------------------------------------------------------------
Traceback (most recent call last):
File "C:\projects\gcloud-python\gcloud\test__helpers.py", line 223, in test_read_from_cli_info
self.assertEqual('test-project-id', project_id)
AssertionError: 'test-project-id' != None
```
See also #1973
| 2016-07-12T04:28:53Z | [] | [] |
Traceback (most recent call last):
File "C:\projects\gcloud-python\gcloud\test__helpers.py", line 223, in test_read_from_cli_info
self.assertEqual('test-project-id', project_id)
AssertionError: 'test-project-id' != None
| 5,833 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-2094 | f02d7a28cc835790cf7991da7323416769463888 | diff --git a/gcloud/logging/_gax.py b/gcloud/logging/_gax.py
--- a/gcloud/logging/_gax.py
+++ b/gcloud/logging/_gax.py
@@ -120,7 +120,12 @@ def logger_delete(self, project, logger_name):
"""
options = None
path = 'projects/%s/logs/%s' % (project, logger_name)
- self._gax_api.delete_log(path, options)
+ try:
+ self._gax_api.delete_log(path, options)
+ except GaxError as exc:
+ if exc_to_code(exc.cause) == StatusCode.NOT_FOUND:
+ raise NotFound(path)
+ raise
class _SinksAPI(object):
| Logging system tests: gRPC/GaxError for 'NotFound' in tearDown / logger.delete
From: https://travis-ci.org/GoogleCloudPlatform/gcloud-python/builds/151551907#L647-L675
``` python
======================================================================
ERROR: test_log_struct_w_metadata (logging_.TestLogging)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/system_tests/logging_.py", line 69, in tearDown
retry(doomed.delete)()
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/system_tests/retry.py", line 77, in wrapped_function
return to_wrap(*args, **kwargs)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/gcloud/logging/logger.py", line 268, in delete
client.logging_api.logger_delete(self.project, self.name)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/gcloud/logging/_gax.py", line 123, in logger_delete
self._gax_api.delete_log(path, options)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/google/cloud/logging/v2/logging_service_v2_api.py", line 216, in delete_log
self._delete_log(request, options)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/google/gax/api_callable.py", line 480, in inner
return api_caller(api_call, this_settings, request)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/google/gax/api_callable.py", line 468, in base_caller
return api_call(*args)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/google/gax/api_callable.py", line 433, in inner
raise_with_traceback(GaxError('RPC failed', cause=exception))
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/google/gax/api_callable.py", line 430, in inner
return a_func(*args, **kwargs)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/google/gax/api_callable.py", line 64, in inner
return a_func(*updated_args, **kwargs)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/grpc/beta/_client_adaptations.py", line 305, in __call__
self._request_serializer, self._response_deserializer)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/grpc/beta/_client_adaptations.py", line 203, in _blocking_unary_unary
raise _abortion_error(rpc_error_call)
GaxError: GaxError(RPC failed, caused by AbortionError(code=StatusCode.NOT_FOUND, details="Requested entity was not found."))
```
| @tseaver Looks like `StatusCode.NOT_FOUND` is an eventual consistency error?
We are already retrying for `gcloud.exceptions.NotFound`: we just need to update our GAX helper to map the gRPC `NotFound` onto that type.
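A sketch of that mapping, mirroring the shape of the eventual patch. `FakeGaxError` and the status-code constant are stand-ins so the example is self-contained (the real code uses `google.gax.GaxError`, `grpc.StatusCode.NOT_FOUND`, and `google.cloud.exceptions.NotFound`).

```python
class NotFound(Exception):
    """Stand-in for google.cloud.exceptions.NotFound."""


NOT_FOUND = 'StatusCode.NOT_FOUND'  # stand-in for grpc.StatusCode.NOT_FOUND


class FakeGaxError(Exception):
    """Stand-in for google.gax.GaxError, carrying the underlying code."""

    def __init__(self, cause_code):
        super().__init__(cause_code)
        self.cause_code = cause_code


def delete_log(api_delete, path):
    """Translate a gRPC NOT_FOUND wrapped in a GaxError into NotFound."""
    try:
        api_delete(path)
    except FakeGaxError as exc:
        if exc.cause_code == NOT_FOUND:
            raise NotFound(path)
        raise
```

The retry helper in the system tests can then keep catching the HTTP-style `NotFound` regardless of transport.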
| 2016-08-11T18:41:00Z | [] | [] |
Traceback (most recent call last):
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/system_tests/logging_.py", line 69, in tearDown
retry(doomed.delete)()
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/system_tests/retry.py", line 77, in wrapped_function
return to_wrap(*args, **kwargs)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/gcloud/logging/logger.py", line 268, in delete
client.logging_api.logger_delete(self.project, self.name)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/gcloud/logging/_gax.py", line 123, in logger_delete
self._gax_api.delete_log(path, options)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/google/cloud/logging/v2/logging_service_v2_api.py", line 216, in delete_log
self._delete_log(request, options)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/google/gax/api_callable.py", line 480, in inner
return api_caller(api_call, this_settings, request)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/google/gax/api_callable.py", line 468, in base_caller
return api_call(*args)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/google/gax/api_callable.py", line 433, in inner
raise_with_traceback(GaxError('RPC failed', cause=exception))
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/google/gax/api_callable.py", line 430, in inner
return a_func(*args, **kwargs)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/google/gax/api_callable.py", line 64, in inner
return a_func(*updated_args, **kwargs)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/grpc/beta/_client_adaptations.py", line 305, in __call__
self._request_serializer, self._response_deserializer)
File "/home/travis/build/GoogleCloudPlatform/gcloud-python/.tox/system-tests/lib/python2.7/site-packages/grpc/beta/_client_adaptations.py", line 203, in _blocking_unary_unary
raise _abortion_error(rpc_error_call)
GaxError: GaxError(RPC failed, caused by AbortionError(code=StatusCode.NOT_FOUND, details="Requested entity was not found."))
| 5,847 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-2303 | 15e8aecc28cfca4ecc73ff4bd2304d9a58756531 | diff --git a/google/cloud/datastore/batch.py b/google/cloud/datastore/batch.py
--- a/google/cloud/datastore/batch.py
+++ b/google/cloud/datastore/batch.py
@@ -183,9 +183,13 @@ def put(self, entity):
:type entity: :class:`google.cloud.datastore.entity.Entity`
:param entity: the entity to be saved.
- :raises: ValueError if entity has no key assigned, or if the key's
+ :raises: :class:`~exceptions.ValueError` if the batch is not in
+ progress, if entity has no key assigned, or if the key's
``project`` does not match ours.
"""
+ if self._status != self._IN_PROGRESS:
+ raise ValueError('Batch must be in progress to put()')
+
if entity.key is None:
raise ValueError("Entity must have a key")
@@ -206,9 +210,13 @@ def delete(self, key):
:type key: :class:`google.cloud.datastore.key.Key`
:param key: the key to be deleted.
- :raises: ValueError if key is not complete, or if the key's
+ :raises: :class:`~exceptions.ValueError` if the batch is not in
+ progress, if key is not complete, or if the key's
``project`` does not match ours.
"""
+ if self._status != self._IN_PROGRESS:
+ raise ValueError('Batch must be in progress to delete()')
+
if key.is_partial:
raise ValueError("Key must be complete")
@@ -255,7 +263,13 @@ def commit(self):
This is called automatically upon exiting a with statement,
however it can be called explicitly if you don't want to use a
context manager.
+
+ :raises: :class:`~exceptions.ValueError` if the batch is not
+ in progress.
"""
+ if self._status != self._IN_PROGRESS:
+ raise ValueError('Batch must be in progress to commit()')
+
try:
self._commit()
finally:
@@ -267,12 +281,19 @@ def rollback(self):
Marks the batch as aborted (can't be used again).
Overridden by :class:`google.cloud.datastore.transaction.Transaction`.
+
+ :raises: :class:`~exceptions.ValueError` if the batch is not
+ in progress.
"""
+ if self._status != self._IN_PROGRESS:
+ raise ValueError('Batch must be in progress to rollback()')
+
self._status = self._ABORTED
def __enter__(self):
- self._client._push_batch(self)
self.begin()
+ # NOTE: We make sure begin() succeeds before pushing onto the stack.
+ self._client._push_batch(self)
return self
def __exit__(self, exc_type, exc_val, exc_tb):
diff --git a/google/cloud/datastore/client.py b/google/cloud/datastore/client.py
--- a/google/cloud/datastore/client.py
+++ b/google/cloud/datastore/client.py
@@ -348,6 +348,7 @@ def put_multi(self, entities):
if not in_batch:
current = self.batch()
+ current.begin()
for entity in entities:
current.put(entity)
@@ -384,6 +385,7 @@ def delete_multi(self, keys):
if not in_batch:
current = self.batch()
+ current.begin()
for key in keys:
current.delete(key)
diff --git a/google/cloud/datastore/transaction.py b/google/cloud/datastore/transaction.py
--- a/google/cloud/datastore/transaction.py
+++ b/google/cloud/datastore/transaction.py
@@ -90,6 +90,8 @@ class Transaction(Batch):
:param client: the client used to connect to datastore.
"""
+ _status = None
+
def __init__(self, client):
super(Transaction, self).__init__(client)
self._id = None
@@ -125,10 +127,15 @@ def begin(self):
statement, however it can be called explicitly if you don't want
to use a context manager.
- :raises: :class:`ValueError` if the transaction has already begun.
+ :raises: :class:`~exceptions.ValueError` if the transaction has
+ already begun.
"""
super(Transaction, self).begin()
- self._id = self.connection.begin_transaction(self.project)
+ try:
+ self._id = self.connection.begin_transaction(self.project)
+ except:
+ self._status = self._ABORTED
+ raise
def rollback(self):
"""Rolls back the current transaction.
| Transaction is left open when error happens in Datastore API
This one is hard to reproduce, but I noticed that when using a transaction with a **with** statement and an error happens inside the transaction itself (a 500 error for example), it appears that the transaction is left open.
``` python
with self.client.transaction():
# do stuff
# create entities, put, put_multi, etc
```
Exception:
``` python
Traceback (most recent call last):
File "my_client.py", line 190, in do_stuff
with self.client.transaction():
File "site-packages/gcloud/datastore/batch.py", line 275, in __enter__
self.begin()
File "site-packages/gcloud/datastore/transaction.py", line 131, in begin
self._id = self.connection.begin_transaction(self.project)
File "site-packages/gcloud/datastore/connection.py", line 307, in begin_transaction
_datastore_pb2.BeginTransactionResponse)
File "site-packages/gcloud/datastore/connection.py", line 124, in _rpc
data=request_pb.SerializeToString())
File "site-packages/gcloud/datastore/connection.py", line 98, in _request
raise make_exception(headers, error_status.message, use_json=False)
InternalServerError: 500 internal error.
```
After the exception, any operations to the Datastore using the same client have no effect, and no error is reported.
So my guess is that for some reason the transaction was left open and the changes are never committed.
| Thanks for the report!
I was able to reproduce this, digging in now.
After just faking the 500
``` diff
diff --git a/google/cloud/datastore/transaction.py b/google/cloud/datastore/transaction.py
index 66ef8bb..0318643 100644
--- a/google/cloud/datastore/transaction.py
+++ b/google/cloud/datastore/transaction.py
@@ -128,6 +128,10 @@ class Transaction(Batch):
:raises: :class:`ValueError` if the transaction has already begun.
"""
super(Transaction, self).begin()
+ import httplib2
+ from google.cloud.exceptions import make_exception
+ resp = httplib2.Response({'status': 500})
+ raise make_exception(resp, {'error': {'message': 'internal error'}})
self._id = self.connection.begin_transaction(self.project)
def rollback(self):
```
I ran the following:
``` python
>>> from google.cloud import datastore
>>> client = datastore.Client()
>>> q = client.query(kind='Foo')
>>> client.connection._datastore_api
<google.cloud.datastore.connection._DatastoreAPIOverGRPC object at 0x7f4997d6c390>
>>> list(q.fetch())
[]
>>> k = client.key('Foo', 1234)
>>> e = datastore.Entity(key=k)
>>> e['food'] = 'noms'
>>> client.put(e)
>>> list(q.fetch())
[<Entity[{'kind': u'Foo', 'id': 1234L}] {u'food': 'noms'}>]
>>> with client.transaction():
... e['goodbye'] = 123
... client.put(e)
...
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File ".tox/py27/local/lib/python2.7/site-packages/google/cloud/datastore/batch.py", line 275, in __enter__
self.begin()
File ".tox/py27/local/lib/python2.7/site-packages/google/cloud/datastore/transaction.py", line 134, in begin
raise make_exception(resp, {'error': {'message': 'internal error'}})
google.cloud.exceptions.InternalServerError: 500 internal error
>>> list(q.fetch())
[<Entity[{'kind': u'Foo', 'id': 1234L}] {u'food': 'noms'}>]
>>> k2 = client.key('Foo', 123456)
>>> e2 = datastore.Entity(key=k2)
>>> e2['other'] = 'brother'
>>> client.put(e2)
>>>
>>> list(q.fetch())
[<Entity[{'kind': u'Foo', 'id': 1234L}] {u'food': 'noms'}>]
>>>
>>> # Just use a FRESH client
>>> client = datastore.Client()
>>> client.put(e2)
>>> list(q.fetch())
[<Entity[{'kind': u'Foo', 'id': 1234L}] {u'food': 'noms'}>, <Entity[{'kind': u'Foo', 'id': 123456L}] {u'other': 'brother'}>]
```
The issue is certainly in `Batch.__enter__`, hope to have it sorted out soon.
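The fix reorders `__enter__` so the batch is pushed onto the client's stack only after `begin()` succeeds; a failing `begin()` then leaves nothing registered. A runnable sketch with stand-in classes (not the real datastore types):

```python
class FakeClient:
    def __init__(self):
        self._batch_stack = []

    def _push_batch(self, batch):
        self._batch_stack.append(batch)

    def _pop_batch(self):
        return self._batch_stack.pop()


class FakeBatch:
    """Stand-in illustrating the __enter__ ordering fix."""

    def __init__(self, client, fail_begin=False):
        self._client = client
        self._fail_begin = fail_begin

    def begin(self):
        if self._fail_begin:
            raise RuntimeError('500 internal error')

    def __enter__(self):
        self.begin()                    # begin first ...
        self._client._push_batch(self)  # ... push only on success
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self._client._pop_batch()
```

With the old ordering (push before begin), the failing batch stayed on the stack and silently swallowed later `put()` calls, which matches the reproduction above.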
| 2016-09-12T19:29:16Z | [] | [] |
Traceback (most recent call last):
File "my_client.py", line 190, in do_stuff
with self.client.transaction():
File "site-packages/gcloud/datastore/batch.py", line 275, in __enter__
self.begin()
File "site-packages/gcloud/datastore/transaction.py", line 131, in begin
self._id = self.connection.begin_transaction(self.project)
File "site-packages/gcloud/datastore/connection.py", line 307, in begin_transaction
_datastore_pb2.BeginTransactionResponse)
File "site-packages/gcloud/datastore/connection.py", line 124, in _rpc
data=request_pb.SerializeToString())
File "site-packages/gcloud/datastore/connection.py", line 98, in _request
raise make_exception(headers, error_status.message, use_json=False)
InternalServerError: 500 internal error.
| 5,862 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-2313 | 3ab8459ea424ea1277da0c07b5e59681c152d570 | diff --git a/google/cloud/bigquery/job.py b/google/cloud/bigquery/job.py
--- a/google/cloud/bigquery/job.py
+++ b/google/cloud/bigquery/job.py
@@ -359,7 +359,12 @@ def begin(self, client=None):
``NoneType``
:param client: the client to use. If not passed, falls back to the
``client`` stored on the current dataset.
+
+ :raises: :exc:`ValueError` if the job has already begin.
"""
+ if self.state is not None:
+ raise ValueError("Job already begun.")
+
client = self._require_client(client)
path = '/projects/%s/jobs' % (self.project,)
api_response = client.connection.api_request(
diff --git a/google/cloud/bigquery/table.py b/google/cloud/bigquery/table.py
--- a/google/cloud/bigquery/table.py
+++ b/google/cloud/bigquery/table.py
@@ -868,7 +868,8 @@ def upload_from_file(self,
:rtype: :class:`google.cloud.bigquery.jobs.LoadTableFromStorageJob`
:returns: the job instance used to load the data (e.g., for
- querying status).
+ querying status). Note that the job is already started:
+ do not call ``job.begin()``.
:raises: :class:`ValueError` if ``size`` is not passed in and can not
be determined, or if the ``file_obj`` can be detected to be
a file opened in text mode.
| Calling 'job.begin' on job returned from 'Table.upload_from_file' raises a 400
Trying to load data into bigquery with `upload_from_file` returns a 400.
Sample code:
``` python
def load_data_from_file(dataset_name, table_name, source_file_name):
bigquery_client = bigquery.Client()
dataset = bigquery_client.dataset(dataset_name)
table = dataset.table(table_name)
# Reload the table to get the schema.
table.reload()
with open(source_file_name, 'rb') as source_file:
# This example uses CSV, but you can use other formats.
# See https://cloud.google.com/bigquery/loading-data
job = table.upload_from_file(
source_file, source_format='text/csv')
job.begin()
wait_for_job(job)
print('Loaded {} rows into {}:{}.'.format(
job.output_rows, dataset_name, table_name))
```
Error:
```
Traceback (most recent call last):
File "load_data_from_file.py", line 77, in <module>
args.source_file_anme)
File "load_data_from_file.py", line 45, in load_data_from_file
job.begin()
File "/Users/jonwayne/workspace/python-docs-samples/bigquery/cloud-client/env/lib/python3.4/site-packages/gcloud/bigquery/job.py", line 364, in begin
method='POST', path=path, data=self._build_resource())
File "/Users/jonwayne/workspace/python-docs-samples/bigquery/cloud-client/env/lib/python3.4/site-packages/gcloud/connection.py", line 347, in api_request
error_info=method + ' ' + url)
gcloud.exceptions.BadRequest: 400 Load configuration must specify at least one source URI (POST https://www.googleapis.com/bigquery/v2/projects/python-docs-samples-tests/jobs)
```
| Any updates on this issue? My project is stuck because of it...
Do we know what the problem is yet? @tseaver
/cc @omaray
There is a workaround: first upload the file to Storage, and then load the table from there. E.g., see [this system test](https://github.com/GoogleCloudPlatform/google-cloud-python/blob/3ab8459ea424ea1277da0c07b5e59681c152d570/system_tests/bigquery.py#L291-L360).
I will look into reproducing the issue with a new system test.
| 2016-09-13T20:39:08Z | [] | [] |
Traceback (most recent call last):
File "load_data_from_file.py", line 77, in <module>
args.source_file_anme)
File "load_data_from_file.py", line 45, in load_data_from_file
job.begin()
File "/Users/jonwayne/workspace/python-docs-samples/bigquery/cloud-client/env/lib/python3.4/site-packages/gcloud/bigquery/job.py", line 364, in begin
method='POST', path=path, data=self._build_resource())
File "/Users/jonwayne/workspace/python-docs-samples/bigquery/cloud-client/env/lib/python3.4/site-packages/gcloud/connection.py", line 347, in api_request
error_info=method + ' ' + url)
gcloud.exceptions.BadRequest: 400 Load configuration must specify at least one source URI (POST https://www.googleapis.com/bigquery/v2/projects/python-docs-samples-tests/jobs)
| 5,863 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-2348 | ddee54898273625cd056cabc2e082c6212c98d99 | diff --git a/google/cloud/exceptions.py b/google/cloud/exceptions.py
--- a/google/cloud/exceptions.py
+++ b/google/cloud/exceptions.py
@@ -21,6 +21,8 @@
import json
import six
+from google.cloud._helpers import _to_bytes
+
_HTTP_CODE_TO_EXCEPTION = {} # populated at end of module
@@ -41,7 +43,10 @@ def __init__(self, message, errors=()):
self._errors = errors
def __str__(self):
- return '%d %s' % (self.code, self.message)
+ result = u'%d %s' % (self.code, self.message)
+ if six.PY2:
+ result = _to_bytes(result, 'utf-8')
+ return result
@property
def errors(self):
| raised UnicodeEncodeError when printing GCloudError exception
When exception was received from google storage and UnicodeEncodeError exception was raised
``` python
try:
client = storage.Client()
bucket = client.get_bucket(bucket_name)
except exceptions.NotFound as e:
print e
```
raised
```
unexpected: UnicodeEncodeError('ascii', u'502 <!DOCTYPE html>\n<html lang=en>\n <meta charset=utf-8>\n <meta name=viewport content="initial-scale=1, minimum-scale=1, width=device-width">\n <title>Error 502 (Server Error)!!1</title>\n <style>\n *{margin:0;padding:0}html,code{font:15px/22px arial,sans-serif}html{background:#fff;color:#222;padding:15px}body{margin:7% auto 0;max-width:390px;min-height:180px;padding:30px 0 15px}* > body{background:url(//www.google.com/images/errors/robot.png) 100% 5px no-repeat;padding-right:205px}p{margin:11px 0 22px;overflow:hidden}ins{color:#777;text-decoration:none}a img{border:0}@media screen and (max-width:772px){body{background:none;margin-top:0;max-width:none;padding-right:0}}#logo{background:url(//www.google.com/images/branding/googlelogo/1x/googlelogo_color_150x54dp.png) no-repeat;margin-left:-5px}@media only screen and (min-resolution:192dpi){#logo{background:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) no-repeat 0% 0%/100% 100%;-moz-border-image:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) 0}}@media only screen and (-webkit-min-device-pixel-ratio:2){#logo{background:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) no-repeat;-webkit-background-size:100% 100%}}#logo{display:inline-block;height:54px;width:150px}\n </style>\n <a href=//www.google.com/><span id=logo aria-label=Google></span></a>\n <p><b>502.</b> <ins>That\u2019s an error.</ins>\n <p>The server encountered a temporary error and could not complete your request.<p>Please try again in 30 seconds. <ins>That\u2019s all we know.</ins>\n (GET https://www.googleapis.com/storage/v1/b/anura-recordings-ar?projection=noAcl)', 1445, 1446, 'ordinal not in range(128)')
```
This is the unicode string that caused the problem -> `That\u2019s all we know.`
The problem seems to be the `__str__` function in your base exception class.
``` python
class GCloudError(Exception):
....
def __str__(self):
return '%d %s' % (self.code, self.message)
```
HOW TO REPRODUCE
``` python
try:
a = u'That\u2019s all we know'
print a
raise exceptions.NotFound(a)
except exceptions.GCloudError as e:
print e
```
which gives
```
$ python test_gs2.py
That’s all we know
Traceback (most recent call last):
File "test_gs2.py", line 11, in <module>
print e
UnicodeEncodeError: 'ascii' codec can't encode character u'\u2019' in position 8: ordinal not in range(128)
```
| @cesterlizi thanks for reporting!
@dhermes would making [this](https://github.com/GoogleCloudPlatform/google-cloud-python/blob/master/google/cloud/exceptions.py#L44) a unicode string solve this properly for py2 and py3?
``` python
def __str__(self):
return u'%d %s' % (self.code, self.message)
```
@daspecster I'm not 100% on that. We probably want to implement `__unicode__` as well. I always consult http://www.rafekettler.com/magicmethods.html in these situations
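The fix that landed formats the message as text first and only encodes to UTF-8 bytes on Python 2, where `__str__` must return `str` (bytes). A Python 3-runnable sketch of that pattern (the `_to_bytes` mirror here is a py3 simplification of `google.cloud._helpers._to_bytes`):

```python
import sys


def _to_bytes(value, encoding='ascii'):
    """Py3-only simplification of google.cloud._helpers._to_bytes."""
    return value.encode(encoding) if isinstance(value, str) else value


class SketchError(Exception):
    """Stand-in for GCloudError with the encoding-aware __str__."""

    code = 502

    def __init__(self, message):
        super().__init__(message)
        self.message = message

    def __str__(self):
        result = u'%d %s' % (self.code, self.message)
        if sys.version_info[0] == 2:  # text on Python 3, UTF-8 bytes on Python 2
            result = _to_bytes(result, 'utf-8')
        return result
```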
| 2016-09-19T16:48:34Z | [] | [] |
Traceback (most recent call last):
File "test_gs2.py", line 11, in <module>
print e
UnicodeEncodeError: 'ascii' codec can't encode character u'\u2019' in position 8: ordinal not in range(128)
| 5,866 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-2365 | 7b3646994c1c4f67551e7f04774de3b85f9fe0d6 | diff --git a/google/cloud/bigquery/table.py b/google/cloud/bigquery/table.py
--- a/google/cloud/bigquery/table.py
+++ b/google/cloud/bigquery/table.py
@@ -18,12 +18,15 @@
import json
import os
+import httplib2
import six
from google.cloud._helpers import _datetime_from_microseconds
from google.cloud._helpers import _microseconds_from_datetime
from google.cloud._helpers import _millis_from_datetime
from google.cloud.exceptions import NotFound
+from google.cloud.exceptions import make_exception
+from google.cloud.streaming.exceptions import HttpError
from google.cloud.streaming.http_wrapper import Request
from google.cloud.streaming.http_wrapper import make_api_request
from google.cloud.streaming.transfer import RESUMABLE_UPLOAD
@@ -776,6 +779,16 @@ def insert_data(self,
return errors
+ @staticmethod
+ def _check_response_error(request, http_response):
+ """Helper for :meth:`upload_from_file`."""
+ info = http_response.info
+ status = int(info['status'])
+ if not 200 <= status < 300:
+ faux_response = httplib2.Response({'status': status})
+ raise make_exception(faux_response, http_response.content,
+ error_info=request.url)
+
# pylint: disable=too-many-arguments,too-many-locals
def upload_from_file(self,
file_obj,
@@ -947,13 +960,21 @@ def upload_from_file(self,
request.url = connection.build_api_url(api_base_url=base_url,
path=path,
query_params=query_params)
- upload.initialize_upload(request, connection.http)
+ try:
+ upload.initialize_upload(request, connection.http)
+ except HttpError as err_response:
+ faux_response = httplib2.Response(err_response.response)
+ raise make_exception(faux_response, err_response.content,
+ error_info=request.url)
if upload.strategy == RESUMABLE_UPLOAD:
http_response = upload.stream_file(use_chunks=True)
else:
http_response = make_api_request(connection.http, request,
retries=num_retries)
+
+ self._check_response_error(request, http_response)
+
response_content = http_response.content
if not isinstance(response_content,
six.string_types): # pragma: NO COVER Python3
| bigquery.table.Table.upload_from_file() doesn't check the initial request response status
If you try to add data to a table from a file without a schema:
``` python
bigquery_client = bigquery.Client()
dataset = bigquery_client.dataset(dataset_name)
table = dataset.table(table_name)
with open(source_file_name, 'rb') as source_file:
job = table.upload_from_file(
source_file, source_format='text/csv', rewind=True)
job.begin()
```
This will fail with an opaque error:
```
Traceback (most recent call last):
File "load_data_from_file.py", line 74, in <module>
args.source_file_anme)
File "load_data_from_file.py", line 40, in load_data_from_gcs
source_file, source_format='text/csv', rewind=True)
File "/Users/jonwayne/workspace/python-docs-samples/bigquery/cloud-client/env/lib/python3.4/site-packages/gcloud/bigquery/table.py", line 913, in upload_from_file
return client.job_from_resource(json.loads(response_content))
File "/Users/jonwayne/workspace/python-docs-samples/bigquery/cloud-client/env/lib/python3.4/site-packages/gcloud/bigquery/client.py", line 119, in job_from_resource
config = resource['configuration']
KeyError: 'configuration'
```
Inspection of the response from the initial upload request reveals the problem:
```
status: 400
body:
{
"error": {
"errors": [
{
"domain": "global",
"reason": "invalid",
"message": "Empty schema specified for the load job. Please specify a schema that describes the data being loaded.",
...
```
It seems we should check the response before constructing the job.
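A sketch of that check, decoupled from the client for illustration: `make_job` stands in for `client.job_from_resource`, `content` is the already-decoded JSON payload, and a plain `RuntimeError` stands in for the library's HTTP exception hierarchy (the real fix raises via `make_exception`).

```python
def job_from_upload_response(status, content, make_job):
    """Raise on a non-2xx upload response instead of parsing it as a job."""
    if not 200 <= status < 300:
        message = content.get('error', {}).get('message', 'upload failed')
        raise RuntimeError('%d %s' % (status, message))
    return make_job(content)
```

With this guard, the 400 above surfaces its actual message ("Empty schema specified...") rather than a `KeyError: 'configuration'`.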
| Weirdly enough you can upload data to a table from GCS without needing a schema.
Issue is similar to #1712 for Storage: look at #1717 for the fix there.
| 2016-09-20T19:08:52Z | [] | [] |
Traceback (most recent call last):
File "load_data_from_file.py", line 74, in <module>
args.source_file_anme)
File "load_data_from_file.py", line 40, in load_data_from_gcs
source_file, source_format='text/csv', rewind=True)
File "/Users/jonwayne/workspace/python-docs-samples/bigquery/cloud-client/env/lib/python3.4/site-packages/gcloud/bigquery/table.py", line 913, in upload_from_file
return client.job_from_resource(json.loads(response_content))
File "/Users/jonwayne/workspace/python-docs-samples/bigquery/cloud-client/env/lib/python3.4/site-packages/gcloud/bigquery/client.py", line 119, in job_from_resource
config = resource['configuration']
KeyError: 'configuration'
| 5,868 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-2379 | 1f223f20663d4d50ea528b619d5093f114368bd6 | diff --git a/google/cloud/bigquery/job.py b/google/cloud/bigquery/job.py
--- a/google/cloud/bigquery/job.py
+++ b/google/cloud/bigquery/job.py
@@ -375,7 +375,7 @@ def cancel(self, client=None):
api_response = client.connection.api_request(
method='POST', path='%s/cancel' % (self.path,))
- self._set_properties(api_response)
+ self._set_properties(api_response['job'])
class _LoadConfiguration(object):
| '_AsyncQuery.cancel' fails to update from returned resource
E.g.:
``` python
Traceback (most recent call last):
File ...
job.cancel()
File ".../google/cloud/bigquery/job.py", line 378, in cancel
self._set_properties(api_response)
File ".../google/cloud/bigquery/job.py", line 262, in _set_properties
self._scrub_local_properties(cleaned)
File ".../google/cloud/bigquery/job.py", line 1050, in _scrub_local_properties
configuration = cleaned['configuration']['query']
KeyError: 'configuration'
```
The [docs for `job.cancel`](https://cloud.google.com/bigquery/docs/reference/v2/jobs/cancel#response) show that the job resource is in a `job` subkey of the response.
| @tseaver Germane to this change, I've run the following before to get a sense of our actual usage of `streaming`:
```
nosetests system_tests/storage.py system_tests/bigquery.py \
--with-coverage \
--cover-branches \
--cover-tests \
--cover-package=gcloud.streaming \
--cover-package=gcloud.bigquery \
--cover-package=gcloud.storage \
--nocapture
```
Could be worthwhile to do it and see what code paths we aren't testing against the backend?
(**NOTE**: I'm aware this is using the old `gcloud` package and `nosetests`, sorry I am in a hurry and didn't update for the `py.test` equivalent.)
@dhermes I don't think that is germane to this issue, which doesn't mess with streaming at all. Maybe you meant #1431?
No I just meant that if you got a sense for lines uncovered during system tests then an error like this would be detectable. The streaming bit was just an example so you could do the same locally.
Essentially this error is due to the fact that our code and unit tests make a bad assumption about the payload.
| 2016-09-21T18:17:11Z | [] | [] |
Traceback (most recent call last):
File ...
job.cancel()
File ".../google/cloud/bigquery/job.py", line 378, in cancel
self._set_properties(api_response)
File ".../google/cloud/bigquery/job.py", line 262, in _set_properties
self._scrub_local_properties(cleaned)
File ".../google/cloud/bigquery/job.py", line 1050, in _scrub_local_properties
configuration = cleaned['configuration']['query']
KeyError: 'configuration'
| 5,871 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-2423 | 0c0c8adf36b181d683316c0c64d21172594211fe | diff --git a/datastore/google/cloud/datastore/connection.py b/datastore/google/cloud/datastore/connection.py
--- a/datastore/google/cloud/datastore/connection.py
+++ b/datastore/google/cloud/datastore/connection.py
@@ -23,6 +23,7 @@
from google.cloud import connection as connection_module
from google.cloud.environment_vars import DISABLE_GRPC
from google.cloud.environment_vars import GCD_HOST
+from google.cloud.exceptions import BadRequest
from google.cloud.exceptions import Conflict
from google.cloud.exceptions import GrpcRendezvous
from google.cloud.exceptions import make_exception
@@ -313,8 +314,11 @@ def commit(self, project, request_pb):
try:
return self._stub.Commit(request_pb)
except GrpcRendezvous as exc:
- if exc.code() == StatusCode.ABORTED:
+ error_code = exc.code()
+ if error_code == StatusCode.ABORTED:
raise Conflict(exc.details())
+ if error_code == StatusCode.INVALID_ARGUMENT:
+ raise BadRequest(exc.details())
raise
def rollback(self, project, request_pb):
| Saving large values to indexed properties raises _Rendezvous exception
Given this code:
``` python
from google.cloud import datastore
from google.cloud.datastore import Entity
client = datastore.Client()
with client.transaction() as xact:
e = Entity(client.key("Test", "123"))
e.update({"x": "a" * 1500000})
xact.put(e)
```
version 0.19 of this library raises the following exception on `.put`:
``` python
Traceback (most recent call last):
File "test.py", line 8, in <module>
xact.put(e)
File "/Users/bogdan/.virtualenvs/bread/lib/python2.7/site-packages/google/cloud/datastore/batch.py", line 302, in __exit__
self.commit()
File "/Users/bogdan/.virtualenvs/bread/lib/python2.7/site-packages/google/cloud/datastore/transaction.py", line 167, in commit
super(Transaction, self).commit()
File "/Users/bogdan/.virtualenvs/bread/lib/python2.7/site-packages/google/cloud/datastore/batch.py", line 274, in commit
self._commit()
File "/Users/bogdan/.virtualenvs/bread/lib/python2.7/site-packages/google/cloud/datastore/batch.py", line 251, in _commit
self.project, self._commit_request, self._id)
File "/Users/bogdan/.virtualenvs/bread/lib/python2.7/site-packages/google/cloud/datastore/connection.py", line 584, in commit
response = self._datastore_api.commit(project, request)
File "/Users/bogdan/.virtualenvs/bread/lib/python2.7/site-packages/google/cloud/datastore/connection.py", line 314, in commit
return self._stub.Commit(request_pb)
File "/Users/bogdan/.virtualenvs/bread/lib/python2.7/site-packages/grpc/_channel.py", line 481, in __call__
return _end_unary_response_blocking(state, False, deadline)
File "/Users/bogdan/.virtualenvs/bread/lib/python2.7/site-packages/grpc/_channel.py", line 432, in _end_unary_response_blocking
raise _Rendezvous(state, None, None, deadline)
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with (StatusCode.INVALID_ARGUMENT, The value of property "x" is longer than 1048487 bytes.)>
```
Previously, this used to raise `gcloud.exceptions.BadRequest`. It seems like this exception should be caught by the library and a more appropriate one should be raised in its stead.
_EDIT_: It turns out the following snippet has the same issue so this is not transaction specific:
``` python
from google.cloud import datastore
from google.cloud.datastore import Entity
client = datastore.Client()
e = Entity(client.key("Test", "123"))
e.update({"x": "a" * 1500000})
client.put(e)
```
_EDIT 2_: This does not work either, raising the same exception:
``` python
from google.cloud import datastore
from google.cloud.datastore import Entity
client = datastore.Client()
e = Entity(client.key("Test", "123"), exclude_from_indexes=("x",))
e.update({"x": "a" * 1500000})
client.put(e)
```
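The behavior being asked for — translating low-level gRPC status codes into the library's friendlier exception types — can be sketched as a standalone helper. This is hypothetical illustration, not the library's actual code; the exception classes below are stand-ins for `google.cloud.exceptions.Conflict` and `google.cloud.exceptions.BadRequest`.

```python
class Conflict(Exception):
    """Stand-in for google.cloud.exceptions.Conflict."""

class BadRequest(Exception):
    """Stand-in for google.cloud.exceptions.BadRequest."""

# Hypothetical mapping from gRPC status-code names to library exceptions.
_STATUS_TO_EXCEPTION = {
    'ABORTED': Conflict,
    'INVALID_ARGUMENT': BadRequest,
}

def translate_grpc_error(status_code, details):
    """Raise a library-level exception for known gRPC status codes.

    Unknown codes fall through to a plain RuntimeError so the caller
    still sees the failure instead of having it swallowed.
    """
    exc_class = _STATUS_TO_EXCEPTION.get(status_code, RuntimeError)
    raise exc_class(details)
```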
| @Bogdanp Thanks for the heads up. We currently just have an [ad-hoc translation](https://github.com/GoogleCloudPlatform/google-cloud-python/blob/4258b764f401f8c890aefb5a6316d0d73ced66e7/google/cloud/datastore/connection.py#L315-L317) for such errors, so it's kind of a "wait and see".
| 2016-09-26T17:23:03Z | [] | [] |
Traceback (most recent call last):
File "test.py", line 8, in <module>
xact.put(e)
File "/Users/bogdan/.virtualenvs/bread/lib/python2.7/site-packages/google/cloud/datastore/batch.py", line 302, in __exit__
self.commit()
File "/Users/bogdan/.virtualenvs/bread/lib/python2.7/site-packages/google/cloud/datastore/transaction.py", line 167, in commit
super(Transaction, self).commit()
File "/Users/bogdan/.virtualenvs/bread/lib/python2.7/site-packages/google/cloud/datastore/batch.py", line 274, in commit
self._commit()
File "/Users/bogdan/.virtualenvs/bread/lib/python2.7/site-packages/google/cloud/datastore/batch.py", line 251, in _commit
self.project, self._commit_request, self._id)
File "/Users/bogdan/.virtualenvs/bread/lib/python2.7/site-packages/google/cloud/datastore/connection.py", line 584, in commit
response = self._datastore_api.commit(project, request)
File "/Users/bogdan/.virtualenvs/bread/lib/python2.7/site-packages/google/cloud/datastore/connection.py", line 314, in commit
return self._stub.Commit(request_pb)
File "/Users/bogdan/.virtualenvs/bread/lib/python2.7/site-packages/grpc/_channel.py", line 481, in __call__
return _end_unary_response_blocking(state, False, deadline)
File "/Users/bogdan/.virtualenvs/bread/lib/python2.7/site-packages/grpc/_channel.py", line 432, in _end_unary_response_blocking
raise _Rendezvous(state, None, None, deadline)
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with (StatusCode.INVALID_ARGUMENT, The value of property "x" is longer than 1048487 bytes.)>
| 5,874 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-2730 | 6dde514a8184cbc34ea598bfea72e740692f2c35 | diff --git a/core/google/cloud/streaming/transfer.py b/core/google/cloud/streaming/transfer.py
--- a/core/google/cloud/streaming/transfer.py
+++ b/core/google/cloud/streaming/transfer.py
@@ -387,7 +387,7 @@ def initialize_download(self, http_request, http):
# Unless the user has requested otherwise, we want to just
# go ahead and pump the bytes now.
if self.auto_transfer:
- self.stream_file(use_chunks=True)
+ self.stream_file(use_chunks=True, headers=http_request.headers)
def _normalize_start_end(self, start, end=None):
"""Validate / fix up byte range.
@@ -487,7 +487,7 @@ def _compute_end_byte(self, start, end=None, use_chunks=True):
return end_byte
- def _get_chunk(self, start, end):
+ def _get_chunk(self, start, end, headers=None):
"""Retrieve a chunk of the file.
:type start: int
@@ -496,11 +496,14 @@ def _get_chunk(self, start, end):
:type end: int
:param end: (Optional) end byte of the range.
+ :type headers: dict
+ :param headers: (Optional) Headers to be used for the ``Request``.
+
:rtype: :class:`google.cloud.streaming.http_wrapper.Response`
:returns: response from the chunk request.
"""
self._ensure_initialized()
- request = Request(url=self.url)
+ request = Request(url=self.url, headers=headers)
self._set_range_header(request, start, end=end)
return make_api_request(
self.bytes_http, request, retries=self.num_retries)
@@ -589,7 +592,7 @@ def get_range(self, start, end=None, use_chunks=True):
raise TransferRetryError(
'Zero bytes unexpectedly returned in download response')
- def stream_file(self, use_chunks=True):
+ def stream_file(self, use_chunks=True, headers=None):
"""Stream the entire download.
Writes retrieved bytes into :attr:`stream`.
@@ -598,6 +601,9 @@ def stream_file(self, use_chunks=True):
:param use_chunks: If False, ignore :attr:`chunksize`
and stream this download in a single request.
If True, streams via chunks.
+
+ :type headers: dict
+ :param headers: (Optional) Headers to be used for the ``Request``.
"""
self._ensure_initialized()
while True:
@@ -607,7 +613,8 @@ def stream_file(self, use_chunks=True):
else:
end_byte = self._compute_end_byte(self.progress,
use_chunks=use_chunks)
- response = self._get_chunk(self.progress, end_byte)
+ response = self._get_chunk(self.progress, end_byte,
+ headers=headers)
if self.total_size is None:
self._set_total(response.info)
response = self._process_response(response)
| Can't download encrypted files larger than 1 MB from cloud storage
I'm using `upload_from_file` with `encryption_key` to store files inside a Google Cloud Storage bucket. This works very well.
Now I'm trying to download these files again. Using `download_to_file` with the same `encryption_key`. This crashes with the following stack trace:
```
Traceback (most recent call last):
File "restore.py", line 40, in <module>
main()
File "restore.py", line 37, in main
download_from_storage(args.file, args.bucket, args.credentials, key)
File "restore.py", line 17, in download_from_storage
blob.download_to_file(my_file, encryption_key=encryption_key)
File "/Users/daniel/Projects/python/gcs-backup/env/lib/python2.7/site-packages/google/cloud/storage/blob.py", line 354, in download_to_file
download.initialize_download(request, client._connection.http)
File "/Users/daniel/Projects/python/gcs-backup/env/lib/python2.7/site-packages/google/cloud/streaming/transfer.py", line 390, in initialize_download
self.stream_file(use_chunks=True)
File "/Users/daniel/Projects/python/gcs-backup/env/lib/python2.7/site-packages/google/cloud/streaming/transfer.py", line 614, in stream_file
response = self._process_response(response)
File "/Users/daniel/Projects/python/gcs-backup/env/lib/python2.7/site-packages/google/cloud/streaming/transfer.py", line 528, in _process_response
raise TransferRetryError(response.content)
google.cloud.streaming.exceptions.TransferRetryError: The target object is encrypted by a customer-supplied encryption key.
```
I'm using `google-cloud-storage==0.20.0`.
This only occurs when downloading files larger than 1mb. Any file less than 1mb is downloaded and decrypted as expected. A file larger than 1mb might be downloaded in chunks and only the first 1mb chunk is saved to disk.
**Steps to reproduce**
1) Upload a file larger than 1mb using a custom encryption key
2) Try to download this file using the same encryption key
3) Find a 1mb chunk of your file at the destination
**demo code**
```
#!/usr/bin/env python
from tempfile import NamedTemporaryFile
import os
from google.cloud import storage
from google.cloud.storage import Blob
CREDENTIALS = './credentials.json'
BUCKET_NAME = '< SET BUCKET >'
ENCRYPTION_KEY = 'v3CtoFyEJmj52RGsSqze7C8fD6mzgpnd'
FILE_NAME = 'enc-test-file'
client = storage.Client.from_service_account_json(CREDENTIALS)
bucket = client.get_bucket(BUCKET_NAME)
def upload_to_storage(filename):
blob = Blob(FILE_NAME, bucket)
with open(filename, 'rb') as my_file:
blob.upload_from_file(my_file, encryption_key=ENCRYPTION_KEY)
def download_from_storage(filename):
blob = Blob(FILE_NAME, bucket)
with open(filename, 'wb') as my_file:
blob.download_to_file(my_file, encryption_key=ENCRYPTION_KEY)
def main():
size = 2097152 # 2mb file
f = NamedTemporaryFile(delete=False)
f.write("\0" * size)
f.close()
print('Uploading {} to Google Cloud Storage...'.format(f.name))
upload_to_storage(f.name)
print('Downloading {} from Google Cloud Storage...'.format(f.name))
download_from_storage(f.name)
os.remove(f.name)
if __name__ == '__main__':
main()
```
This might be because the encryption headers aren't added to the request for the next chunk.
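If that hypothesis is right, the fix is to thread the original request headers (which carry the customer-supplied `x-goog-encryption-*` fields) through every chunk request, not just the first one. A minimal sketch, independent of the library's actual `Download` class — the function names here are illustrative:

```python
def chunk_ranges(total_size, chunk_size):
    """Yield inclusive (start, end) byte ranges covering total_size bytes."""
    start = 0
    while start < total_size:
        end = min(start + chunk_size, total_size) - 1
        yield start, end
        start = end + 1

def build_chunk_request(url, start, end, headers=None):
    """Build one ranged request, preserving the caller's headers."""
    merged = dict(headers or {})
    merged['Range'] = 'bytes=%d-%d' % (start, end)
    return {'url': url, 'headers': merged}
```

With the headers carried along, the second and later chunks send the same encryption key as the first, so each 1 MB piece can be decrypted server-side.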
| @dbanck thanks for reporting!
@dbanck, I _think_ that the encryption key shouldn't get passed to `download_to_file()` but rather is passed to the `Blob` constructor.
See: https://github.com/GoogleCloudPlatform/google-cloud-python/blob/491519dc54cf2204703374ff374cabbd4cd14476/storage/google/cloud/storage/blob.py#L310-L311
This might be true for the current _master_, but the docs have a different example: https://googlecloudplatform.github.io/google-cloud-python/stable/storage-blobs.html#google.cloud.storage.blob.Blob.download_to_file
The files inside my local `site-packages` look like this: https://github.com/GoogleCloudPlatform/google-cloud-python/blob/umbrella-0.20.0/storage/google/cloud/storage/blob.py#L287
Should I update to a newer version? But there is no newer version than `0.20.0` on pypi: https://pypi.python.org/pypi/google-cloud-storage/0.20.0
@dbanck, thanks! I forgot that it changed since the last release.
> This might be because the encryption headers aren't added to the request for the next chunk.
I'll take a look at this and see if I can track down the issue.
@dbanck sorry for the slow update on this. I've been playing with it and I thought it was completely fixed on master but it appears that it may not be. I'm confirming that and then I'll try and get a fix in.
Don't worry, take your time. Thanks for the feedback and looking into it :)
| 2016-11-14T17:25:14Z | [] | [] |
Traceback (most recent call last):
File "restore.py", line 40, in <module>
main()
File "restore.py", line 37, in main
download_from_storage(args.file, args.bucket, args.credentials, key)
File "restore.py", line 17, in download_from_storage
blob.download_to_file(my_file, encryption_key=encryption_key)
File "/Users/daniel/Projects/python/gcs-backup/env/lib/python2.7/site-packages/google/cloud/storage/blob.py", line 354, in download_to_file
download.initialize_download(request, client._connection.http)
File "/Users/daniel/Projects/python/gcs-backup/env/lib/python2.7/site-packages/google/cloud/streaming/transfer.py", line 390, in initialize_download
self.stream_file(use_chunks=True)
File "/Users/daniel/Projects/python/gcs-backup/env/lib/python2.7/site-packages/google/cloud/streaming/transfer.py", line 614, in stream_file
response = self._process_response(response)
File "/Users/daniel/Projects/python/gcs-backup/env/lib/python2.7/site-packages/google/cloud/streaming/transfer.py", line 528, in _process_response
raise TransferRetryError(response.content)
google.cloud.streaming.exceptions.TransferRetryError: The target object is encrypted by a customer-supplied encryption key.
| 5,902 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-2750 | b92f0cf5af596dd41951a8505a7eea037fd1a48a | diff --git a/vision/google/cloud/vision/image.py b/vision/google/cloud/vision/image.py
--- a/vision/google/cloud/vision/image.py
+++ b/vision/google/cloud/vision/image.py
@@ -217,20 +217,22 @@ def detect_text(self, limit=10):
def _entity_from_response_type(feature_type, results):
"""Convert a JSON result to an entity type based on the feature."""
-
- detected_objects = []
feature_key = _REVERSE_TYPES[feature_type]
+ annotations = results.get(feature_key, ())
+ if not annotations:
+ return []
+ detected_objects = []
if feature_type == _FACE_DETECTION:
detected_objects.extend(
- Face.from_api_repr(face) for face in results[feature_key])
+ Face.from_api_repr(face) for face in annotations)
elif feature_type == _IMAGE_PROPERTIES:
detected_objects.append(
- ImagePropertiesAnnotation.from_api_repr(results[feature_key]))
+ ImagePropertiesAnnotation.from_api_repr(annotations))
elif feature_type == _SAFE_SEARCH_DETECTION:
- result = results[feature_key]
- detected_objects.append(SafeSearchAnnotation.from_api_repr(result))
+ detected_objects.append(
+ SafeSearchAnnotation.from_api_repr(annotations))
else:
- for result in results[feature_key]:
+ for result in annotations:
detected_objects.append(EntityAnnotation.from_api_repr(result))
return detected_objects
| Image.detect_faces raises KeyError when the image doesn't contain any faces
Expected behavior: An empty list is returned from detect_faces.
Ubuntu 16.04
Python 3.5.2
google-cloud version: 0.21.0
## Code example
~~~python
import google.cloud.vision.client
import base64
test_image_base64 = '''iVBORw0KGgoAAAANSUhEUgAAAOEAAADhCAMAAAAJbSJIAAAAZlBMVEU7lNn1+Pr6+/v7/Ps0kdg2
ktguj9jy9/pxruHS5PPi7vfo8fj4+fsijNdBl9rs8/jB2O/F3PDY6PWx0Oxgp9/O4vNWod15s+On
zOtoquC41e6Mveafx+mIueXD2e9/t+Sawudaot4YIx/WAAAFu0lEQVR4nO2d23biOgxAE8dyIDHJ
ALkAgYH+/0+euECBTqFcbEnhaL+0j+wlWb7GjiJBEARBEARBEARBEARBEARBEARBEARBEP4BtDET
hzFaU/8Y/4AxH928zsq4zOp592EMUP8kn4DWy0WWWmVtHMe2/5tmi6XWbyNpis04VU7uhFXpeFMY
6p/mBW26LLnUO0gmWWeG3yDBbH/2OzhuB94ewVT1Vb+9Y10N2RGidapu+DlUuo4Gq6ib6W9+n47T
ZqCtcdKW9wj2imU7of6xTwCTubrVAs+xaj4ZXKZCMU3u9HMk02JgirqpHxHsFethNUbdzO5rgidU
NiRF3eSPCrp6MxxFXZX31phzbPkxEEVdZc8IukStBqEITfZ4ih4VmwFUVICHi8yZ4oz/CA7MC4JO
kf04XK8e6we/k6yYN0W9eyWCn1HcsVbUoxf9HCPGilDkz/UT59ic8RDVjF/NUYcas12gMmsfgn21
WTNVhMqLn6NimacQzV5vhHssz47fzP3kqEPNGeapHqXeBOM45ddl9MNRXznqsDPglqfm5cHMJWrH
LE8h8urnYFZs9MpvCPsg8hqCe+wKT7DqFOGu5fvHUFNGhl6mFP/CqMfQXnuKI3bGxlC3AfwcLRdF
Mw4Rwj6IXKZRMPJfZvaoEY9iYwIU0oPhlEUQoQkl2M+FWawQ+5w1fYfFLAqaMphgHOcMgmi615aA
b5PQTzE8zwu/w2A9A8IM2E6QD92M92nTJWpFnKZQhKwzjpJ4BVxvQrZCh93Qpukk2HjmiJrSnpfS
oQV7RdIY6jZkZ7gnIZ1Dha6kDtJq6mW/8DdI9xNhGT5J+zRd0hmaRfgk7dN0QZemEGj54hI7Josh
NDmCIOUUCqOvcND1FzjNkLIhmqBTwxNkq4p9b4giSNcjgtd97VukROumOugKzTkJ0Vm3kMuIl1At
Kk6QCo1bj6KZIxokPwdJDKHBCmEfRJJRjW4RDUlGNb5P0NyC5nQNXimlKqY6+DLbmeGUIkvDblhc
YmcUlQYQ1mi+DHOCGAKg+TlFgqOKUGCNSh0JwewCKlRDgkNueoRqSLCNqLd4nUXfXWwJDNFmh46k
wzc0a0xDRfCRCdZC28GQYLkNc1hKMzDF2Fg7MyTYYkM2/ENg+OftDf+iGv59+xi+vyFFlr5/LX3/
/tDTR793GhKM2t5/5K23qIYEsyecszRfhgRnaqBBNSTYuIAIs9IkFMe9MddL45xk76lGXBGuSfae
EAc1NCcwMbt8ig7//7BDCgWiIc2RoQlaMbU50VkMtFJDddRbd2iGBONuByzRzrVRnfSGJ2+AfBSb
kZ2gRZrm030r+/6noKFBSVNLeOcnzooixUriEZydbood7iNQIKSpzSi/IsXYJqX8KAjnUA3xzRGT
OvhXsjXxV7LhJ4nU1/CEuUDphJ1RXxuhN4FvHCD+WN3rnZ4/weBejMCDU9oPufdAFLCcqpo+hH0Q
lwENl/QhjEKOvynH3OdA9NRzFr9jqe81+cJ0gQw7HiGMQn16QfORxc9AE2Bx2HK45+sL0wa4v7Rl
k6MO/yeGuV09D8Zzv69qbo+VeF7QsFnBp8wcePKRpyuCJcd3kXSVens5IOUo+Hlfsh9FS3/R3hVM
6yWKNuXVT5zjRZGzYK+4vPNhzuuocslY0L0LmL3W9fN/BhGKl0bhquYyYboO6MXTJdXGC3YPd/wA
mFH5XKYm5YjbUO0KGlZ3P5R7FkC1AuZN8ASYdvago1WzdiAB3KOLXf5Iqqp8x2+ofRswxfzXV8e/
/NJ5MagA7ukdF+XNp+MP+ZmUiyH6OcDApo5vStokrjcwUD9HXxyr9axMfiw7ViXlbF31pZf6Z74G
GF11qzyNlVJ2L2pt/3+c5quu0gMO3xmgjWna3Xw6zsrUpmU2ns53bWPM0KN3gbM0kyPmzewEQRAE
QRAEQRAEQRAEQRAEQRAEQRAEwQ//AapBX5sm4PxNAAAAAElFTkSuQmCC'''
test_image_bytes = base64.b64decode(test_image_base64)
client = google.cloud.vision.Client()
image = client.image(
content=test_image_bytes,
)
faces = image.detect_faces()
print(faces)
~~~
## Stack trace
Traceback (most recent call last):
File "test.py", line 42, in <module>
faces = image.detect_faces()
File "/home/tal/virtualenv/lib/python3.5/site-packages/google/cloud/vision/image.py", line 138, in detect_faces
return self._detect_annotation(features)
File "/home/tal/virtualenv/lib/python3.5/site-packages/google/cloud/vision/image.py", line 125, in _detect_annotation
_entity_from_response_type(feature.feature_type, results))
File "/home/tal/virtualenv/lib/python3.5/site-packages/google/cloud/vision/image.py", line 226, in _entity_from_response_type
Face.from_api_repr(face) for face in results[feature_key])
KeyError: 'faceAnnotations'
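The expected fix is for the client to treat a missing annotation key as an empty result. A hedged sketch of that defensive decoding, standalone rather than the library's actual internals (`faceAnnotations` is the key the API uses; the helper name is made up):

```python
def annotations_or_empty(results, feature_key):
    """Return the annotation list for feature_key, or [] when the API
    response omits the key because nothing was detected."""
    annotations = results.get(feature_key)
    if not annotations:
        return []
    return list(annotations)
```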
| 2016-11-17T16:30:24Z | [] | [] |
Traceback (most recent call last):
File "test.py", line 42, in <module>
faces = image.detect_faces()
File "/home/tal/virtualenv/lib/python3.5/site-packages/google/cloud/vision/image.py", line 138, in detect_faces
return self._detect_annotation(features)
File "/home/tal/virtualenv/lib/python3.5/site-packages/google/cloud/vision/image.py", line 125, in _detect_annotation
_entity_from_response_type(feature.feature_type, results))
File "/home/tal/virtualenv/lib/python3.5/site-packages/google/cloud/vision/image.py", line 226, in _entity_from_response_type
Face.from_api_repr(face) for face in results[feature_key])
KeyError: 'faceAnnotations'
| 5,905 |
||||
googleapis/google-cloud-python | googleapis__google-cloud-python-2787 | d775489da48d594ce5173a4078af4a0d3d2863f4 | diff --git a/bigquery/google/cloud/bigquery/_helpers.py b/bigquery/google/cloud/bigquery/_helpers.py
--- a/bigquery/google/cloud/bigquery/_helpers.py
+++ b/bigquery/google/cloud/bigquery/_helpers.py
@@ -58,10 +58,11 @@ def _record_from_json(value, field):
"""Coerce 'value' to a mapping, if set or not nullable."""
if _not_null(value, field):
record = {}
- for subfield, cell in zip(field.fields, value['f']):
+ record_iter = zip(field.fields, value['f'])
+ for subfield, cell in record_iter:
converter = _CELLDATA_FROM_JSON[subfield.field_type]
- if field.mode == 'REPEATED':
- value = [converter(item, subfield) for item in cell['v']]
+ if subfield.mode == 'REPEATED':
+ value = [converter(item['v'], subfield) for item in cell['v']]
else:
value = converter(cell['v'], subfield)
record[subfield.name] = value
@@ -103,7 +104,7 @@ def _row_from_json(row, schema):
for field, cell in zip(schema, row['f']):
converter = _CELLDATA_FROM_JSON[field.field_type]
if field.mode == 'REPEATED':
- row_data.append([converter(item, field)
+ row_data.append([converter(item['v'], field)
for item in cell['v']])
else:
row_data.append(converter(cell['v'], field))
| Support (non-legacy) SQL nested data types
`STRUCT<t>` and `ARRAY<t>` seem to be the types.
See #2342. h/t to @c0b for filing.
---
From #2585: [Sample app that reproduces this](https://github.com/GoogleCloudPlatform/python-docs-samples/blob/176f161cae3d4f01b01456a6e403faf6645414a1/bigquery/cloud-client/simple_app.py)
Stacktrace:
``` python
Traceback (most recent call last):
File "simple_app.py", line 57, in <module>
query_shakespeare()
File "simple_app.py", line 47, in query_shakespeare
page_token=page_token)
File "lib/python2.7/site-packages/google/cloud/bigquery/query.py", line 401, in fetch_data
rows_data = _rows_from_json(response.get('rows', ()), self.schema)
File "lib/python2.7/site-packages/google/cloud/bigquery/_helpers.py", line 98, in _rows_from_json
for item in cell['v']])
File "lib/python2.7/site-packages/google/cloud/bigquery/_helpers.py", line 61, in _record_from_json
for subfield, cell in zip(field.fields, value['f']):
KeyError: 'f'
```
---
Stepping into the debugger:
``` python
ipdb> value
{u'v': {u'f': [{u'v': u'hamlet'}, {u'v': u'5318'}]}}
ipdb> field.__dict__
{'field_type': u'RECORD', 'description': None, 'name': u'title', 'fields': [<google.cloud.bigquery.schema.SchemaField object at 0x7f4ba9227550>, <google.cloud.bigquery.schema.SchemaField object at 0x7f4ba9227690>], 'mode': u'REPEATED'}
```
(Added by @dhermes)
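The debugger output above shows the shape that trips up the decoder: every cell is wrapped in `{'v': ...}`, a RECORD adds an `{'f': [...]}` layer, and a REPEATED field wraps each item in another `{'v': ...}`. A standalone sketch of unwrapping that shape (illustrative helpers, not the library's actual `_helpers` code):

```python
def unwrap_cell(cell, repeated=False):
    """Strip BigQuery's {'v': ...} wrapper from one cell.

    For REPEATED fields each item carries its own {'v': ...} wrapper,
    which is the extra layer the failing decoder did not strip.
    """
    if repeated:
        return [item['v'] for item in cell['v']]
    return cell['v']

def unwrap_record(value, subfield_names):
    """Map subfield names onto a RECORD's {'f': [{'v': ...}, ...]} cells."""
    cells = value['f']
    return {name: cell['v'] for name, cell in zip(subfield_names, cells)}
```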
| A simple statement works fine
``` python
>>> from google.cloud import bigquery
>>> client = bigquery.Client()
>>> query = client.run_sync_query('SELECT 1')
>>> query.use_legacy_sql = False
>>> query.run()
>>> query.rows
[(1,)]
```
Sending a query with nested types also works fine
``` python
In [1]: from google.cloud import bigquery
In [2]: client = bigquery.Client()
In [3]: sql_str = (
...: 'SELECT ARRAY['
...: 'STRUCT("zx83" AS id, 1 AS a, 2 AS b), '
...: '("f8f7", 4, 7)'
...: '] '
...: 'd')
In [4]: query = client.run_sync_query(sql_str)
In [5]: query.use_legacy_sql = False
In [6]: query.run()
In [7]: query.__dict__
Out[7]:
{'_client': <google.cloud.bigquery.client.Client at 0x7fb7cf39a290>,
'_configuration': <google.cloud.bigquery.query._SyncQueryConfiguration at 0x7fb7cf3b0fd0>,
'_job': None,
'_properties': {u'cacheHit': False,
u'jobComplete': True,
u'jobReference': {u'jobId': u'job_ruoSHLwjeg_lSfYZq8dmrjQ0Eg4',
u'projectId': u'precise-truck-742'},
u'kind': u'bigquery#queryResponse',
u'rows': [{u'f': [{u'v': [{u'v': {u'f': [{u'v': u'zx83'},
{u'v': u'1'},
{u'v': u'2'}]}},
{u'v': {u'f': [{u'v': u'f8f7'}, {u'v': u'4'}, {u'v': u'7'}]}}]}]}],
u'schema': {u'fields': [{u'fields': [{u'mode': u'NULLABLE',
u'name': u'id',
u'type': u'STRING'},
{u'mode': u'NULLABLE', u'name': u'a', u'type': u'INTEGER'},
{u'mode': u'NULLABLE', u'name': u'b', u'type': u'INTEGER'}],
u'mode': u'REPEATED',
u'name': u'd',
u'type': u'RECORD'}]},
u'totalBytesProcessed': u'0',
u'totalRows': u'1'},
'_udf_resources': (),
'query': 'SELECT ARRAY[STRUCT("zx83" AS id, 1 AS a, 2 AS b), ("f8f7", 4, 7)] d'}
```
but the `rows` accessor fails
``` python
In [8]: query.rows
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-8-27b591eefa89> in <module>()
----> 1 query.rows
.../google/cloud/bigquery/query.pyc in rows(self)
223 :returns: fields describing the schema (None until set by the server).
224 """
--> 225 return _rows_from_json(self._properties.get('rows', ()), self.schema)
226
227 @property
.../google/cloud/bigquery/_helpers.pyc in _rows_from_json(rows, schema)
96 if field.mode == 'REPEATED':
97 row_data.append([converter(item, field)
---> 98 for item in cell['v']])
99 else:
100 row_data.append(converter(cell['v'], field))
.../google/cloud/bigquery/_helpers.pyc in _record_from_json(value, field)
59 if _not_null(value, field):
60 record = {}
---> 61 for subfield, cell in zip(field.fields, value['f']):
62 converter = _CELLDATA_FROM_JSON[subfield.field_type]
63 if field.mode == 'REPEATED':
KeyError: 'f'
```
Just for fun, here are those last properties as JSON:
``` json
{
"cacheHit": false,
"jobComplete": true,
"jobReference": {
"jobId": "job_ruoSHLwjeg_lSfYZq8dmrjQ0Eg4",
"projectId": "precise-truck-742"
},
"kind": "bigquery#queryResponse",
"rows": [
{
"f": [
{
"v": [
{
"v": {
"f": [
{
"v": "zx83"
},
{
"v": "1"
},
{
"v": "2"
}
]
}
},
{
"v": {
"f": [
{
"v": "f8f7"
},
{
"v": "4"
},
{
"v": "7"
}
]
}
}
]
}
]
}
],
"schema": {
"fields": [
{
"fields": [
{
"mode": "NULLABLE",
"name": "id",
"type": "STRING"
},
{
"mode": "NULLABLE",
"name": "a",
"type": "INTEGER"
},
{
"mode": "NULLABLE",
"name": "b",
"type": "INTEGER"
}
],
"mode": "REPEATED",
"name": "d",
"type": "RECORD"
}
]
},
"totalBytesProcessed": "0",
"totalRows": "1"
}
```
Also, where can I get alias names? In a query like the one below I expect to access results in a dict-like way, e.g. `result.a` / `result.b` or `result["a"]` / `result["b"]`, because in my actual app (in Node.js) the query-generated schema is more complicated with more fields, and accessing all those fields positionally, as in the tuple `(1, 2)`, is inconvenient and error prone. Please let me know whether this should be a separate issue of its own, or a documentation request if you already support it (I don't see an example in the current docs):
https://googlecloudplatform.github.io/google-cloud-python/stable/bigquery-query.html
``` py
In [56]: query = client.run_sync_query('SELECT 1 AS a, 2 AS b'); query.use_legacy_sql = False; query.run()
In [57]: query.__dict__
Out[57]:
{'_client': <gcloud.bigquery.client.Client at 0x7f89a047ad90>,
'_configuration': <gcloud.bigquery.query._SyncQueryConfiguration at 0x7f89a0404710>,
'_job': None,
'_properties': {u'cacheHit': False,
u'jobComplete': True,
u'jobReference': {u'jobId': u'job_POdEXqw6MgHtL-WDZX21BA7_fII',
u'projectId': u'hb-inspector'},
u'kind': u'bigquery#queryResponse',
u'rows': [{u'f': [{u'v': u'1'}, {u'v': u'2'}]}],
u'schema': {u'fields': [{u'mode': u'NULLABLE',
u'name': u'a',
u'type': u'INTEGER'},
{u'mode': u'NULLABLE', u'name': u'b', u'type': u'INTEGER'}]},
u'totalBytesProcessed': u'0',
u'totalRows': u'1'},
'_udf_resources': (),
'query': 'SELECT 1 AS a, 2 AS b'}
In [58]: query.rows
Out[58]: [(1, 2)]
```
I just found that the `bq` tool has a command switch `--apilog` that can save a detailed log, which exposed that the library used by the `bq` tool's Python code is `google-api-python-client`. That library is able to handle nested results — at least it does not crash — but the data type handling is still not there: google/google-api-python-client#281
INFO:root:user-agent: google-api-python-client/1.3.1 (gzip)
Bump - this is blocking a critical sample.
@tseaver You still planning to handle?
Another bump here.
**blocked on this** can't use google-cloud-python with the new BigQuery standard SQL
Another bump, this is blocking samples and usage of this service.
/cc @omaray @jgeewax
/cc @stephenplusplus
Another bump, blocked on this.
Currently using a **manual** fix to:
`google/cloud/bigquery/_helpers.py, line 61, in _record_from_json`
Changed from
```python
for subfield, cell in zip(field.fields, value['f']):
```
to
```python
record_iter = None
if 'f' in value:
record_iter = zip(field.fields, value['f'])
elif 'v' in value and 'f' in value['v']:
record_iter = zip(field.fields, value['v']['f'])
if record_iter:
for subfield, cell in record_iter:
```
Also changed `line 63`; it was referencing the parent field's mode instead of the subfield's.
Changed from
```python
if field.mode == 'REPEATED':
```
to
```python
if subfield.mode == 'REPEATED':
``` | 2016-11-30T20:03:07Z | [] | [] |
Traceback (most recent call last):
File "simple_app.py", line 57, in <module>
query_shakespeare()
File "simple_app.py", line 47, in query_shakespeare
page_token=page_token)
File "lib/python2.7/site-packages/google/cloud/bigquery/query.py", line 401, in fetch_data
rows_data = _rows_from_json(response.get('rows', ()), self.schema)
File "lib/python2.7/site-packages/google/cloud/bigquery/_helpers.py", line 98, in _rows_from_json
for item in cell['v']])
File "lib/python2.7/site-packages/google/cloud/bigquery/_helpers.py", line 61, in _record_from_json
for subfield, cell in zip(field.fields, value['f']):
KeyError: 'f'
| 5,911 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-2907 | 61d63e56599e18de6c1b8a54c33831a95c02769a | diff --git a/bigtable/google/cloud/bigtable/client.py b/bigtable/google/cloud/bigtable/client.py
--- a/bigtable/google/cloud/bigtable/client.py
+++ b/bigtable/google/cloud/bigtable/client.py
@@ -65,6 +65,14 @@
READ_ONLY_SCOPE = 'https://www.googleapis.com/auth/bigtable.data.readonly'
"""Scope for reading table data."""
+# NOTE: 'grpc.max_message_length' will no longer be recognized in
+# grpcio 1.1 and later.
+_MAX_MSG_LENGTH_100MB = 100 * 1024 * 1024
+_GRPC_MAX_LENGTH_OPTIONS = (
+ ('grpc.max_message_length', _MAX_MSG_LENGTH_100MB),
+ ('grpc.max_receive_message_length', _MAX_MSG_LENGTH_100MB),
+)
+
def _make_data_stub(client):
"""Creates gRPC stub to make requests to the Data API.
@@ -77,7 +85,8 @@ def _make_data_stub(client):
"""
if client.emulator_host is None:
return make_secure_stub(client.credentials, client.user_agent,
- bigtable_pb2.BigtableStub, DATA_API_HOST)
+ bigtable_pb2.BigtableStub, DATA_API_HOST,
+ extra_options=_GRPC_MAX_LENGTH_OPTIONS)
else:
return make_insecure_stub(bigtable_pb2.BigtableStub,
client.emulator_host)
diff --git a/core/google/cloud/_helpers.py b/core/google/cloud/_helpers.py
--- a/core/google/cloud/_helpers.py
+++ b/core/google/cloud/_helpers.py
@@ -465,7 +465,7 @@ def _name_from_project_path(path, project, template):
return match.group('name')
-def make_secure_channel(credentials, user_agent, host):
+def make_secure_channel(credentials, user_agent, host, extra_options=None):
"""Makes a secure channel for an RPC service.
Uses / depends on gRPC.
@@ -480,14 +480,21 @@ def make_secure_channel(credentials, user_agent, host):
:type host: str
:param host: The host for the service.
+ :type extra_options: tuple
+ :param extra_options: (Optional) Extra gRPC options used when creating the
+ channel.
+
:rtype: :class:`grpc._channel.Channel`
:returns: gRPC secure channel with credentials attached.
"""
target = '%s:%d' % (host, http_client.HTTPS_PORT)
http_request = google_auth_httplib2.Request(http=httplib2.Http())
- options = (
- ('grpc.primary_user_agent', user_agent),
- )
+
+ user_agent_option = ('grpc.primary_user_agent', user_agent)
+ if extra_options is not None:
+ options = (user_agent_option,) + extra_options
+ else:
+ options = (user_agent_option,)
return google.auth.transport.grpc.secure_authorized_channel(
credentials,
http_request,
@@ -495,7 +502,8 @@ def make_secure_channel(credentials, user_agent, host):
options=options)
-def make_secure_stub(credentials, user_agent, stub_class, host):
+def make_secure_stub(credentials, user_agent, stub_class, host,
+ extra_options=None):
"""Makes a secure stub for an RPC service.
Uses / depends on gRPC.
@@ -513,10 +521,15 @@ def make_secure_stub(credentials, user_agent, stub_class, host):
:type host: str
:param host: The host for the service.
+ :type extra_options: tuple
+ :param extra_options: (Optional) Extra gRPC options passed when creating
+ the channel.
+
:rtype: object, instance of ``stub_class``
:returns: The stub object used to make gRPC requests to a given API.
"""
- channel = make_secure_channel(credentials, user_agent, host)
+ channel = make_secure_channel(credentials, user_agent, host,
+ extra_options=extra_options)
return stub_class(channel)
| Bigtable Python Client Max Row size is 4mb
On the big table documents, it says the max cell size is 100mb. However, when I try to read a row with a cell of size 10mb using the Bigtable Python client, I get the following error:
```
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/google/cloud/happybase/table.py", line 190, in row
row, filter_=filter_)
File "/usr/local/lib/python2.7/dist-packages/google/cloud/bigtable/table.py", line 234, in read_row
rows_data.consume_all()
File "/usr/local/lib/python2.7/dist-packages/google/cloud/bigtable/row_data.py", line 323, in consume_all
self.consume_next()
File "/usr/local/lib/python2.7/dist-packages/google/cloud/bigtable/row_data.py", line 261, in consume_next
response = six.next(self._response_iterator)
File "/usr/local/lib/python2.7/dist-packages/grpc/_channel.py", line 344, in next
return self._next()
File "/usr/local/lib/python2.7/dist-packages/grpc/_channel.py", line 335, in _next
raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with (StatusCode.INTERNAL, Max message size exceeded)>
```
This max size seems to be hard-coded in the gRPC library. Has anybody been able to read large rows using the Bigtable Python client? Any ideas for workarounds, or for how I can set the max size?
| @nathanielmanistaatgoogle WDYT?
A few gRPC users have had this kind of problem and we've started a discussion in [`grpc/grpc.github.io` issue 371](https://github.com/grpc/grpc.github.io/issues/371).
For now you can cross your fingers and see what happens when you pass a channel options value like `options=(('grpc.max_message_length', 100 * 1024 * 1024),)` to [`grpc.insecure_channel`](http://www.grpc.io/grpc/python/grpc.html#grpc.insecure_channel) or [`grpc.secure_channel`](http://www.grpc.io/grpc/python/grpc.html#grpc.secure_channel) (know that `grpc.max_message_length` is broken up into `grpc.max_send_message_length` and `grpc.max_receive_message_length` in gRPC Python 1.1-and-later). It might work.
If it doesn't work, then you've hit some limit inside gRPC that isn't overridable and you'll have to break up the large message into a stream of smaller ones.
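Merging extra options with the default user-agent option — the approach the patch for this issue takes in `make_secure_channel` — can be sketched without opening a live gRPC channel. The resulting tuple is what would be passed as `options=` to `grpc.secure_channel` / `grpc.insecure_channel`; the helper name is illustrative:

```python
MAX_MSG_LENGTH = 100 * 1024 * 1024  # 100 MB

def build_channel_options(user_agent, extra_options=None):
    """Combine the default user-agent option with any extra gRPC options.

    'grpc.max_message_length' is the pre-1.1 knob; grpcio 1.1+ splits it
    into send/receive variants, so setting both keeps compatibility.
    """
    user_agent_option = ('grpc.primary_user_agent', user_agent)
    if extra_options is None:
        return (user_agent_option,)
    return (user_agent_option,) + tuple(extra_options)

options = build_channel_options(
    'my-app/1.0',
    (
        ('grpc.max_message_length', MAX_MSG_LENGTH),
        ('grpc.max_receive_message_length', MAX_MSG_LENGTH),
    ),
)
```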
@nathanielmanistaatgoogle The "you" in "when you pass a channel options value" here is the library maintainers, yes? As a user, @brandon-white isn't directly creating any gRPC object, he is dealing with our Bigtable classes (i.e. the nouns) and calling the API via their instance methods (i.e. the verbs).
Thanks @nathanielmanistaatgoogle I'll try this but, ideally, I would prefer a solution where I do not edit the Bigtable Python client.
For the Google Bigtable Client folks, have you encountered any use cases where the rows and cells are large? If so, how do you break up these large rows and cells into streams? Are large rows and cells supported through the Python Client?
@mbrukman Have you run into this in other clients?
While there's a per-response limit in gRPC, it is possible to retrieve larger cell values via streaming responses and reconstructing them client-side. [`ReadRowsResponse`](https://cloud.google.com/bigtable/docs/reference/data/rpc/google.bigtable.v2#readrowsresponse) includes a list of [`CellChunk`](https://cloud.google.com/bigtable/docs/reference/data/rpc/google.bigtable.v2#cellchunk)s which are the pieces of cells that can be reconstructed to form the larger values.
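In spirit, the client-side reconstruction described above amounts to concatenating the value pieces in arrival order. A toy sketch with stand-in byte strings (not real `CellChunk` messages):

```python
# Stand-ins for the value pieces carried by successive chunks that belong
# to the same cell; the real messages signal continuation via value_size.
chunk_values = [b'first-piece-', b'second-piece-', b'final-piece']

# Reassembling the full cell value is just concatenation in order.
cell_value = b''.join(chunk_values)
```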
/cc: @sduskis, @garye
Thank you @mbrukman ! I don't see these calls in the BigTable Python Client. Does this mean I need to write my own client if I want to consume these large cell values? Are there any plans to incorporate this into existing clients?
The Python client already deals with `ReadRowsResponse` under the covers. This seems like something we need to change in the service.
I'll take that back. This seems like a client setting. In the java client, we set the value on the netty channel (maxMessageSize) to be 256MB. Is that a setting that the python client controls?
@sduskis I looked for that but didn't see any option to set that in the [Python client](https://googlecloudplatform.github.io/google-cloud-python/stable/_modules/google/cloud/bigtable/client.html#Client). So it seems like we need to add this max_message_size config to the BigTable Python Client?
@dhermes, do you have any idea where grpc settings are set, and if max_message_size is a settable property?
I do not, though @nathanielmanistaatgoogle would be a good person to ask. I'm not 100% clear on if "grpc settings" are a global thing or a per-call setting? The [`options=` described above][1] is how we'd set gRPC metadata on a per-call basis.
[1]: https://github.com/GoogleCloudPlatform/google-cloud-python/issues/2880#issuecomment-268020287
It looks like the channel options get set in [`core/google/cloud/_helpers.py:489`](https://github.com/GoogleCloudPlatform/google-cloud-python/blob/d20f3bb7458915d2601b670b7f4df31bc9ead473/core/google/cloud/_helpers.py#L489). I don't see a way at the moment to pass options specifically for Bigtable.
I think [`grpc.max_receive_message_length`](https://github.com/grpc/grpc/blob/5904a952747ce2c95bd95c5a2b27dcdf41029a43/include/grpc/impl/codegen/grpc_types.h#L157) would be the option to set.
We might ultimately want to set the max send message length for Bigtable also. It looks like it is [currently unlimited](https://github.com/grpc/grpc/blob/5904a952747ce2c95bd95c5a2b27dcdf41029a43/include/grpc/impl/codegen/grpc_types.h#L260), but there is a *TODO* there to change that.
So do I need to make the PR to this or do any of the committers have bandwidth to take a look at this setting? What is the process for getting fixes like this out?
Hello @brandon-white!
I think I have a patch for this, do you have any sample code that I can try with my branch?
@daspecster Thank you! I do not really have any custom code, I simply use the Bigtable Python API to query large cells.
```
from google.cloud import bigtable
client = bigtable.Client(project=project_id, admin=True)
instance = client.instance(instance_id)
table = instance.table(table_id)
table.read_row(row_key)
```
It appears that passing the `('grpc.max_receive_message_length', 100 * 1024 * 1024)` in `options` doesn't work. :(
@mbrukman it appears to me that there is some handling of [chunks](https://github.com/GoogleCloudPlatform/google-cloud-python/blob/455eaf62653dd2ff6b4202a5392e059cf8eda5d0/bigtable/google/cloud/bigtable/row_data.py#L255) on the client side.
But in this case it appears to not get the chunks back yet.
```python
row_data = table.read_rows('large-row')
print(row_data.consume_next())
```
```bash
Traceback (most recent call last):
File "/Documents/test_bigtable.py", line 14, in <module>
print(row_data.consume_next())
File "/Documents/test/.tox/py27/lib/python2.7/site-packages/google_cloud_bigtable-0.22.0-py2.7.egg/google/cloud/bigtable/row_data.py", line 261, in consume_next
response = six.next(self._response_iterator)
File "python_build/bdist.macosx-10.12-intel/egg/grpc/_channel.py", line 344, in next
File "python_build/bdist.macosx-10.12-intel/egg/grpc/_channel.py", line 335, in _next
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with (StatusCode.INTERNAL, Max message size exceeded)>
```
I'm missing how to get `ReadRowsRequest` to return a `ReadRowsResponse` where the chunks are 4mb or less?
`ReadRowsResponse`s are currently not chunked on the server side, so there's no way to force it to happen. That's an enhancement we wanted to allow via the API, but didn't implement yet.
I think we need some help from someone on the grpc team to solve this.
I wasn't sure based on the previous conversation. Thanks for clearing that up for me @sduskis!
Let me know if there's anything I can do to help. I'm not entirely sure who to ask about this?
I'll ping some folks who might know how to help.
Thanks all! So for now, it looks like this is a feature request which needs an owner?
There's an [answer in this thread](https://github.com/GoogleCloudPlatform/google-cloud-python/issues/2880#issuecomment-268020287) to the question by the grpc python lead @nathanielmanistaatgoogle. @gamorris's [answer](https://github.com/GoogleCloudPlatform/google-cloud-python/issues/2880#issuecomment-268020287) points to [the code that needs to be changed](https://github.com/GoogleCloudPlatform/google-cloud-python/blob/d20f3bb7458915d2601b670b7f4df31bc9ead473/core/google/cloud/_helpers.py#L489).
@dhermes, are you the right person to change this?
@dhermes: the "you" in "when you pass a channel options value" is "the caller of the channel construction functions [`grpc.insecure_channel`](http://www.grpc.io/grpc/python/grpc.html#grpc.insecure_channel) and [`grpc.secure_channel`](http://www.grpc.io/grpc/python/grpc.html#grpc.secure_channel)". | 2016-12-29T17:18:47Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/google/cloud/happybase/table.py", line 190, in row
row, filter_=filter_)
File "/usr/local/lib/python2.7/dist-packages/google/cloud/bigtable/table.py", line 234, in read_row
rows_data.consume_all()
File "/usr/local/lib/python2.7/dist-packages/google/cloud/bigtable/row_data.py", line 323, in consume_all
self.consume_next()
File "/usr/local/lib/python2.7/dist-packages/google/cloud/bigtable/row_data.py", line 261, in consume_next
response = six.next(self._response_iterator)
File "/usr/local/lib/python2.7/dist-packages/grpc/_channel.py", line 344, in next
return self._next()
File "/usr/local/lib/python2.7/dist-packages/grpc/_channel.py", line 335, in _next
raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with (StatusCode.INTERNAL, Max message size exceeded)>
| 5,924 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-2941 | 7f64d75c66811398c169c733a30e4f0e9120cbbb | diff --git a/resource_manager/google/cloud/resource_manager/project.py b/resource_manager/google/cloud/resource_manager/project.py
--- a/resource_manager/google/cloud/resource_manager/project.py
+++ b/resource_manager/google/cloud/resource_manager/project.py
@@ -58,6 +58,7 @@ def __init__(self, project_id, client, name=None, labels=None):
self.number = None
self.labels = labels or {}
self.status = None
+ self.parent = None
def __repr__(self):
return '<Project: %r (%r)>' % (self.name, self.project_id)
@@ -85,6 +86,8 @@ def set_properties_from_api_repr(self, resource):
self.number = resource['projectNumber']
self.labels = resource.get('labels', {})
self.status = resource['lifecycleState']
+ if 'parent' in resource:
+ self.parent = resource['parent']
@property
def full_name(self):
@@ -202,7 +205,12 @@ def update(self, client=None):
"""
client = self._require_client(client)
- data = {'name': self.name, 'labels': self.labels}
+ data = {
+ 'name': self.name,
+ 'labels': self.labels,
+ 'parent': self.parent,
+ }
+
resp = client._connection.api_request(
method='PUT', path=self.path, data=data)
self.set_properties_from_api_repr(resp)
| Resource Manager Labels Update is broken
The resource manager `update` fails when setting a label in the style shown in the docs.
```python
>>> client = resource_manager.Client()
>>> project = client.fetch_project('purple-spaceship-124')
>>> project.labels['color'] = 'purple'
>>> project
<Project: None ('purple-spaceship-124')>
>>> project.update()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/waprin/.virtualenvs/resourcemanager/lib/python2.7/site-packages/google/cloud/resource_manager/project.py", line 207, in update
method='PUT', path=self.path, data=data)
File "/Users/waprin/.virtualenvs/resourcemanager/lib/python2.7/site-packages/google/cloud/_http.py", line 336, in api_request
error_info=method + ' ' + url)
google.cloud.exceptions.BadRequest: 400 Request contains an invalid argument. (PUT https://cloudresourcemanager.googleapis.com/v1beta1/projects/purple-spaceship-124)
```
Will likely submit a PR to fix this myself, digging into it, making issue in case I'm doing something silly.
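The fix that landed (see the patch above) echoes the project's `parent` back in the update payload. A sketch of the resulting request body shape, with purely illustrative values:

```python
name = 'purple-spaceship-124'
labels = {'color': 'purple'}
parent = {'type': 'organization', 'id': '123456'}  # hypothetical parent resource

# The PUT body must include the parent, or the API rejects the request
# with "400 Request contains an invalid argument".
data = {
    'name': name,
    'labels': labels,
    'parent': parent,
}
```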
| Thanks @waprin | 2017-01-18T00:29:10Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/waprin/.virtualenvs/resourcemanager/lib/python2.7/site-packages/google/cloud/resource_manager/project.py", line 207, in update
method='PUT', path=self.path, data=data)
File "/Users/waprin/.virtualenvs/resourcemanager/lib/python2.7/site-packages/google/cloud/_http.py", line 336, in api_request
error_info=method + ' ' + url)
google.cloud.exceptions.BadRequest: 400 Request contains an invalid argument. (PUT https://cloudresourcemanager.googleapis.com/v1beta1/projects/purple-spaceship-124)
| 5,926 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-2992 | 4d2a7d170ad1fb0f5dd70caae6eb5cae9497bd6b | diff --git a/bigtable/google/cloud/bigtable/row_data.py b/bigtable/google/cloud/bigtable/row_data.py
--- a/bigtable/google/cloud/bigtable/row_data.py
+++ b/bigtable/google/cloud/bigtable/row_data.py
@@ -366,8 +366,6 @@ def _validate_chunk_row_in_progress(self, chunk):
"""Helper for :meth:`_validate_chunk`"""
assert self.state == self.ROW_IN_PROGRESS
self._validate_chunk_status(chunk)
- if not chunk.HasField('commit_row') and not chunk.reset_row:
- _raise_if(not chunk.timestamp_micros or not chunk.value)
_raise_if(chunk.row_key and
chunk.row_key != self._row.row_key)
_raise_if(chunk.HasField('family_name') and
| [bigtable] InvalidChunk
1. OS type and version
Ubuntu 14.04
2. Python version and virtual environment information `python --version`
Python 2.7.6
3. google-cloud-python version `pip show google-cloud`, `pip show google-<service>` or `pip freeze`
Name: google-cloud-bigtable
Version: 0.22.0
4. Stacktrace if available
```
Traceback (most recent call last):
File "bt.py", line 13, in <module>
row = bt_table.read_row('mykey')
File "/home/jungho.ahn/tmp/test/local/lib/python2.7/site-packages/google/cloud/bigtable/table.py", line 234, in read_row
rows_data.consume_all()
File "/home/jungho.ahn/tmp/test/local/lib/python2.7/site-packages/google/cloud/bigtable/row_data.py", line 323, in consume_all
self.consume_next()
File "/home/jungho.ahn/tmp/test/local/lib/python2.7/site-packages/google/cloud/bigtable/row_data.py", line 275, in consume_next
self._validate_chunk(chunk)
File "/home/jungho.ahn/tmp/test/local/lib/python2.7/site-packages/google/cloud/bigtable/row_data.py", line 390, in _validate_chunk
self._validate_chunk_row_in_progress(chunk)
File "/home/jungho.ahn/tmp/test/local/lib/python2.7/site-packages/google/cloud/bigtable/row_data.py", line 370, in _validate_chunk_row_in_progress
_raise_if(not chunk.timestamp_micros or not chunk.value)
File "/home/jungho.ahn/tmp/test/local/lib/python2.7/site-packages/google/cloud/bigtable/row_data.py", line 441, in _raise_if
raise InvalidChunk(*args)
google.cloud.bigtable.row_data.InvalidChunk
```
5. Steps to reproduce
See the code example
6. Code example
```
from google.cloud import bigtable
bt_client = bigtable.client.Client(project='myproject', read_only=True)
bt_instance = bt_client.instance(instance_id='myinstance')
bt_table = bt_instance.table(table_id='mytable')
row = bt_table.read_row('mykey')
print row.cells
```
This exception is raised all the time. Did I use incorrectly?
Thanks,
| It's tough to say without actually seeing the failed chunk. Can you run this with a debugger? It may be easiest to just dump it in a script and run `python -m pdb script_name.py` or to run it in IPython and use the `%debug` magic after it fails.
```
(Pdb) print chunk
timestamp_micros: 1485886500000000
(Pdb) print not chunk.value
True
```
Actually the value is an empty string.
Gotcha, ISTM the chunk is equivalent to:
```python
>>> from google.cloud.bigtable._generated import bigtable_pb2
>>> Chunk = bigtable_pb2.ReadRowsResponse.CellChunk
>>> chunk = Chunk(timestamp_micros=1485886500000000)
>>> chunk
timestamp_micros: 1485886500000000
>>>
```
@tseaver The culprit is exactly as described by @junghoahnsc. I don't quite recognize the validation since it was re-worked for `v2` while I was on "paternity leave". Is there a reason we don't allow empty cells on read?
**UPDATE**: I just verified that writing an empty cell works:
```python
>>> row = table.row(b'row-key')
>>> row.set_cell(u'col-fam-id', b'col-name', b'')
>>> row.commit()
```
but then reading this row doesn't fail either (because the chunk type must be different):
```python
>>> partial_result = table.read_row(b'row-key')
>>> cells = partial_result[u'col-fam-id'][b'col-name']
>>> cells[0].value
''
>>> cells[0].timestamp
datetime.datetime(2017, 2, 2, 22, 5, 4, 333000, tzinfo=<UTC>)
```
@junghoahnsc The state of the chunks is also relevant here, so I need a little more info to reproduce.
[1]: https://github.com/GoogleCloudPlatform/google-cloud-python/blob/7e1632598365af1c03709bb5f5ff29f4d8a98847/bigtable/google/cloud/bigtable/row_data.py#L370
Too much water under the bridge since June :). I think we need to ask @garye, @mbrukman, or @sduskis. The [context](https://github.com/GoogleCloudPlatform/google-cloud-python/blob/7e1632598365af1c03709bb5f5ff29f4d8a98847/bigtable/google/cloud/bigtable/row_data.py#L365-L377) is:
- The state machine is in the "row in progress" state (no current partial cell)
- It is validating the chunk which would trigger a new cell (is doesn't reset or commit the row).
- It is asserting that there must be both a timestamp and a value set on that chunk.
Hi, I am trying to run this step by step:
The `PartialRowData` object's state is "row in progress".
The chunk it is currently processing has only "qualifier" and "timestamp_micros"; "reset_row" is False, there is no "commit_row" field, and "value" is empty.
The error is triggered in _validate_chunk_row_in_progress method:
if not chunk.HasField('commit_row') and not chunk.reset_row:
_raise_if(not chunk.timestamp_micros or not chunk.value)
Yeah I think the chunk and timestamp check there is invalid.
Is there any workaround until it is fixed?
@dhermes do you need any more information from our side? Actually our bigtable was written by Go/Java apps and now we're trying to read it in Python.
Is there any way you can validate the same read using Go or Java? Just to gain confidence that the fix I suggested will work... And/or can you just remove those checks from your local copy and verify?
A timestamp of 0 is valid. I think that there needs to be a chunk.value.
Timestamp of 0 is fine and empty value for chunk.value is fine, but I think "not chunk.value" is true if it's an empty string which is wrong.
empty values are not fine in these cases. An empty cell isn't a thing in BT. If there's no reset, there needs to be a value, IIRC.
@junghoahnsc @yuanbaisc Can you consume the response using the [low-level API][1]?
```python
>>> # NOT: partial_result = table.read_row(b'row-key')
>>> from google.cloud.bigtable.table import _create_row_request
>>> request_pb = _create_row_request(table.name, row_key=b'row-key')
>>> response_iterator = client._data_stub.ReadRows(request_pb)
>>> all_responses = list(response_iterator)
```
**UPDATE**: If there is private data returned, you can redact it or feel free to email me (email in my GitHub profile)
[1]: https://github.com/GoogleCloudPlatform/google-cloud-python/blob/7e1632598365af1c03709bb5f5ff29f4d8a98847/bigtable/google/cloud/bigtable/table.py#L229-L230
@garye there is no issue in reading in Go/Java.
@dhermes Here is the output of `all_responses`. I trimmed it since it's private data. Plz let me know if you need full output.
```
[chunks {
row_key: "166657957cd9d730abe8f5f73bcfecda"
family_name {
value: "cluster"
}
qualifier {
value: "concept"
}
timestamp_micros: 1485887340000000
}
chunks {
timestamp_micros: 1485886500000000
}
...
chunks {
qualifier {
value: "debug"
}
timestamp_micros: 1485887340000000
value: "\022\332\222\001x..."
}
chunks {
timestamp_micros: 1485886500000000
value: "\022\242\222\001x\234..."
}
....
chunks {
timestamp_micros: 1485879798000000
value: "\n2\n\006OTR517\022(\n\022\tj\274t\223..."
}
chunks {
timestamp_micros: 1485877620000000
value: "\n2\n\006OTR517\022(\n..."
}
chunks {
qualifier {
value: "metastory"
}
timestamp_micros: 1485887340000000
}
chunks {
timestamp_micros: 1485886500000000
}
...
chunks {
qualifier {
value: "snapgraph"
}
timestamp_micros: 1485887340000000
}
chunks {
timestamp_micros: 1485886500000000
}
...
chunks {
timestamp_micros: 1485877620000000
commit_row: true
}
]
```
Regarding my earlier comment, @garye and I spoke offline. While there's no such thing as a Bigtable cell with an empty value, someone can set a filter that strips the value; that's useful in cases where all you want to do is check for existence of a row.
The check for value should be removed as well as @garye suggested. I authored an implementation guide which may be incorrect. Sorry about that.
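The false positive boils down to Python truthiness: an empty bytes value is falsy, so `not chunk.value` cannot tell "value stripped by a filter" apart from "value missing". A toy illustration (the `None` case is only a stand-in for absence — real protobuf bytes fields default to `b''` and signal presence differently):

```python
stripped_value = b''  # e.g. a value-stripping filter was applied on read

# Truthiness conflates "empty" and "absent": both are falsy.
empty_looks_missing = not stripped_value

# An explicit presence check keeps the two cases apart:
def value_is_present(value):
    return value is not None

present = value_is_present(stripped_value)  # empty, but present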
IIUC, it looks like the current validation check is not correct. Is it easy to remove that?
Or should we use a low-level api until it is fixed? | 2017-02-09T15:22:06Z | [] | [] |
Traceback (most recent call last):
File "bt.py", line 13, in <module>
row = bt_table.read_row('mykey')
File "/home/jungho.ahn/tmp/test/local/lib/python2.7/site-packages/google/cloud/bigtable/table.py", line 234, in read_row
rows_data.consume_all()
File "/home/jungho.ahn/tmp/test/local/lib/python2.7/site-packages/google/cloud/bigtable/row_data.py", line 323, in consume_all
self.consume_next()
File "/home/jungho.ahn/tmp/test/local/lib/python2.7/site-packages/google/cloud/bigtable/row_data.py", line 275, in consume_next
self._validate_chunk(chunk)
File "/home/jungho.ahn/tmp/test/local/lib/python2.7/site-packages/google/cloud/bigtable/row_data.py", line 390, in _validate_chunk
self._validate_chunk_row_in_progress(chunk)
File "/home/jungho.ahn/tmp/test/local/lib/python2.7/site-packages/google/cloud/bigtable/row_data.py", line 370, in _validate_chunk_row_in_progress
_raise_if(not chunk.timestamp_micros or not chunk.value)
File "/home/jungho.ahn/tmp/test/local/lib/python2.7/site-packages/google/cloud/bigtable/row_data.py", line 441, in _raise_if
raise InvalidChunk(*args)
google.cloud.bigtable.row_data.InvalidChunk
| 5,936 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-3079 | de5d8e1b39f26f9638e990be2a4255b45914d466 | diff --git a/logging/google/cloud/logging/_gax.py b/logging/google/cloud/logging/_gax.py
--- a/logging/google/cloud/logging/_gax.py
+++ b/logging/google/cloud/logging/_gax.py
@@ -26,9 +26,9 @@
from google.gax import INITIAL_PAGE
from google.gax.errors import GaxError
from google.gax.grpc import exc_to_code
-from google.cloud.grpc.logging.v2.logging_config_pb2 import LogSink
-from google.cloud.grpc.logging.v2.logging_metrics_pb2 import LogMetric
-from google.cloud.grpc.logging.v2.log_entry_pb2 import LogEntry
+from google.cloud.proto.logging.v2.logging_config_pb2 import LogSink
+from google.cloud.proto.logging.v2.logging_metrics_pb2 import LogMetric
+from google.cloud.proto.logging.v2.log_entry_pb2 import LogEntry
from google.protobuf.json_format import MessageToDict
from google.protobuf.json_format import ParseDict
from grpc import StatusCode
diff --git a/logging/setup.py b/logging/setup.py
--- a/logging/setup.py
+++ b/logging/setup.py
@@ -52,12 +52,12 @@
REQUIREMENTS = [
'google-cloud-core >= 0.23.1, < 0.24dev',
'grpcio >= 1.0.2, < 2.0dev',
- 'gapic-google-cloud-logging-v2 >= 0.90.1, < 0.91dev',
+ 'gapic-google-cloud-logging-v2 >= 0.91.0, < 0.92dev',
]
setup(
name='google-cloud-logging',
- version='0.23.0',
+ version='0.23.1',
description='Python Client for Stackdriver Logging',
long_description=README,
namespace_packages=[
| Logging system tests broken
https://travis-ci.org/GoogleCloudPlatform/google-cloud-python/builds/205851892#L2768
```python
Traceback (most recent call last):
File "/home/travis/build/GoogleCloudPlatform/google-cloud-python/system_tests/logging_.py", line 440, in test_update_sink
self.assertFalse(sink.exists())
File "/home/travis/build/GoogleCloudPlatform/google-cloud-python/.tox/system-tests/lib/python2.7/site-packages/google/cloud/logging/sink.py", line 138, in exists
client.sinks_api.sink_get(self.project, self.name)
File "/home/travis/build/GoogleCloudPlatform/google-cloud-python/.tox/system-tests/lib/python2.7/site-packages/google/cloud/logging/client.py", line 134, in sinks_api
self._sinks_api = make_gax_sinks_api(self)
File "/home/travis/build/GoogleCloudPlatform/google-cloud-python/.tox/system-tests/lib/python2.7/site-packages/google/cloud/logging/_gax.py", line 573, in make_gax_sinks_api
channel=channel, lib_name='gccl', lib_version=__version__)
TypeError: __init__() got an unexpected keyword argument 'lib_name'
```
Preliminarily assigning to Luke.
| Yep, that belongs to me. Thanks. | 2017-02-27T22:03:41Z | [] | [] |
Traceback (most recent call last):
File "/home/travis/build/GoogleCloudPlatform/google-cloud-python/system_tests/logging_.py", line 440, in test_update_sink
self.assertFalse(sink.exists())
File "/home/travis/build/GoogleCloudPlatform/google-cloud-python/.tox/system-tests/lib/python2.7/site-packages/google/cloud/logging/sink.py", line 138, in exists
client.sinks_api.sink_get(self.project, self.name)
File "/home/travis/build/GoogleCloudPlatform/google-cloud-python/.tox/system-tests/lib/python2.7/site-packages/google/cloud/logging/client.py", line 134, in sinks_api
self._sinks_api = make_gax_sinks_api(self)
File "/home/travis/build/GoogleCloudPlatform/google-cloud-python/.tox/system-tests/lib/python2.7/site-packages/google/cloud/logging/_gax.py", line 573, in make_gax_sinks_api
channel=channel, lib_name='gccl', lib_version=__version__)
TypeError: __init__() got an unexpected keyword argument 'lib_name'
| 5,948 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-3159 | 69e2b5366f03907d89f84e8046b9cc71a9034fd3 | diff --git a/error_reporting/google/cloud/error_reporting/_gax.py b/error_reporting/google/cloud/error_reporting/_gax.py
--- a/error_reporting/google/cloud/error_reporting/_gax.py
+++ b/error_reporting/google/cloud/error_reporting/_gax.py
@@ -41,7 +41,7 @@ def make_report_error_api(client):
report_errors_service_client.ReportErrorsServiceClient.SERVICE_ADDRESS)
gax_client = report_errors_service_client.ReportErrorsServiceClient(
channel=channel, lib_name='gccl', lib_version=__version__)
- return _ErrorReportingGaxApi(gax_client, client.project)
+ return _ErrorReportingGaxApi(gax_client, client._project)
class _ErrorReportingGaxApi(object):
Error reporting accesses `client.project` instead of `client._project`.
[Originally](https://github.com/GoogleCloudPlatform/google-cloud-python/issues/2687#issuecomment-286703711) from @DizzeePascall.
When running the examples from the documentation, I get the following error:
```
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/laurence/miniconda3/envs/sonalytic/lib/python3.5/site-packages/google/cloud/error_reporting/client.py", line 320, in report
report_location=report_location)
File "/home/laurence/miniconda3/envs/sonalytic/lib/python3.5/site-packages/google/cloud/error_reporting/client.py", line 277, in _send_error_report
self.report_errors_api.report_error_event(error_report)
File "/home/laurence/miniconda3/envs/sonalytic/lib/python3.5/site-packages/google/cloud/error_reporting/client.py", line 162, in report_errors_api
self._report_errors_api = make_report_error_api(self)
File "/home/laurence/miniconda3/envs/sonalytic/lib/python3.5/site-packages/google/cloud/error_reporting/_gax.py", line 44, in make_report_error_api
return _ErrorReportingGaxApi(gax_client, client.project)
AttributeError: 'Client' object has no attribute 'project'
```
Disabling GRPC via the `GOOGLE_CLOUD_DISABLE_GRPC` environment variable fixes this.
I'm running on an Ubuntu 14.04 compute instance with python 3.5.3 and version 023.1 of the google-cloud-error-reporting project
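The failure mode reduces to reading a public attribute that the class never defines — only the private `_project` exists. A minimal stand-in (hypothetical class, not the real client):

```python
class FakeErrorReportingClient:
    """Toy stand-in: stores the project privately, exposes no public property."""

    def __init__(self, project):
        self._project = project

client = FakeErrorReportingClient('my-project')

try:
    client.project  # what the GAX helper tried to read
except AttributeError:
    failed = True
else:
    failed = False
```

The fix in the patch above switches the helper to the attribute that actually exists (`client._project`).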
| 2017-03-16T19:11:10Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/laurence/miniconda3/envs/sonalytic/lib/python3.5/site-packages/google/cloud/error_reporting/client.py", line 320, in report
report_location=report_location)
File "/home/laurence/miniconda3/envs/sonalytic/lib/python3.5/site-packages/google/cloud/error_reporting/client.py", line 277, in _send_error_report
self.report_errors_api.report_error_event(error_report)
File "/home/laurence/miniconda3/envs/sonalytic/lib/python3.5/site-packages/google/cloud/error_reporting/client.py", line 162, in report_errors_api
self._report_errors_api = make_report_error_api(self)
File "/home/laurence/miniconda3/envs/sonalytic/lib/python3.5/site-packages/google/cloud/error_reporting/_gax.py", line 44, in make_report_error_api
return _ErrorReportingGaxApi(gax_client, client.project)
AttributeError: 'Client' object has no attribute 'project'
| 5,954 |
||||
googleapis/google-cloud-python | googleapis__google-cloud-python-3160 | 45c6a0a64f079d0c92f8f123525e3fe717970d03 | diff --git a/storage/google/cloud/storage/_helpers.py b/storage/google/cloud/storage/_helpers.py
--- a/storage/google/cloud/storage/_helpers.py
+++ b/storage/google/cloud/storage/_helpers.py
@@ -21,6 +21,25 @@
from hashlib import md5
+def _validate_name(name):
+ """Pre-flight ``Bucket`` name validation.
+
+ :type name: str or :data:`NoneType`
+ :param name: Proposed bucket name.
+
+ :rtype: str or :data:`NoneType`
+ :returns: ``name`` if valid.
+ """
+ if name is None:
+ return
+
+ # The first and las characters must be alphanumeric.
+ if not all([name[0].isalnum(), name[-1].isalnum()]):
+ raise ValueError(
+ 'Bucket names must start and end with a number or letter.')
+ return name
+
+
class _PropertyMixin(object):
"""Abstract mixin for cloud storage classes with associated propertties.
@@ -29,11 +48,12 @@ class _PropertyMixin(object):
- path
:type name: str
- :param name: The name of the object.
+ :param name: The name of the object. Bucket names must start and end with a
+ number or letter.
"""
def __init__(self, name=None):
- self.name = name
+ self.name = _validate_name(name)
self._properties = {}
self._changes = set()
diff --git a/storage/google/cloud/storage/bucket.py b/storage/google/cloud/storage/bucket.py
--- a/storage/google/cloud/storage/bucket.py
+++ b/storage/google/cloud/storage/bucket.py
@@ -81,7 +81,8 @@ class Bucket(_PropertyMixin):
for the bucket (which requires a project).
:type name: str
- :param name: The name of the bucket.
+ :param name: The name of the bucket. Bucket names must start and end with a
+ number or letter.
"""
_MAX_OBJECTS_FOR_ITERATION = 256
| Raise warning/exception when user tries to `get_bucket(x)` if x includes forward slash
Hi there.
I made a silly mistake when trying to list blobs in a bucket, and have a proposal that could prevent users from repeating said mistake.
When initializing a `Bucket` object, I did so with a forward slash ("/") in the bucket name string. This produced a supposedly valid bucket, as seen here (actual bucket name changed for sake of demonstration):
```pycon
>>> from gcloud import storage
>>> c = storage.Client()
>>> b0 = c.get_bucket('valid-bucket')
>>> b0.exists()
True
>>> b1 = c.get_bucket('valid-bucket/') # <-- A silly mistake
>>> b1.exists()
True
```
While both methods of initialization produce seemingly valid `Bucket` objects, the one initialized with a forward slash breaks further on, e.g.:
```pycon
>>> len([x for x in b0.list_blobs()])
8112
>>> len([x for x in b1.list_blobs()])
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 1, in <listcomp>
File "/usr/local/lib/python3.5/dist-packages/gcloud/iterator.py", line 79, in __iter__
response = self.get_next_page_response()
File "/usr/local/lib/python3.5/dist-packages/gcloud/iterator.py", line 115, in get_next_page_response
method='GET', path=self.path, query_params=self.get_query_params())
File "/usr/local/lib/python3.5/dist-packages/gcloud/connection.py", line 347, in api_request
error_info=method + ' ' + url)
gcloud.exceptions.NotFound: 404 Not Found (GET https://www.googleapis.com/storage/v1/b/valid-bucket//o?projection=noAcl)
```
`gcloud` adds a second forward slash when constructing URLs internally, and this obviously breaks. I'd suggest raising a warning or exception when users try to initialize a bucket with a forward slash, or alternatively fixing it behind-the-scenes (simply dropping the slash).
Cheers,
Eilam
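A standalone sketch of the pre-flight check the patch above introduces (the function name here is hypothetical; the shipped helper is `_validate_name` in `storage/_helpers.py`):

```python
def validate_bucket_name(name):
    """Reject bucket names that don't start and end with a letter or digit."""
    if name is None:
        return None
    if not (name[0].isalnum() and name[-1].isalnum()):
        raise ValueError(
            'Bucket names must start and end with a number or letter.')
    return name
```

With this check, `validate_bucket_name('valid-bucket')` passes, while the trailing-slash mistake fails fast with a clear error instead of producing a broken URL later.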
| Thanks for reporting!
>Bucket names must start and end with a number or letter
https://cloud.google.com/storage/docs/naming#requirements
I think it could be good for us to raise an exception in `Bucket.__init__` if the name isn't valid. | 2017-03-16T20:04:09Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 1, in <listcomp>
File "/usr/local/lib/python3.5/dist-packages/gcloud/iterator.py", line 79, in __iter__
response = self.get_next_page_response()
File "/usr/local/lib/python3.5/dist-packages/gcloud/iterator.py", line 115, in get_next_page_response
method='GET', path=self.path, query_params=self.get_query_params())
File "/usr/local/lib/python3.5/dist-packages/gcloud/connection.py", line 347, in api_request
error_info=method + ' ' + url)
gcloud.exceptions.NotFound: 404 Not Found (GET https://www.googleapis.com/storage/v1/b/valid-bucket//o?projection=noAcl)
| 5,955 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-3168 | 1923d6e40284dddb9bb352aceb6c494a6c9b70fa | diff --git a/error_reporting/google/cloud/error_reporting/client.py b/error_reporting/google/cloud/error_reporting/client.py
--- a/error_reporting/google/cloud/error_reporting/client.py
+++ b/error_reporting/google/cloud/error_reporting/client.py
@@ -23,7 +23,7 @@
except ImportError: # pragma: NO COVER
_HAVE_GAX = False
-from google.cloud._helpers import _determine_default_project
+from google.cloud.client import ClientWithProject
from google.cloud.error_reporting._logging import _ErrorReportingLoggingAPI
from google.cloud.environment_vars import DISABLE_GRPC
@@ -74,7 +74,7 @@ def __init__(self, method=None, url=None,
self.remoteIp = remote_ip
-class Client(object):
+class Client(ClientWithProject):
"""Error Reporting client. Currently Error Reporting is done by creating
a Logging client.
@@ -125,13 +125,8 @@ def __init__(self, project=None,
service=None,
version=None,
use_gax=None):
- if project is None:
- self._project = _determine_default_project()
- else:
- self._project = project
- self._credentials = credentials
- self._http = http
-
+ super(Client, self).__init__(project=project, credentials=credentials,
+ http=http)
self._report_errors_api = None
self.service = service if service else self.DEFAULT_SERVICE
@@ -162,7 +157,7 @@ def report_errors_api(self):
self._report_errors_api = make_report_error_api(self)
else:
self._report_errors_api = _ErrorReportingLoggingAPI(
- self._project, self._credentials, self._http)
+ self.project, self._credentials, self._http)
return self._report_errors_api
def _build_error_report(self,
| google.cloud.error_reporting AttributeError: 'Client' object has no attribute 'project'
I am trying to follow the sample code [here](https://googlecloudplatform.github.io/google-cloud-python/stable/error-reporting-usage.html) in an appengine custom flexible runtime and whenever I run the code, I get the following error:
```
Traceback (most recent call last):
File "/env/lib/python3.5/site-packages/flask/app.py", line 1994, in __call__
return self.wsgi_app(environ, start_response)
File "/env/lib/python3.5/site-packages/flask/app.py", line 1985, in wsgi_app
response = self.handle_exception(e)
File "/env/lib/python3.5/site-packages/flask/app.py", line 1540, in handle_exception
reraise(exc_type, exc_value, tb)
File "/env/lib/python3.5/site-packages/flask/_compat.py", line 33, in reraise
raise value
File "/env/lib/python3.5/site-packages/flask/app.py", line 1982, in wsgi_app
response = self.full_dispatch_request()
File "/env/lib/python3.5/site-packages/flask/app.py", line 1614, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/env/lib/python3.5/site-packages/flask/app.py", line 1517, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/env/lib/python3.5/site-packages/flask/_compat.py", line 33, in reraise
raise value
File "/env/lib/python3.5/site-packages/flask/app.py", line 1612, in full_dispatch_request
rv = self.dispatch_request()
File "/env/lib/python3.5/site-packages/flask/app.py", line 1598, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/home/vmagent/app/xxx.py", line 20, in simulate_error
ec.report_exception()
File "/env/lib/python3.5/site-packages/google/cloud/error_reporting/client.py", line 348, in report_exception
user=user)
File "/env/lib/python3.5/site-packages/google/cloud/error_reporting/client.py", line 277, in _send_error_report
self.report_errors_api.report_error_event(error_report)
File "/env/lib/python3.5/site-packages/google/cloud/error_reporting/client.py", line 162, in report_errors_api
self._report_errors_api = make_report_error_api(self)
File "/env/lib/python3.5/site-packages/google/cloud/error_reporting/_gax.py", line 44, in make_report_error_api
return _ErrorReportingGaxApi(gax_client, client.project)
AttributeError: 'Client' object has no attribute 'project'
```
In the same project I'm also creating clients for pubsub and storage with the same default constructor and they both find the project and credentials, both when running in appengine as well as when I run the container locally and specify credentials by setting GOOGLE_APPLICATION_CREDENTIALS to point to a json file containing credentials for a service account.
I have also tried passing the project explicitly using something like:
```
ec = error_reporting.Client(project='xyz')
```
and
```
ec = error_reporting.Client('xyz')
```
to no avail.
Here is the sample code that fails:
```
from flask import Flask
from google.cloud import error_reporting
app = Flask(__name__)
@app.route('/simulate_error')
def simulate_error():
ec = error_reporting.Client()
try:
# simulate calling a method that's not defined
raise NameError
except Exception:
ec.report_exception()
if __name__ == '__main__':
app.run(host='0.0.0.0', debug=True)
```
Here is the output from `pip freeze | grep ^google-cloud`:
```
google-cloud==0.23.0
google-cloud-bigquery==0.23.0
google-cloud-bigtable==0.23.1
google-cloud-core==0.23.1
google-cloud-datastore==0.23.0
google-cloud-dns==0.23.0
google-cloud-error-reporting==0.23.1
google-cloud-language==0.23.1
google-cloud-logging==0.23.1
google-cloud-monitoring==0.23.0
google-cloud-pubsub==0.23.0
google-cloud-resource-manager==0.23.0
google-cloud-runtimeconfig==0.23.0
google-cloud-spanner==0.23.1
google-cloud-speech==0.23.0
google-cloud-storage==0.23.1
google-cloud-translate==0.23.0
google-cloud-vision==0.23.3
```
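The eventual fix (the patch above) makes the client inherit from the shared base class so that ``project`` becomes a public attribute. A simplified sketch of that shape — these are stand-in classes, not the real ``google.cloud`` code:

```python
class ClientWithProject(object):
    """Stand-in for google.cloud.client.ClientWithProject."""

    def __init__(self, project=None, credentials=None, http=None):
        # The real base class infers a default project when none is given.
        self.project = project if project is not None else 'default-project'
        self._credentials = credentials
        self._http = http


class Client(ClientWithProject):
    """After the fix: ``client.project`` exists as a public attribute,
    so the report-errors API can read it instead of ``_project``."""

    DEFAULT_SERVICE = 'python'

    def __init__(self, project=None, credentials=None, http=None,
                 service=None):
        super(Client, self).__init__(
            project=project, credentials=credentials, http=http)
        self.service = service if service else self.DEFAULT_SERVICE
```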
| Forgot to mention: I get a similar error when I try to use the google.cloud.logging client as well.
Thanks for reporting, @cva. :-) Looking into this now. | 2017-03-20T15:17:42Z | [] | [] |
Traceback (most recent call last):
File "/env/lib/python3.5/site-packages/flask/app.py", line 1994, in __call__
return self.wsgi_app(environ, start_response)
File "/env/lib/python3.5/site-packages/flask/app.py", line 1985, in wsgi_app
response = self.handle_exception(e)
File "/env/lib/python3.5/site-packages/flask/app.py", line 1540, in handle_exception
reraise(exc_type, exc_value, tb)
File "/env/lib/python3.5/site-packages/flask/_compat.py", line 33, in reraise
raise value
File "/env/lib/python3.5/site-packages/flask/app.py", line 1982, in wsgi_app
response = self.full_dispatch_request()
File "/env/lib/python3.5/site-packages/flask/app.py", line 1614, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/env/lib/python3.5/site-packages/flask/app.py", line 1517, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/env/lib/python3.5/site-packages/flask/_compat.py", line 33, in reraise
raise value
File "/env/lib/python3.5/site-packages/flask/app.py", line 1612, in full_dispatch_request
rv = self.dispatch_request()
File "/env/lib/python3.5/site-packages/flask/app.py", line 1598, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/home/vmagent/app/xxx.py", line 20, in simulate_error
ec.report_exception()
File "/env/lib/python3.5/site-packages/google/cloud/error_reporting/client.py", line 348, in report_exception
user=user)
File "/env/lib/python3.5/site-packages/google/cloud/error_reporting/client.py", line 277, in _send_error_report
self.report_errors_api.report_error_event(error_report)
File "/env/lib/python3.5/site-packages/google/cloud/error_reporting/client.py", line 162, in report_errors_api
self._report_errors_api = make_report_error_api(self)
File "/env/lib/python3.5/site-packages/google/cloud/error_reporting/_gax.py", line 44, in make_report_error_api
return _ErrorReportingGaxApi(gax_client, client.project)
AttributeError: 'Client' object has no attribute 'project'
| 5,957 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-3177 | 1923d6e40284dddb9bb352aceb6c494a6c9b70fa | diff --git a/storage/google/cloud/storage/blob.py b/storage/google/cloud/storage/blob.py
--- a/storage/google/cloud/storage/blob.py
+++ b/storage/google/cloud/storage/blob.py
@@ -868,11 +868,14 @@ def rewrite(self, source, token=None, client=None):
method='POST', path=source.path + '/rewriteTo' + self.path,
query_params=query_params, data=self._properties, headers=headers,
_target_object=self)
- self._set_properties(api_response['resource'])
rewritten = int(api_response['totalBytesRewritten'])
size = int(api_response['objectSize'])
+ # The resource key is set if and only if the API response is
+ # completely done. Additionally, there is no rewrite token to return
+ # in this case.
if api_response['done']:
+ self._set_properties(api_response['resource'])
return None, rewritten, size
return api_response['rewriteToken'], rewritten, size
| blob.py is unable to handle a partial rewrite and fails with KeyError on lack of resource
https://github.com/GoogleCloudPlatform/google-cloud-python/blob/master/storage/google/cloud/storage/blob.py#L871
```
871: self._set_properties(api_response['resource'])
```
This relies on the API to always return 'resource'. However, it is optional as per the doc.
https://cloud.google.com/storage/docs/json_api/v1/objects/rewrite
"""
This property is present in the response only when copying completes.
"""
As such, when the rewrite is partial, `done: false` is returned and `resource` is not present, and the call fails with the following KeyError.
```
Traceback (most recent call last):
File "xxx.py", line 304, in test_011_CopyFileFromBucketToBucketNewName
new_destination_file_name=new_file_name)
File "xxx.py, line 276, in copyFileFromBucketToBucket
token, rewritten, total = destination_blob.rewrite(source_blob, token=token)
File "/home/jenkins/venv/lib/python2.7/site-packages/google/cloud/storage/blob.py", line 681, in rewrite
self._set_properties(api_response['resource'])
KeyError: 'resource'
```
The easiest fix would be to move `_set_properties` down below the `done: false` check:
```
E.g.;
api_response = client._connection.api_request(
method='POST', path=source.path + '/rewriteTo' + self.path,
query_params=query_params, data=self._properties, headers=headers,
_target_object=self)
# self._set_properties(api_response['resource']) ## Comment out this line
rewritten = int(api_response['totalBytesRewritten'])
size = int(api_response['objectSize'])
if api_response['done']:
return None, rewritten, size
self._set_properties(api_response['resource']) # and move here
return api_response['rewriteToken'], rewritten, size
```
| I don't think your solution will work. I came across the same issue, but when it is *not* done, it falls through. You probably want to move it inside the if `api_response['done']`.
Ah, you are totally right. Scrap my suggestion.
I wasn't sure how `_set_properties` is used in the API, and I misread the `done` check, of course.
Thanks for reporting @80nashi!
If I understand correctly from the docs, the `resource` is only available when the copy is complete?
>A resource containing the metadata for the copied-to object. This property is present in the response only when copying completes.
@tseaver, @dhermes maybe we should be raising an exception here?
I'm not sure from the docs, but is there polling that should be available for rewriting so you can check if the copy is done?
As I understand the method and the API, the method is expected to return the rewrite token without raising an exception, in case of partial write (`done: false` and `resource` non-existent).
Then the user program can call the method again with the non-null, non empty string rewrite token, repeat it until `done: true` and `resource` present.
So yeah, the following can be a possible fix as verba suggested.
However, I'm not sure what _set_properties is expected to do in case of `done: false`.
```
# self._set_properties(api_response['resource']) ## Comment out this line
rewritten = int(api_response['totalBytesRewritten'])
size = int(api_response['objectSize'])
if api_response['done']:
self._set_properties(api_response['resource']) # and move here
return None, rewritten, size
return api_response['rewriteToken'], rewritten, size
``` | 2017-03-21T07:29:57Z | [] | [] |
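Driving that token loop from the caller's side then looks like this — a self-contained sketch with a fake blob standing in for ``google.cloud.storage.Blob``:

```python
class FakeBlob(object):
    """Stand-in blob simulating the chunked rewrite API: each call
    copies one chunk and returns a token until the copy completes."""

    def __init__(self, size, chunk_size):
        self.size = size
        self.chunk_size = chunk_size

    def rewrite(self, source, token=None):
        rewritten = 0 if token is None else int(token)
        rewritten = min(rewritten + self.chunk_size, self.size)
        if rewritten >= self.size:
            return None, rewritten, self.size  # done: no token returned
        return str(rewritten), rewritten, self.size


def copy_blob(destination, source):
    """Repeat ``rewrite`` with the returned token until it comes back None."""
    token, rewritten, total = destination.rewrite(source)
    while token is not None:
        token, rewritten, total = destination.rewrite(source, token=token)
    return rewritten, total
```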
Traceback (most recent call last):
File "xxx.py", line 304, in test_011_CopyFileFromBucketToBucketNewName
new_destination_file_name=new_file_name)
| 5,959 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-3195 | b035bf78d726ddccc821e8884e6a4eda48e93e56 | diff --git a/bigquery/google/cloud/bigquery/_helpers.py b/bigquery/google/cloud/bigquery/_helpers.py
--- a/bigquery/google/cloud/bigquery/_helpers.py
+++ b/bigquery/google/cloud/bigquery/_helpers.py
@@ -144,7 +144,7 @@ def _bool_to_json(value):
def _bytes_to_json(value):
"""Coerce 'value' to an JSON-compatible representation."""
if isinstance(value, bytes):
- value = base64.standard_b64encode(value)
+ value = base64.standard_b64encode(value).decode('ascii')
return value
| New BigQuery system tests fail on Python 3.
@tseaver One of the new BigQuery system tests consistently fails on Python 3.x:
Running `tox -e system-tests3 -- bigquery` on current master:
```
test_create_dataset (bigquery.TestBigQuery) ... ok
test_create_table (bigquery.TestBigQuery) ... ok
test_create_table_insert_fetch_nested_schema (bigquery.TestBigQuery) ... ok
test_dump_table_w_public_data (bigquery.TestBigQuery) ... ok
test_insert_data_then_dump_table (bigquery.TestBigQuery) ... _has_rows. Trying again in 1 seconds...
ok
test_insert_nested_nested (bigquery.TestBigQuery) ... ok
test_job_cancel (bigquery.TestBigQuery) ... ok
test_list_datasets (bigquery.TestBigQuery) ... ok
test_list_tables (bigquery.TestBigQuery) ... ok
test_load_table_from_local_file_then_dump_table (bigquery.TestBigQuery) ... _job_done. Trying again in 1 seconds...
_job_done. Trying again in 2 seconds...
_job_done. Trying again in 4 seconds...
ok
test_load_table_from_storage_then_dump_table (bigquery.TestBigQuery) ... _job_done. Trying again in 1 seconds...
_job_done. Trying again in 2 seconds...
ok
test_patch_dataset (bigquery.TestBigQuery) ... ok
test_patch_table (bigquery.TestBigQuery) ... ok
test_reload_dataset (bigquery.TestBigQuery) ... ok
test_sync_query_w_legacy_sql_types (bigquery.TestBigQuery) ... ok
test_sync_query_w_query_params (bigquery.TestBigQuery) ... ERROR
test_sync_query_w_standard_sql_types (bigquery.TestBigQuery) ... ok
test_update_dataset (bigquery.TestBigQuery) ... ok
test_update_table (bigquery.TestBigQuery) ... ok
======================================================================
ERROR: test_sync_query_w_query_params (bigquery.TestBigQuery)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/local/google/home/lukesneeringer/Code/Google/google-cloud-python/system_tests/bigquery.py", line 770, in test_sync_query_w_query_params
query.run()
File "/usr/local/google/home/lukesneeringer/Code/Google/google-cloud-python/.tox/system-tests3/lib/python3.5/site-packages/google/cloud/bigquery/query.py", line 364, in run
method='POST', path=path, data=self._build_resource())
File "/usr/local/google/home/lukesneeringer/Code/Google/google-cloud-python/.tox/system-tests3/lib/python3.5/site-packages/google/cloud/_http.py", line 294, in api_request
data = json.dumps(data)
File "/usr/local/lib/python3.5/json/__init__.py", line 230, in dumps
return _default_encoder.encode(obj)
File "/usr/local/lib/python3.5/json/encoder.py", line 198, in encode
chunks = self.iterencode(o, _one_shot=True)
File "/usr/local/lib/python3.5/json/encoder.py", line 256, in iterencode
return _iterencode(o, 0)
File "/usr/local/lib/python3.5/json/encoder.py", line 179, in default
raise TypeError(repr(o) + " is not JSON serializable")
TypeError: b'REVBREJFRUY=' is not JSON serializable
----------------------------------------------------------------------
Ran 19 tests in 88.302s
FAILED (errors=1)
ERROR: InvocationError: '/usr/local/google/home/lukesneeringer/Code/Google/google-cloud-python/.tox/system-tests3/bin/python /usr/local/google/home/lukesneeringer/Code/Google/google-cloud-python/system_tests/attempt_system_tests.py bigquery'
_______________________________________________ summary ________________________________________________
ERROR: system-tests3: commands failed
```
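The failure is a Python 3 quirk: ``base64.standard_b64encode`` returns ``bytes``, and ``bytes`` are not JSON-serializable. Since base64 output is always ASCII, decoding the result fixes it — this mirrors the one-line patch above:

```python
import base64
import json


def bytes_to_json(value):
    """Coerce bytes to a JSON-compatible base64 string (mirrors the
    fixed ``_bytes_to_json`` helper)."""
    if isinstance(value, bytes):
        value = base64.standard_b64encode(value).decode('ascii')
    return value


# b'DEADBEEF' encodes to the exact value from the traceback above.
encoded = bytes_to_json(b'DEADBEEF')
```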
| @lukesneeringer OK, should be a simple fix.
Great, thanks! I tried to fix it and was scratching my head this morning. But it was 7:00 AM, which might have been the issue. | 2017-03-23T18:27:58Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/google/home/lukesneeringer/Code/Google/google-cloud-python/system_tests/bigquery.py", line 770, in test_sync_query_w_query_params
query.run()
File "/usr/local/google/home/lukesneeringer/Code/Google/google-cloud-python/.tox/system-tests3/lib/python3.5/site-packages/google/cloud/bigquery/query.py", line 364, in run
method='POST', path=path, data=self._build_resource())
File "/usr/local/google/home/lukesneeringer/Code/Google/google-cloud-python/.tox/system-tests3/lib/python3.5/site-packages/google/cloud/_http.py", line 294, in api_request
data = json.dumps(data)
File "/usr/local/lib/python3.5/json/__init__.py", line 230, in dumps
return _default_encoder.encode(obj)
File "/usr/local/lib/python3.5/json/encoder.py", line 198, in encode
chunks = self.iterencode(o, _one_shot=True)
File "/usr/local/lib/python3.5/json/encoder.py", line 256, in iterencode
return _iterencode(o, 0)
File "/usr/local/lib/python3.5/json/encoder.py", line 179, in default
raise TypeError(repr(o) + " is not JSON serializable")
TypeError: b'REVBREJFRUY=' is not JSON serializable
| 5,962 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-3270 | 5a5d84beb8c5ee0ff46741dbb27778c074b4bf72 | diff --git a/logging/google/cloud/logging/_gax.py b/logging/google/cloud/logging/_gax.py
--- a/logging/google/cloud/logging/_gax.py
+++ b/logging/google/cloud/logging/_gax.py
@@ -243,6 +243,8 @@ def sink_get(self, project, sink_name):
if exc_to_code(exc.cause) == StatusCode.NOT_FOUND:
raise NotFound(path)
raise
+ # NOTE: LogSink message type does not have an ``Any`` field
+ # so `MessageToDict`` can safely be used.
return MessageToDict(sink_pb)
def sink_update(self, project, sink_name, filter_, destination):
@@ -270,11 +272,13 @@ def sink_update(self, project, sink_name, filter_, destination):
path = 'projects/%s/sinks/%s' % (project, sink_name)
sink_pb = LogSink(name=path, filter=filter_, destination=destination)
try:
- self._gax_api.update_sink(path, sink_pb, options=options)
+ sink_pb = self._gax_api.update_sink(path, sink_pb, options=options)
except GaxError as exc:
if exc_to_code(exc.cause) == StatusCode.NOT_FOUND:
raise NotFound(path)
raise
+ # NOTE: LogSink message type does not have an ``Any`` field
+ # so `MessageToDict`` can safely be used.
return MessageToDict(sink_pb)
def sink_delete(self, project, sink_name):
@@ -391,6 +395,8 @@ def metric_get(self, project, metric_name):
if exc_to_code(exc.cause) == StatusCode.NOT_FOUND:
raise NotFound(path)
raise
+ # NOTE: LogMetric message type does not have an ``Any`` field
+ # so `MessageToDict`` can safely be used.
return MessageToDict(metric_pb)
def metric_update(self, project, metric_name, filter_, description):
@@ -418,11 +424,14 @@ def metric_update(self, project, metric_name, filter_, description):
metric_pb = LogMetric(name=path, filter=filter_,
description=description)
try:
- self._gax_api.update_log_metric(path, metric_pb, options=options)
+ metric_pb = self._gax_api.update_log_metric(
+ path, metric_pb, options=options)
except GaxError as exc:
if exc_to_code(exc.cause) == StatusCode.NOT_FOUND:
raise NotFound(path)
raise
+ # NOTE: LogMetric message type does not have an ``Any`` field
+ # so `MessageToDict`` can safely be used.
return MessageToDict(metric_pb)
def metric_delete(self, project, metric_name):
@@ -444,6 +453,35 @@ def metric_delete(self, project, metric_name):
raise
+def _parse_log_entry(entry_pb):
+ """Special helper to parse ``LogEntry`` protobuf into a dictionary.
+
+ The ``proto_payload`` field in ``LogEntry`` is of type ``Any``. This
+ can be problematic if the type URL in the payload isn't in the
+ ``google.protobuf`` registry. To help with parsing unregistered types,
+ this function will remove ``proto_payload`` before parsing.
+
+ :type entry_pb: :class:`.log_entry_pb2.LogEntry`
+ :param entry_pb: Log entry protobuf.
+
+ :rtype: dict
+ :returns: The parsed log entry. The ``protoPayload`` key may contain
+ the raw ``Any`` protobuf from ``entry_pb.proto_payload`` if
+ it could not be parsed.
+ """
+ try:
+ return MessageToDict(entry_pb)
+ except TypeError:
+ if entry_pb.HasField('proto_payload'):
+ proto_payload = entry_pb.proto_payload
+ entry_pb.ClearField('proto_payload')
+ entry_mapping = MessageToDict(entry_pb)
+ entry_mapping['protoPayload'] = proto_payload
+ return entry_mapping
+ else:
+ raise
+
+
def _log_entry_mapping_to_pb(mapping):
"""Helper for :meth:`write_entries`, et aliae
@@ -451,6 +489,13 @@ def _log_entry_mapping_to_pb(mapping):
the keys expected in the JSON API.
"""
entry_pb = LogEntry()
+ # NOTE: We assume ``mapping`` was created in ``Batch.commit``
+ # or ``Logger._make_entry_resource``. In either case, if
+ # the ``protoPayload`` key is present, we assume that the
+ # type URL is registered with ``google.protobuf`` and will
+ # not cause any issues in the JSON->protobuf conversion
+ # of the corresponding ``proto_payload`` in the log entry
+ # (it is an ``Any`` field).
ParseDict(mapping, entry_pb)
return entry_pb
@@ -482,7 +527,7 @@ def _item_to_entry(iterator, entry_pb, loggers):
:rtype: :class:`~google.cloud.logging.entries._BaseEntry`
:returns: The next log entry in the page.
"""
- resource = MessageToDict(entry_pb)
+ resource = _parse_log_entry(entry_pb)
return entry_from_resource(resource, iterator.client, loggers)
@@ -499,6 +544,8 @@ def _item_to_sink(iterator, log_sink_pb):
:rtype: :class:`~google.cloud.logging.sink.Sink`
:returns: The next sink in the page.
"""
+ # NOTE: LogSink message type does not have an ``Any`` field
+ # so `MessageToDict`` can safely be used.
resource = MessageToDict(log_sink_pb)
return Sink.from_api_repr(resource, iterator.client)
@@ -516,6 +563,8 @@ def _item_to_metric(iterator, log_metric_pb):
:rtype: :class:`~google.cloud.logging.metric.Metric`
:returns: The next metric in the page.
"""
+ # NOTE: LogMetric message type does not have an ``Any`` field
+ # so `MessageToDict`` can safely be used.
resource = MessageToDict(log_metric_pb)
return Metric.from_api_repr(resource, iterator.client)
diff --git a/logging/google/cloud/logging/_http.py b/logging/google/cloud/logging/_http.py
--- a/logging/google/cloud/logging/_http.py
+++ b/logging/google/cloud/logging/_http.py
@@ -286,6 +286,9 @@ def sink_update(self, project, sink_name, filter_, destination):
:type destination: str
:param destination: destination URI for the entries exported by
the sink.
+
+ :rtype: dict
+ :returns: The returned (updated) resource.
"""
target = '/projects/%s/sinks/%s' % (project, sink_name)
data = {
@@ -293,7 +296,7 @@ def sink_update(self, project, sink_name, filter_, destination):
'filter': filter_,
'destination': destination,
}
- self.api_request(method='PUT', path=target, data=data)
+ return self.api_request(method='PUT', path=target, data=data)
def sink_delete(self, project, sink_name):
"""API call: delete a sink resource.
@@ -421,6 +424,9 @@ def metric_update(self, project, metric_name, filter_, description):
:type description: str
:param description: description of the metric.
+
+ :rtype: dict
+ :returns: The returned (updated) resource.
"""
target = '/projects/%s/metrics/%s' % (project, metric_name)
data = {
@@ -428,7 +434,7 @@ def metric_update(self, project, metric_name, filter_, description):
'filter': filter_,
'description': description,
}
- self.api_request(method='PUT', path=target, data=data)
+ return self.api_request(method='PUT', path=target, data=data)
def metric_delete(self, project, metric_name):
"""API call: delete a metric resource.
diff --git a/logging/google/cloud/logging/entries.py b/logging/google/cloud/logging/entries.py
--- a/logging/google/cloud/logging/entries.py
+++ b/logging/google/cloud/logging/entries.py
@@ -17,6 +17,7 @@
import json
import re
+from google.protobuf import any_pb2
from google.protobuf.json_format import Parse
from google.cloud._helpers import _name_from_project_path
@@ -47,7 +48,7 @@ def logger_name_from_path(path):
class _BaseEntry(object):
- """Base class for TextEntry, StructEntry.
+ """Base class for TextEntry, StructEntry, ProtobufEntry.
:type payload: text or dict
:param payload: The payload passed as ``textPayload``, ``jsonPayload``,
@@ -99,7 +100,7 @@ def from_api_repr(cls, resource, client, loggers=None):
(Optional) A mapping of logger fullnames -> loggers. If not
passed, the entry will have a newly-created logger.
- :rtype: :class:`google.cloud.logging.entries.TextEntry`
+ :rtype: :class:`google.cloud.logging.entries._BaseEntry`
:returns: Text entry parsed from ``resource``.
"""
if loggers is None:
@@ -144,9 +145,45 @@ class ProtobufEntry(_BaseEntry):
See:
https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry
+
+ :type payload: str, dict or any_pb2.Any
+ :param payload: The payload passed as ``textPayload``, ``jsonPayload``,
+ or ``protoPayload``. This also may be passed as a raw
+ :class:`.any_pb2.Any` if the ``protoPayload`` could
+ not be deserialized.
+
+ :type logger: :class:`~google.cloud.logging.logger.Logger`
+ :param logger: the logger used to write the entry.
+
+ :type insert_id: str
+ :param insert_id: (optional) the ID used to identify an entry uniquely.
+
+ :type timestamp: :class:`datetime.datetime`
+ :param timestamp: (optional) timestamp for the entry
+
+ :type labels: dict
+ :param labels: (optional) mapping of labels for the entry
+
+ :type severity: str
+ :param severity: (optional) severity of event being logged.
+
+ :type http_request: dict
+ :param http_request: (optional) info about HTTP request associated with
+ the entry
"""
_PAYLOAD_KEY = 'protoPayload'
+ def __init__(self, payload, logger, insert_id=None, timestamp=None,
+ labels=None, severity=None, http_request=None):
+ super(ProtobufEntry, self).__init__(
+ payload, logger, insert_id=insert_id, timestamp=timestamp,
+ labels=labels, severity=severity, http_request=http_request)
+ if isinstance(self.payload, any_pb2.Any):
+ self.payload_pb = self.payload
+ self.payload = None
+ else:
+ self.payload_pb = None
+
def parse_message(self, message):
"""Parse payload into a protobuf message.
@@ -155,4 +192,7 @@ def parse_message(self, message):
:type message: Protobuf message
:param message: the message to be logged
"""
+ # NOTE: This assumes that ``payload`` is already a deserialized
+ # ``Any`` field and ``message`` has come from an imported
+ # ``pb2`` module with the relevant protobuf message type.
Parse(json.dumps(self.payload), message)
diff --git a/logging/google/cloud/logging/logger.py b/logging/google/cloud/logging/logger.py
--- a/logging/google/cloud/logging/logger.py
+++ b/logging/google/cloud/logging/logger.py
@@ -14,9 +14,7 @@
"""Define API Loggers."""
-import json
-
-from google.protobuf.json_format import MessageToJson
+from google.protobuf.json_format import MessageToDict
from google.cloud._helpers import _datetime_to_rfc3339
@@ -106,24 +104,24 @@ def _make_entry_resource(self, text=None, info=None, message=None,
:type info: dict
:param info: (Optional) struct payload
- :type message: Protobuf message or :class:`NoneType`
- :param message: protobuf payload
+ :type message: :class:`~google.protobuf.message.Message`
+ :param message: (Optional) The protobuf payload to log.
:type labels: dict
:param labels: (Optional) labels passed in to calling method.
:type insert_id: str
- :param insert_id: (optional) unique ID for log entry.
+ :param insert_id: (Optional) unique ID for log entry.
:type severity: str
- :param severity: (optional) severity of event being logged.
+ :param severity: (Optional) severity of event being logged.
:type http_request: dict
- :param http_request: (optional) info about HTTP request associated with
+ :param http_request: (Optional) info about HTTP request associated with
the entry
:type timestamp: :class:`datetime.datetime`
- :param timestamp: (optional) timestamp of event being logged.
+ :param timestamp: (Optional) timestamp of event being logged.
:rtype: dict
:returns: The JSON resource created.
@@ -140,9 +138,13 @@ def _make_entry_resource(self, text=None, info=None, message=None,
resource['jsonPayload'] = info
if message is not None:
- as_json_str = MessageToJson(message)
- as_json = json.loads(as_json_str)
- resource['protoPayload'] = as_json
+ # NOTE: If ``message`` contains an ``Any`` field with an
+ # unknown type, this will fail with a ``TypeError``.
+ # However, since ``message`` will be provided by a user,
+ # the assumption is that any types needed for the
+ # protobuf->JSON conversion will be known from already
+ # imported ``pb2`` modules.
+ resource['protoPayload'] = MessageToDict(message)
if labels is None:
labels = self.labels
@@ -245,8 +247,8 @@ def log_proto(self, message, client=None, labels=None, insert_id=None,
See:
https://cloud.google.com/logging/docs/reference/v2/rest/v2/entries/list
- :type message: Protobuf message
- :param message: the message to be logged
+ :type message: :class:`~google.protobuf.message.Message`
+ :param message: The protobuf message to be logged.
:type client: :class:`~google.cloud.logging.client.Client` or
``NoneType``
@@ -462,9 +464,13 @@ def commit(self, client=None):
elif entry_type == 'struct':
info = {'jsonPayload': entry}
elif entry_type == 'proto':
- as_json_str = MessageToJson(entry)
- as_json = json.loads(as_json_str)
- info = {'protoPayload': as_json}
+ # NOTE: If ``entry`` contains an ``Any`` field with an
+ # unknown type, this will fail with a ``TypeError``.
+ # However, since ``entry`` was provided by a user in
+ # ``Batch.log_proto``, the assumption is that any types
+ # needed for the protobuf->JSON conversion will be known
+ # from already imported ``pb2`` modules.
+ info = {'protoPayload': MessageToDict(entry)}
else:
raise ValueError('Unknown entry type: %s' % (entry_type,))
if labels is not None:
| New 'MessageToDict' implementation breaks with unknown entry types
```python
>>> from google.cloud.logging import Client
>>> client = Client()
>>> def do_something_with(entry): pass
...
>>> for entry in client.list_entries():
... do_something_with(entry)
...
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/cloud/iterator.py", line 211, in _items_iter
for item in page:
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/cloud/iterator.py", line 155, in next
result = self._item_to_value(self._parent, item)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/cloud/logging/_gax.py", line 478, in _item_to_entry
resource = MessageToDict(entry_pb)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/protobuf/json_format.py", line 133, in MessageToDict
return printer._MessageToJsonObject(message)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/protobuf/json_format.py", line 164, in _MessageToJsonObject
return self._RegularMessageToJsonObject(message, js)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/protobuf/json_format.py", line 196, in _RegularMessageToJsonObject
js[name] = self._FieldToJsonObject(field, value)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/protobuf/json_format.py", line 230, in _FieldToJsonObject
return self._MessageToJsonObject(value)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/protobuf/json_format.py", line 162, in _MessageToJsonObject
return methodcaller(_WKTJSONMETHODS[full_name][0], message)(self)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/protobuf/json_format.py", line 266, in _AnyMessageToJsonObject
sub_message = _CreateMessageFromTypeUrl(type_url)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/protobuf/json_format.py", line 341, in _CreateMessageFromTypeUrl
'Can not find message descriptor by type_url: {0}.'.format(type_url))
TypeError: Can not find message descriptor by type_url: type.googleapis.com/google.cloud.audit.AuditLog.
```
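The fix (in the patch above) catches the ``TypeError`` and preserves the raw ``Any`` payload instead of failing the whole entry. A protobuf-free sketch of that fallback pattern, with plain objects standing in for the protobuf types:

```python
class RawAny(object):
    """Stand-in for an ``Any`` protobuf whose type URL is unregistered."""

    def __init__(self, type_url):
        self.type_url = type_url


def to_dict(entry):
    """Stand-in for ``MessageToDict``: raises on unregistered Any values,
    just as ``json_format`` does for the AuditLog type above."""
    result = {}
    for key, value in entry.items():
        if isinstance(value, RawAny):
            raise TypeError(
                'Can not find message descriptor by type_url: %s.'
                % value.type_url)
        result[key] = value
    return result


def parse_log_entry(entry):
    """Try the full conversion; on failure, strip the proto payload,
    convert the rest, and re-attach the raw ``Any``."""
    try:
        return to_dict(entry)
    except TypeError:
        if 'proto_payload' in entry:
            entry = dict(entry)  # do not mutate the caller's mapping
            payload = entry.pop('proto_payload')
            mapping = to_dict(entry)
            mapping['protoPayload'] = payload
            return mapping
        raise
```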
| @tseaver Yup, @bjwatson had reported this earlier and I confirmed with @waprin yesterday. Sorry for not filing. I'd been meaning to resolve this.
@tseaver @dhermes We're tracking this in https://github.com/googleapis/googleapis/issues/187.
What's the relative priority? We're focused on landing our beta surfaces this month, and the related issue https://github.com/GoogleCloudPlatform/google-cloud-python/issues/2572 took me down a long tangential path for the past week.
@bjwatson This is something that's my fault, not your fault. There isn't anything you need to fix, we'll always encounter types we don't know about. The proper behavior is just to leave them as `Any` instances, and I'll make that fix in `google-cloud-python`
@bjwatson I can exercise the new logging snippets with GRPC disabled for now.
Ok, thanks @dhermes and @tseaver.
ran into this issue as well. Just to document, you can disable gRPC using
```
os.environ["GOOGLE_CLOUD_DISABLE_GRPC"] = "true"
```
I may just end up reverting #2600.
It's fairly relevant that the **only** `Any` in the `logging` protobufs [is `LogEntry.proto_payload`][1] (which is part of the `payload` oneof)
Currently digging in to a [nice repro][2] provided by @Fkawala, I made it totally distinct of this library [here in a gist][3].
[1]: https://github.com/googleapis/googleapis/blob/c74a9367216c44b3ba8d8e3913d497d5b3a0be16/google/logging/v2/log_entry.proto#L68
[2]: https://github.com/GoogleCloudPlatform/google-cloud-python/issues/2572#issue-184013760
[3]: https://gist.github.com/dhermes/b147eebe41c28976fe0dd2d68694a7d2 | 2017-04-03T23:40:30Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/cloud/iterator.py", line 211, in _items_iter
for item in page:
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/cloud/iterator.py", line 155, in next
result = self._item_to_value(self._parent, item)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/cloud/logging/_gax.py", line 478, in _item_to_entry
resource = MessageToDict(entry_pb)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/protobuf/json_format.py", line 133, in MessageToDict
return printer._MessageToJsonObject(message)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/protobuf/json_format.py", line 164, in _MessageToJsonObject
return self._RegularMessageToJsonObject(message, js)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/protobuf/json_format.py", line 196, in _RegularMessageToJsonObject
js[name] = self._FieldToJsonObject(field, value)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/protobuf/json_format.py", line 230, in _FieldToJsonObject
return self._MessageToJsonObject(value)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/protobuf/json_format.py", line 162, in _MessageToJsonObject
return methodcaller(_WKTJSONMETHODS[full_name][0], message)(self)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/protobuf/json_format.py", line 266, in _AnyMessageToJsonObject
sub_message = _CreateMessageFromTypeUrl(type_url)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/.tox/docs/lib/python2.7/site-packages/google/protobuf/json_format.py", line 341, in _CreateMessageFromTypeUrl
'Can not find message descriptor by type_url: {0}.'.format(type_url))
TypeError: Can not find message descriptor by type_url: type.googleapis.com/google.cloud.audit.AuditLog.
| 5,972 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-3426 | 5b6eb55e016422b36dfe7d6bbcf765dff18389a4 | diff --git a/bigquery/google/cloud/bigquery/_helpers.py b/bigquery/google/cloud/bigquery/_helpers.py
--- a/bigquery/google/cloud/bigquery/_helpers.py
+++ b/bigquery/google/cloud/bigquery/_helpers.py
@@ -21,6 +21,7 @@
from google.cloud._helpers import UTC
from google.cloud._helpers import _date_from_iso8601_date
from google.cloud._helpers import _datetime_from_microseconds
+from google.cloud._helpers import _microseconds_from_datetime
from google.cloud._helpers import _RFC3339_NO_FRACTION
from google.cloud._helpers import _time_from_iso8601_time_naive
from google.cloud._helpers import _to_bytes
@@ -122,6 +123,38 @@ def _record_from_json(value, field):
}
+def _row_from_json(row, schema):
+ """Convert JSON row data to row with appropriate types.
+
+ Note: ``row['f']`` and ``schema`` are presumed to be of the same length.
+
+ :type row: dict
+ :param row: A JSON response row to be converted.
+
+ :type schema: tuple
+ :param schema: A tuple of
+ :class:`~google.cloud.bigquery.schema.SchemaField`.
+
+ :rtype: tuple
+ :returns: A tuple of data converted to native types.
+ """
+ row_data = []
+ for field, cell in zip(schema, row['f']):
+ converter = _CELLDATA_FROM_JSON[field.field_type]
+ if field.mode == 'REPEATED':
+ row_data.append([converter(item['v'], field)
+ for item in cell['v']])
+ else:
+ row_data.append(converter(cell['v'], field))
+
+ return tuple(row_data)
+
+
+def _rows_from_json(rows, schema):
+ """Convert JSON row data to rows with appropriate types."""
+ return [_row_from_json(row, schema) for row in rows]
+
+
def _int_to_json(value):
"""Coerce 'value' to an JSON-compatible representation."""
if isinstance(value, int):
@@ -148,8 +181,11 @@ def _bytes_to_json(value):
return value
-def _timestamp_to_json(value):
- """Coerce 'value' to an JSON-compatible representation."""
+def _timestamp_to_json_parameter(value):
+ """Coerce 'value' to an JSON-compatible representation.
+
+ This version returns the string representation used in query parameters.
+ """
if isinstance(value, datetime.datetime):
if value.tzinfo not in (None, UTC):
# Convert to UTC and remove the time zone info.
@@ -159,6 +195,16 @@ def _timestamp_to_json(value):
return value
+def _timestamp_to_json_row(value):
+ """Coerce 'value' to an JSON-compatible representation.
+
+ This version returns floating-point seconds value used in row data.
+ """
+ if isinstance(value, datetime.datetime):
+ value = _microseconds_from_datetime(value) * 1e-6
+ return value
+
+
def _datetime_to_json(value):
"""Coerce 'value' to an JSON-compatible representation."""
if isinstance(value, datetime.datetime):
@@ -180,7 +226,8 @@ def _time_to_json(value):
return value
-_SCALAR_VALUE_TO_JSON = {
+# Converters used for scalar values marshalled as row data.
+_SCALAR_VALUE_TO_JSON_ROW = {
'INTEGER': _int_to_json,
'INT64': _int_to_json,
'FLOAT': _float_to_json,
@@ -188,41 +235,16 @@ def _time_to_json(value):
'BOOLEAN': _bool_to_json,
'BOOL': _bool_to_json,
'BYTES': _bytes_to_json,
- 'TIMESTAMP': _timestamp_to_json,
+ 'TIMESTAMP': _timestamp_to_json_row,
'DATETIME': _datetime_to_json,
'DATE': _date_to_json,
'TIME': _time_to_json,
}
-def _row_from_json(row, schema):
- """Convert JSON row data to row with appropriate types.
-
- :type row: dict
- :param row: A JSON response row to be converted.
-
- :type schema: tuple
- :param schema: A tuple of
- :class:`~google.cloud.bigquery.schema.SchemaField`.
-
- :rtype: tuple
- :returns: A tuple of data converted to native types.
- """
- row_data = []
- for field, cell in zip(schema, row['f']):
- converter = _CELLDATA_FROM_JSON[field.field_type]
- if field.mode == 'REPEATED':
- row_data.append([converter(item['v'], field)
- for item in cell['v']])
- else:
- row_data.append(converter(cell['v'], field))
-
- return tuple(row_data)
-
-
-def _rows_from_json(rows, schema):
- """Convert JSON row data to rows with appropriate types."""
- return [_row_from_json(row, schema) for row in rows]
+# Converters used for scalar values marshalled as query parameters.
+_SCALAR_VALUE_TO_JSON_PARAM = _SCALAR_VALUE_TO_JSON_ROW.copy()
+_SCALAR_VALUE_TO_JSON_PARAM['TIMESTAMP'] = _timestamp_to_json_parameter
class _ConfigurationProperty(object):
@@ -420,7 +442,7 @@ def to_api_repr(self):
:returns: JSON mapping
"""
value = self.value
- converter = _SCALAR_VALUE_TO_JSON.get(self.type_)
+ converter = _SCALAR_VALUE_TO_JSON_PARAM.get(self.type_)
if converter is not None:
value = converter(value)
resource = {
@@ -506,7 +528,7 @@ def to_api_repr(self):
a_values = [repr_['parameterValue'] for repr_ in reprs]
else:
a_type = {'type': self.array_type}
- converter = _SCALAR_VALUE_TO_JSON.get(self.array_type)
+ converter = _SCALAR_VALUE_TO_JSON_PARAM.get(self.array_type)
if converter is not None:
values = [converter(value) for value in values]
a_values = [{'value': value} for value in values]
@@ -600,7 +622,7 @@ def to_api_repr(self):
values[name] = repr_['parameterValue']
else:
s_types[name] = {'name': name, 'type': {'type': type_}}
- converter = _SCALAR_VALUE_TO_JSON.get(type_)
+ converter = _SCALAR_VALUE_TO_JSON_PARAM.get(type_)
if converter is not None:
value = converter(value)
values[name] = {'value': value}
diff --git a/bigquery/google/cloud/bigquery/table.py b/bigquery/google/cloud/bigquery/table.py
--- a/bigquery/google/cloud/bigquery/table.py
+++ b/bigquery/google/cloud/bigquery/table.py
@@ -22,10 +22,10 @@
import six
from google.cloud._helpers import _datetime_from_microseconds
-from google.cloud._helpers import _microseconds_from_datetime
from google.cloud._helpers import _millis_from_datetime
from google.cloud.exceptions import NotFound
from google.cloud.exceptions import make_exception
+from google.cloud.iterator import HTTPIterator
from google.cloud.streaming.exceptions import HttpError
from google.cloud.streaming.http_wrapper import Request
from google.cloud.streaming.http_wrapper import make_api_request
@@ -33,7 +33,7 @@
from google.cloud.streaming.transfer import Upload
from google.cloud.bigquery.schema import SchemaField
from google.cloud.bigquery._helpers import _row_from_json
-from google.cloud.iterator import HTTPIterator
+from google.cloud.bigquery._helpers import _SCALAR_VALUE_TO_JSON_ROW
_TABLE_HAS_NO_SCHEMA = "Table has no schema: call 'table.reload()'"
@@ -673,6 +673,9 @@ def fetch_data(self, max_results=None, page_token=None, client=None):
(this is distinct from the total number of rows in the
current page: ``iterator.page.num_items``).
"""
+ if len(self._schema) == 0:
+ raise ValueError(_TABLE_HAS_NO_SCHEMA)
+
client = self._require_client(client)
path = '%s/data' % (self.path,)
iterator = HTTPIterator(client=client, path=path,
@@ -741,11 +744,9 @@ def insert_data(self,
row_info = {}
for field, value in zip(self._schema, row):
- if field.field_type == 'TIMESTAMP':
- # BigQuery stores TIMESTAMP data internally as a
- # UNIX timestamp with microsecond precision.
- # Specifies the number of seconds since the epoch.
- value = _convert_timestamp(value)
+ converter = _SCALAR_VALUE_TO_JSON_ROW.get(field.field_type)
+ if converter is not None: # STRING doesn't need converting
+ value = converter(value)
row_info[field.name] = value
info = {'json': row_info}
@@ -1128,10 +1129,3 @@ class _UrlBuilder(object):
def __init__(self):
self.query_params = {}
self._relative_path = ''
-
-
-def _convert_timestamp(value):
- """Helper for :meth:`Table.insert_data`."""
- if isinstance(value, datetime.datetime):
- value = _microseconds_from_datetime(value) * 1e-6
- return value
| 'Table.insert_data' does not support Datetime values
Seems like the python api library does not support native python datetime objects for inserting BigQuery DateTime-values.
```
INFO:root:Sending load request
[(1L, 'as', datetime.datetime(2014, 2, 1, 0, 0), 'asdasd'),
(2L, 'ds', datetime.datetime(2014, 2, 2, 0, 0), 'asdsad')]
Traceback (most recent call last):
...
File "/home/chris/.local/lib/python2.7/site-packages/google/cloud/bigquery/table.py", line 770, in insert_data
data=data)
File "/home/chris/.local/lib/python2.7/site-packages/google/cloud/_http.py", line 326, in api_request
data = json.dumps(data)
File "/usr/lib/python2.7/json/__init__.py", line 244, in dumps
return _default_encoder.encode(obj)
File "/usr/lib/python2.7/json/encoder.py", line 207, in encode
chunks = self.iterencode(o, _one_shot=True)
File "/usr/lib/python2.7/json/encoder.py", line 270, in iterencode
return _iterencode(o, 0)
File "/usr/lib/python2.7/json/encoder.py", line 184, in default
raise TypeError(repr(o) + " is not JSON serializable")
TypeError: datetime.datetime(2014, 2, 1, 0, 0) is not JSON serializable
```
| Thanks for reporting @inkrement!
@inkrement I haven't been able to reproduce this locally. Are you on the latest version of the library?
I installed it directly from the pip repository. But I don't use the library anymore so I don't know :/
@daspecster Make sure there is a test for this case. If there already is, point to it and close this. If there is not, make one and post a PR.
I believe this is covered in [this](https://github.com/GoogleCloudPlatform/google-cloud-python/blob/7b0188f07b296b76e012fed7e1f6a12402117f5f/system_tests/bigquery.py#L283) system test.
@daspecster Perhaps your environment is different? I'm still seeing the issue, and it's exceedingly easy to reproduce.
1. Create a Python2.7 virtualenv. Activate it, obviously.
2. pip install google-cloud-bigquery (currently resolves to the latest version on pip: google_cloud_bigquery-0.23.0-py2.py3-none-any)
3. Make sure you're authenticated and create a test project. Do this in python interactively:
```
>>> from google.cloud import bigquery
>>> import datetime
>>> client = bigquery.Client(project='myproject-1803')
>>> dataset = client.dataset('testset')
>>> dataset.create()
>>> table = dataset.table('test_datefield')
>>> table.schema = (bigquery.SchemaField('name', 'STRING'), bigquery.SchemaField('birthday', 'DATETIME'))
>>> table.create()
>>> table.insert_data(rows=[('Ken', datetime.datetime.now())])
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/kkinder/bigquerytest/testenv/lib/python2.7/site-packages/google/cloud/bigquery/table.py", line 770, in insert_data
data=data)
File "/Users/kkinder/bigquerytest/testenv/lib/python2.7/site-packages/google/cloud/_http.py", line 294, in api_request
data = json.dumps(data)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/__init__.py", line 243, in dumps
return _default_encoder.encode(obj)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/encoder.py", line 207, in encode
chunks = self.iterencode(o, _one_shot=True)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/encoder.py", line 270, in iterencode
return _iterencode(o, 0)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/encoder.py", line 184, in default
raise TypeError(repr(o) + " is not JSON serializable")
TypeError: datetime.datetime(2017, 3, 30, 14, 23, 50, 699725) is not JSON serializable
```
It's also a pretty common thing with Python's json library. Any time you pass a datetime object, this happens. The question is, what is the other end expecting for formatting? | 2017-05-16T20:57:50Z | [] | [] |
Traceback (most recent call last):
...
File "/home/chris/.local/lib/python2.7/site-packages/google/cloud/bigquery/table.py", line 770, in insert_data
data=data)
File "/home/chris/.local/lib/python2.7/site-packages/google/cloud/_http.py", line 326, in api_request
data = json.dumps(data)
File "/usr/lib/python2.7/json/__init__.py", line 244, in dumps
return _default_encoder.encode(obj)
File "/usr/lib/python2.7/json/encoder.py", line 207, in encode
chunks = self.iterencode(o, _one_shot=True)
File "/usr/lib/python2.7/json/encoder.py", line 270, in iterencode
return _iterencode(o, 0)
File "/usr/lib/python2.7/json/encoder.py", line 184, in default
raise TypeError(repr(o) + " is not JSON serializable")
TypeError: datetime.datetime(2014, 2, 1, 0, 0) is not JSON serializable
| 5,994 |
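The patch above resolves the `TypeError` by converting `TIMESTAMP` cells before serialization rather than teaching `json.dumps` about datetimes. A minimal stdlib-only sketch of that conversion follows; `timestamp_to_json_row` is a stand-in name for the helper the patch installs as `_timestamp_to_json_row` (the real one delegates to `google.cloud._helpers._microseconds_from_datetime`):

```python
import calendar
import datetime
import json


def timestamp_to_json_row(value):
    """Coerce a TIMESTAMP cell into the streaming-insert wire format.

    BigQuery stores TIMESTAMP data internally as a UNIX timestamp with
    microsecond precision, so rows carry floating-point seconds since the
    epoch instead of datetime objects (which json.dumps rejects outright).
    """
    if isinstance(value, datetime.datetime):
        if value.tzinfo is not None:
            # Normalize aware datetimes to naive UTC before computing.
            value = value.astimezone(datetime.timezone.utc).replace(tzinfo=None)
        micros = calendar.timegm(value.timetuple()) * 10**6 + value.microsecond
        return micros * 1e-6
    return value


row = {'name': 'Ken',
       'birthday': timestamp_to_json_row(datetime.datetime(2014, 2, 1, 0, 0))}
print(json.dumps(row))  # {"name": "Ken", "birthday": 1391212800.0}
```

Non-datetime values pass through untouched, which matches the patch's "STRING doesn't need converting" behavior in `Table.insert_data`.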
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-3437 | 937165ed2d9f500b0de53ab27d46cdf06739e923 | diff --git a/spanner/google/cloud/spanner/session.py b/spanner/google/cloud/spanner/session.py
--- a/spanner/google/cloud/spanner/session.py
+++ b/spanner/google/cloud/spanner/session.py
@@ -311,7 +311,9 @@ def run_in_transaction(self, func, *args, **kw):
_delay_until_retry(exc, deadline)
del self._transaction
else:
- return txn.committed
+ committed = txn.committed
+ del self._transaction
+ return committed
# pylint: disable=misplaced-bare-raise
| Calling run_in_transaction twice causes the second call to fail
It seems that the `_transaction` object in Session is not being reset once the transaction commits. This leads to a "Transaction is already committed" error. Here is a simple code to demonstrate this:
```python
def run_txn():
from google.cloud import spanner
spanner_client = spanner.Client()
instance_id = 'your-instance'
instance = spanner_client.instance(instance_id)
database_id = 'example-db'
database = instance.database(database_id)
def f(t):
t.insert_or_update(
table="Singers",
columns=('SingerId', 'FirstName'),
values = [(6, "A"),])
database.run_in_transaction(f)
database.run_in_transaction(f)
if __name__ == '__main__':
run_txn()
```
The error that is raised is:
```python
Traceback (most recent call last):
File "txn.py", line 17, in <module>
run_txn()
File "txn.py", line 14, in run_txn
database.run_in_transaction(f)
File "/usr/local/google/home/vikask/venvs/pythonq/local/lib/python2.7/site-packages/google/cloud/spanner/database.py", line 379, in run_in_transaction
return session.run_in_transaction(func, *args, **kw)
File "/usr/local/google/home/vikask/venvs/pythonq/local/lib/python2.7/site-packages/google/cloud/spanner/session.py", line 309, in run_in_transaction
txn.commit()
File "/usr/local/google/home/vikask/venvs/pythonq/local/lib/python2.7/site-packages/google/cloud/spanner/transaction.py", line 104, in commit
self._check_state()
File "/usr/local/google/home/vikask/venvs/pythonq/local/lib/python2.7/site-packages/google/cloud/spanner/transaction.py", line 46, in _check_state
raise ValueError("Transaction is already committed")
ValueError: Transaction is already committed
```
| The version of google-cloud-spanner I used was 0.24.1 | 2017-05-17T20:14:14Z | [] | [] |
Traceback (most recent call last):
File "txn.py", line 17, in <module>
run_txn()
File "txn.py", line 14, in run_txn
database.run_in_transaction(f)
File "/usr/local/google/home/vikask/venvs/pythonq/local/lib/python2.7/site-packages/google/cloud/spanner/database.py", line 379, in run_in_transaction
return session.run_in_transaction(func, *args, **kw)
File "/usr/local/google/home/vikask/venvs/pythonq/local/lib/python2.7/site-packages/google/cloud/spanner/session.py", line 309, in run_in_transaction
txn.commit()
File "/usr/local/google/home/vikask/venvs/pythonq/local/lib/python2.7/site-packages/google/cloud/spanner/transaction.py", line 104, in commit
self._check_state()
File "/usr/local/google/home/vikask/venvs/pythonq/local/lib/python2.7/site-packages/google/cloud/spanner/transaction.py", line 46, in _check_state
raise ValueError("Transaction is already committed")
ValueError: Transaction is already committed
| 5,997 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-3443 | 4f584931c4e2abe60d80fa7a51ef4c6ee662874b | diff --git a/pubsub/google/cloud/pubsub/_gax.py b/pubsub/google/cloud/pubsub/_gax.py
--- a/pubsub/google/cloud/pubsub/_gax.py
+++ b/pubsub/google/cloud/pubsub/_gax.py
@@ -42,6 +42,9 @@
from google.cloud.pubsub.subscription import Subscription
from google.cloud.pubsub.topic import Topic
+_CONFLICT_ERROR_CODES = (
+ StatusCode.FAILED_PRECONDITION, StatusCode.ALREADY_EXISTS)
+
class _PublisherAPI(object):
"""Helper mapping publisher-related APIs.
@@ -105,7 +108,7 @@ def topic_create(self, topic_path):
try:
topic_pb = self._gax_api.create_topic(topic_path)
except GaxError as exc:
- if exc_to_code(exc.cause) == StatusCode.FAILED_PRECONDITION:
+ if exc_to_code(exc.cause) in _CONFLICT_ERROR_CODES:
raise Conflict(topic_path)
raise
return {'name': topic_pb.name}
@@ -337,7 +340,7 @@ def subscription_create(self, subscription_path, topic_path,
retain_acked_messages=retain_acked_messages,
message_retention_duration=message_retention_duration)
except GaxError as exc:
- if exc_to_code(exc.cause) == StatusCode.FAILED_PRECONDITION:
+ if exc_to_code(exc.cause) in _CONFLICT_ERROR_CODES:
raise Conflict(topic_path)
raise
return MessageToDict(sub_pb)
@@ -584,7 +587,7 @@ def snapshot_create(self, snapshot_path, subscription_path):
snapshot_pb = self._gax_api.create_snapshot(
snapshot_path, subscription_path)
except GaxError as exc:
- if exc_to_code(exc.cause) == StatusCode.FAILED_PRECONDITION:
+ if exc_to_code(exc.cause) in _CONFLICT_ERROR_CODES:
raise Conflict(snapshot_path)
elif exc_to_code(exc.cause) == StatusCode.NOT_FOUND:
raise NotFound(subscription_path)
| PubSub topic.create method should raise a Conflict error instead of RetryError/GaxError when topic exists
Following the [documentation from PubSub site](https://cloud.google.com/pubsub/docs/reference/libraries#client-libraries-install-python) if you run the example 2x it will generate the error:
```
GaxError(Exception occurred in retry method that was not classified as transient, caused by <_Rendezvous of RPC that terminated with (StatusCode.ALREADY_EXISTS, Resource already exists in the project (resource=my-new-topic).)>) <class 'google.gax.errors.RetryError'>
```
However it should raise Conflict Error (google.cloud.exceptions.Conflict), because the API returns a 409 response.
You can see the status code using API Explorer:
https://developers.google.com/apis-explorer/#search/pubsub/pubsub/v1/pubsub.projects.topics.create
```
409
{
"error": {
"code": 409,
"message": "Resource already exists in the project (resource=my-new-topic).",
"status": "ALREADY_EXISTS",
...
}
}
```
Full traceback on Python:
```
>>> topic.create()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/google/home/lucmult/.virtualenvs/pubsub/local/lib/python2.7/site-packages/google/cloud/pubsub/topic.py", line 155, in create
api.topic_create(topic_path=self.full_name)
File "/usr/local/google/home/lucmult/.virtualenvs/pubsub/local/lib/python2.7/site-packages/google/cloud/pubsub/_gax.py", line 104, in topic_create
topic_pb = self._gax_api.create_topic(topic_path)
File "/usr/local/google/home/lucmult/.virtualenvs/pubsub/local/lib/python2.7/site-packages/google/cloud/gapic/pubsub/v1/publisher_client.py", line 278, in create_topic
return self._create_topic(request, options)
File "/usr/local/google/home/lucmult/.virtualenvs/pubsub/local/lib/python2.7/site-packages/google/gax/api_callable.py", line 419, in inner
return api_caller(api_call, this_settings, request)
File "/usr/local/google/home/lucmult/.virtualenvs/pubsub/local/lib/python2.7/site-packages/google/gax/api_callable.py", line 407, in base_caller
return api_call(*args)
File "/usr/local/google/home/lucmult/.virtualenvs/pubsub/local/lib/python2.7/site-packages/google/gax/api_callable.py", line 368, in inner
return a_func(*args, **kwargs)
File "/usr/local/google/home/lucmult/.virtualenvs/pubsub/local/lib/python2.7/site-packages/google/gax/retry.py", line 126, in inner
' classified as transient', exception)
google.gax.errors.RetryError: GaxError(Exception occurred in retry method that was not classified as transient, caused by <_Rendezvous of RPC that terminated with (StatusCode.ALREADY_EXISTS, Resource already exists in the project (resource=my-new-topic).)>)
```
| @lukesneeringer This is part of a more general question: how / where should we re-wrap `GaxError`?
For anybody affected by this, to handle ALREADY_EXISTS exception you can use the following scheme:
```python
try:
# create topic
except Conflict as error:
# handle the current conflict exception
except RetryError as error:
status_code = error.cause.code()
if status_code == status_code.ALREADY_EXISTS:
# handle the already exists
else:
# nothing that we want just re-raise the exception
raise
```
> @lukesneeringer This is part of a more general question: how / where should we re-wrap GaxError?
The _correct_ answer is to have GAX raise these different kinds of errors, rather than the catch-all `GaxError`. It could import them from `google.cloud.exceptions`.
@lukesneeringer That would be much preferred to the [current ad-hoc approach][1] (only in use in one subpackage, though some [other remappings][2] also exist).
[1]: https://github.com/GoogleCloudPlatform/google-cloud-python/blob/983783b105c1bff4b43e7403b649c4acfa1b9138/datastore/google/cloud/datastore/_gax.py#L98-L115
[2]: https://github.com/GoogleCloudPlatform/google-cloud-python/blob/983783b105c1bff4b43e7403b649c4acfa1b9138/pubsub/google/cloud/pubsub/_gax.py#L103-L107
@dhermes While your team is fixing the issue, can you to fix the [subscription_create issue](https://github.com/GoogleCloudPlatform/google-cloud-python/blob/master/pubsub/google/cloud/pubsub/_gax.py#L316-L323) at the same time?
It's exactly the same issue but for subscription and I completely forgot to mention it in my support case with @lucmult
@lukesneeringer Generalizing the solution from `datastore` requires moving `datastore._gax._catch_remap_gax_error` into `core`, most obviously `core.exceptions`. Doing that requires adding a dependency from `google.cloud.core` to `google.gax`. We have unconditional dependencies on `google.gax` in other packages: would that be a problem for core?
The issue within pubsub seems to be that the back-end API shifted from returning `FAILED_PRECONDITION` to `ALREADY_EXISTS`: we have handling which raises `Conflict` for the older code.
We should add a system test (or extend and existing one) to ensure that we don't regress whatever fix we apply now. | 2017-05-19T19:40:53Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/google/home/lucmult/.virtualenvs/pubsub/local/lib/python2.7/site-packages/google/cloud/pubsub/topic.py", line 155, in create
api.topic_create(topic_path=self.full_name)
File "/usr/local/google/home/lucmult/.virtualenvs/pubsub/local/lib/python2.7/site-packages/google/cloud/pubsub/_gax.py", line 104, in topic_create
topic_pb = self._gax_api.create_topic(topic_path)
File "/usr/local/google/home/lucmult/.virtualenvs/pubsub/local/lib/python2.7/site-packages/google/cloud/gapic/pubsub/v1/publisher_client.py", line 278, in create_topic
return self._create_topic(request, options)
File "/usr/local/google/home/lucmult/.virtualenvs/pubsub/local/lib/python2.7/site-packages/google/gax/api_callable.py", line 419, in inner
return api_caller(api_call, this_settings, request)
File "/usr/local/google/home/lucmult/.virtualenvs/pubsub/local/lib/python2.7/site-packages/google/gax/api_callable.py", line 407, in base_caller
return api_call(*args)
File "/usr/local/google/home/lucmult/.virtualenvs/pubsub/local/lib/python2.7/site-packages/google/gax/api_callable.py", line 368, in inner
return a_func(*args, **kwargs)
File "/usr/local/google/home/lucmult/.virtualenvs/pubsub/local/lib/python2.7/site-packages/google/gax/retry.py", line 126, in inner
' classified as transient', exception)
google.gax.errors.RetryError: GaxError(Exception occurred in retry method that was not classified as transient, caused by <_Rendezvous of RPC that terminated with (StatusCode.ALREADY_EXISTS, Resource already exists in the project (resource=my-new-topic).)>)
| 5,998 |
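The hint thread explains the root cause: the backend shifted from returning `FAILED_PRECONDITION` to `ALREADY_EXISTS` for duplicate resources, so the patch remaps both codes to `Conflict`. That remapping can be sketched without a live gRPC stack; the `StatusCode` values below mirror the real gRPC codes, and `Conflict` stands in for `google.cloud.exceptions.Conflict` (HTTP 409):

```python
import enum


class StatusCode(enum.Enum):
    # Numeric values match the canonical gRPC status codes.
    NOT_FOUND = 5
    ALREADY_EXISTS = 6
    FAILED_PRECONDITION = 9


class Conflict(Exception):
    """Stand-in for google.cloud.exceptions.Conflict (HTTP 409)."""


# Both codes must map to Conflict: older backends returned
# FAILED_PRECONDITION for duplicates, newer ones return ALREADY_EXISTS.
_CONFLICT_ERROR_CODES = (
    StatusCode.FAILED_PRECONDITION, StatusCode.ALREADY_EXISTS)


def remap_create_error(code, resource_path):
    """Raise Conflict when a create call fails because the resource exists."""
    if code in _CONFLICT_ERROR_CODES:
        raise Conflict(resource_path)


try:
    remap_create_error(StatusCode.ALREADY_EXISTS,
                       'projects/p/topics/my-new-topic')
except Conflict as exc:
    print('Conflict:', exc)
```

In the real client the code is extracted from the wrapped `GaxError` via `exc_to_code(exc.cause)` before this membership check, as the patch shows.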
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-3729 | 22252895b203e72ad19a362b4181be4297a7d5f4 | diff --git a/vision/nox.py b/vision/nox.py
--- a/vision/nox.py
+++ b/vision/nox.py
@@ -35,10 +35,16 @@ def unit_tests(session, python_version):
session.install('-e', '.')
# Run py.test against the unit tests.
- session.run('py.test', '--quiet',
- '--cov=google.cloud.vision', '--cov=google.cloud.vision_v1',
- '--cov-append', '--cov-config=.coveragerc', '--cov-report=',
- 'tests/',
+ session.run(
+ 'py.test',
+ '--quiet',
+ '--cov=google.cloud.vision',
+ '--cov=google.cloud.vision_v1',
+ '--cov-append',
+ '--cov-config=.coveragerc',
+ '--cov-report=',
+ 'tests',
+ *session.posargs
)
@@ -63,7 +69,12 @@ def system_tests(session, python_version):
session.install('-e', '.')
# Run py.test against the unit tests.
- session.run('py.test', '--quiet', 'tests/system.py')
+ session.run(
+ 'py.test',
+ '--quiet',
+ os.path.join('tests', 'system.py'),
+ *session.posargs
+ )
@nox.session
@@ -84,7 +95,12 @@ def system_tests_manual_layer(session, python_version):
session.install('-e', '.')
# Run py.test against the unit tests.
- session.run('py.test', '--quiet', 'tests/system_old.py')
+ session.run(
+ 'py.test',
+ '--quiet',
+ os.path.join('tests', 'system_old.py'),
+ *session.posargs
+ )
@nox.session
diff --git a/vision/setup.py b/vision/setup.py
--- a/vision/setup.py
+++ b/vision/setup.py
@@ -25,7 +25,7 @@
readme = readme_file.read()
REQUIREMENTS = [
- 'google-cloud-core >= 0.25.0, < 0.26dev',
+ 'google-cloud-core >= 0.26.0, < 0.27dev',
'google-gax >= 0.15.13, < 0.16dev',
'googleapis-common-protos[grpc] >= 1.5.2, < 2.0dev',
]
| Vision "manual" system tests are broken
https://circleci.com/gh/GoogleCloudPlatform/google-cloud-python/2591
```
nox > Session system_tests(python_version='3.6') successful. :)
nox > Running session system_tests_manual_layer(python_version='2.7')
nox > virtualenv /var/code/gcp/.nox/system_tests_manual_layer-python_version-2-7 -p python2.7
nox > chdir /var/code/gcp/vision
nox > pip install --upgrade pytest ../core/ ../storage/
nox > pip install --upgrade ../test_utils/
nox > pip install --upgrade -e .
nox > py.test --quiet tests/system_old.py
............FFF................
=================================== FAILURES ===================================
_______________ TestVisionClientLabel.test_detect_labels_content _______________
Traceback (most recent call last):
File "/var/code/gcp/vision/tests/system_old.py", line 368, in test_detect_labels_content
self._assert_label(label)
File "/var/code/gcp/vision/tests/system_old.py", line 357, in _assert_label
self.assertIn(label.description, self.DESCRIPTIONS)
File "/usr/local/lib/python2.7/unittest/case.py", line 803, in assertIn
self.fail(self._formatMessage(msg, standardMsg))
File "/usr/local/lib/python2.7/unittest/case.py", line 410, in fail
raise self.failureException(msg)
AssertionError: u'motor vehicle' not found in ('car', 'vehicle', 'land vehicle', 'automotive design', 'wheel', 'automobile make', 'luxury vehicle', 'sports car', 'performance car', 'automotive exterior')
...
```
| This actually looks like the _output_ changed.
As of https://circleci.com/gh/GoogleCloudPlatform/google-cloud-python/2688 / #3720, this is keeping is red.
This is bothering me, so I'm fixing it right now (if I can in under 15m). | 2017-08-04T22:15:13Z | [] | [] |
Traceback (most recent call last):
File "/var/code/gcp/vision/tests/system_old.py", line 368, in test_detect_labels_content
self._assert_label(label)
File "/var/code/gcp/vision/tests/system_old.py", line 357, in _assert_label
self.assertIn(label.description, self.DESCRIPTIONS)
File "/usr/local/lib/python2.7/unittest/case.py", line 803, in assertIn
self.fail(self._formatMessage(msg, standardMsg))
File "/usr/local/lib/python2.7/unittest/case.py", line 410, in fail
raise self.failureException(msg)
AssertionError: u'motor vehicle' not found in ('car', 'vehicle', 'land vehicle', 'automotive design', 'wheel', 'automobile make', 'luxury vehicle', 'sports car', 'performance car', 'automotive exterior')
| 6,030 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-3865 | 830f874996555ccd8f22315b123693a407a7e136 | diff --git a/pubsub/setup.py b/pubsub/setup.py
--- a/pubsub/setup.py
+++ b/pubsub/setup.py
@@ -51,6 +51,7 @@
REQUIREMENTS = [
+ 'google-cloud-core >= 0.27.0, < 0.28dev',
'google-gax >= 0.15.13, < 0.16dev',
'googleapis-common-protos[grpc] >= 1.5.2, < 2.0dev',
'grpc-google-iam-v1 >= 0.11.1, < 0.12dev',
@@ -60,7 +61,7 @@
setup(
name='google-cloud-pubsub',
- version='0.28.0',
+ version='0.28.1',
description='Python Client for Google Cloud Pub/Sub',
long_description=README,
namespace_packages=[
| Workers no longer start, error in pubsub
As of this morning, when I gcloud app deploy, the workers no longer start.
I'm using Google AppEngine, Python Flexible Environment
[2017-08-24 19:41:31 +0000] [9] [ERROR] Exception in worker process
Traceback (most recent call last):
File "/env/lib/python3.5/site-packages/gunicorn/arbiter.py", line 578, in spawn_worker
worker.init_process()
File "/env/lib/python3.5/site-packages/gunicorn/workers/base.py", line 126, in init_process
self.load_wsgi()
File "/env/lib/python3.5/site-packages/gunicorn/workers/base.py", line 135, in load_wsgi
self.wsgi = self.app.wsgi()
File "/env/lib/python3.5/site-packages/gunicorn/app/base.py", line 67, in wsgi
self.callable = self.load()
File "/env/lib/python3.5/site-packages/gunicorn/app/wsgiapp.py", line 65, in load
return self.load_wsgiapp()
File "/env/lib/python3.5/site-packages/gunicorn/app/wsgiapp.py", line 52, in load_wsgiapp
return util.import_app(self.app_uri)
File "/env/lib/python3.5/site-packages/gunicorn/util.py", line 352, in import_app
__import__(module)
File "/home/vmagent/app/main.py", line 2, in <module>
import episteme.factory
File "/home/vmagent/app/episteme/factory.py", line 25, in <module>
import episteme.model
File "/home/vmagent/app/episteme/model/__init__.py", line 662, in <module>
from . import user
File "/home/vmagent/app/episteme/model/user.py", line 563, in <module>
from episteme.mail import EpistemeMailManager
File "/home/vmagent/app/episteme/mail/__init__.py", line 16, in <module>
from google.cloud import pubsub
File "/env/lib/python3.5/site-packages/google/cloud/pubsub.py", line 17, in <module>
from google.cloud.pubsub_v1 import PublisherClient
File "/env/lib/python3.5/site-packages/google/cloud/pubsub_v1/__init__.py", line 18, in <module>
from google.cloud.pubsub_v1.publisher import Client as PublisherClient
File "/env/lib/python3.5/site-packages/google/cloud/pubsub_v1/publisher/__init__.py", line 17, in <module>
from google.cloud.pubsub_v1.publisher.client import Client
File "/env/lib/python3.5/site-packages/google/cloud/pubsub_v1/publisher/client.py", line 27, in <module>
from google.cloud.pubsub_v1.publisher.batch import thread
File "/env/lib/python3.5/site-packages/google/cloud/pubsub_v1/publisher/batch/thread.py", line 22, in <module>
from google.cloud.pubsub_v1.publisher import exceptions
File "/env/lib/python3.5/site-packages/google/cloud/pubsub_v1/publisher/exceptions.py", line 19, in <module>
from google.api.core.exceptions import GoogleAPICallError
ImportError: No module named 'google.api.core'
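The patch above fixes this by adding a floor of `google-cloud-core >= 0.27.0`, so the package providing `google.api.core` is installed. As a general, stdlib-only sketch (module names are illustrative), an application can fail fast with a readable message instead of dying deep inside an import chain like the one in this traceback:

```python
import importlib.util

def require_module(name, hint):
    """Raise a clear ImportError before a deep import chain does."""
    if importlib.util.find_spec(name) is None:
        raise ImportError('missing module %r -- %s' % (name, hint))

# Present in every CPython install, so this passes silently:
require_module('json', 'part of the standard library')

# A deliberately absent module triggers the friendly error:
try:
    require_module('definitely_not_installed_xyz',
                   'pin the dependency that provides it')
except ImportError as exc:
    print(exc)
```

Run at startup, such a check points straight at the missing (or wrongly versioned) dependency rather than at whichever module happened to import it first.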
| Confirmed. Investigating. | 2017-08-24T20:40:09Z | [] | [] |
Traceback (most recent call last):
File "/env/lib/python3.5/site-packages/gunicorn/arbiter.py", line 578, in spawn_worker
worker.init_process()
File "/env/lib/python3.5/site-packages/gunicorn/workers/base.py", line 126, in init_process
self.load_wsgi()
File "/env/lib/python3.5/site-packages/gunicorn/workers/base.py", line 135, in load_wsgi
self.wsgi = self.app.wsgi()
File "/env/lib/python3.5/site-packages/gunicorn/app/base.py", line 67, in wsgi
self.callable = self.load()
File "/env/lib/python3.5/site-packages/gunicorn/app/wsgiapp.py", line 65, in load
return self.load_wsgiapp()
File "/env/lib/python3.5/site-packages/gunicorn/app/wsgiapp.py", line 52, in load_wsgiapp
return util.import_app(self.app_uri)
File "/env/lib/python3.5/site-packages/gunicorn/util.py", line 352, in import_app
__import__(module)
File "/home/vmagent/app/main.py", line 2, in <module>
import episteme.factory
File "/home/vmagent/app/episteme/factory.py", line 25, in <module>
import episteme.model
File "/home/vmagent/app/episteme/model/__init__.py", line 662, in <module>
from . import user
File "/home/vmagent/app/episteme/model/user.py", line 563, in <module>
from episteme.mail import EpistemeMailManager
File "/home/vmagent/app/episteme/mail/__init__.py", line 16, in <module>
from google.cloud import pubsub
File "/env/lib/python3.5/site-packages/google/cloud/pubsub.py", line 17, in <module>
from google.cloud.pubsub_v1 import PublisherClient
File "/env/lib/python3.5/site-packages/google/cloud/pubsub_v1/__init__.py", line 18, in <module>
from google.cloud.pubsub_v1.publisher import Client as PublisherClient
File "/env/lib/python3.5/site-packages/google/cloud/pubsub_v1/publisher/__init__.py", line 17, in <module>
from google.cloud.pubsub_v1.publisher.client import Client
File "/env/lib/python3.5/site-packages/google/cloud/pubsub_v1/publisher/client.py", line 27, in <module>
from google.cloud.pubsub_v1.publisher.batch import thread
File "/env/lib/python3.5/site-packages/google/cloud/pubsub_v1/publisher/batch/thread.py", line 22, in <module>
from google.cloud.pubsub_v1.publisher import exceptions
File "/env/lib/python3.5/site-packages/google/cloud/pubsub_v1/publisher/exceptions.py", line 19, in <module>
from google.api.core.exceptions import GoogleAPICallError
ImportError: No module named 'google.api.core'
| 6,054 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-3954 | 944e8fd09e36e94d6b2fa6dcc46cca120c0c2b53 | diff --git a/logging/google/cloud/logging/handlers/transports/background_thread.py b/logging/google/cloud/logging/handlers/transports/background_thread.py
--- a/logging/google/cloud/logging/handlers/transports/background_thread.py
+++ b/logging/google/cloud/logging/handlers/transports/background_thread.py
@@ -20,7 +20,6 @@
from __future__ import print_function
import atexit
-import copy
import logging
import threading
@@ -254,9 +253,7 @@ class BackgroundThreadTransport(Transport):
def __init__(self, client, name, grace_period=_DEFAULT_GRACE_PERIOD,
batch_size=_DEFAULT_MAX_BATCH_SIZE):
- http = copy.deepcopy(client._http)
- self.client = client.__class__(
- client.project, client._credentials, http)
+ self.client = client
logger = self.client.logger(name)
self.worker = _Worker(logger)
self.worker.start()
| Issue with logging and google-auth
The following code:
```
import google.cloud.logging
import logging
from google.cloud.logging.handlers import CloudLoggingHandler
client = google.cloud.logging.Client()
handler = CloudLoggingHandler(client)
cloud_logger = logging.getLogger()
cloud_logger.setLevel(logging.INFO)
cloud_logger.addHandler(handler)
cloud_logger.addHandler(logging.StreamHandler())
cloud_logger.info('hello')
```
works fine with google-cloud-logging==1.1.0 but fails with newer versions (1.2.0 or 1.3.0) :
```
Failed to submit 1 logs.
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/google/cloud/logging/handlers/transports/background_thread.py", line 99, in _safely_commit_batch
batch.commit()
File "/usr/local/lib/python2.7/dist-packages/google/cloud/logging/logger.py", line 549, in commit
client.logging_api.write_entries(entries, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/google/cloud/logging/_http.py", line 163, in write_entries
self.api_request(method='POST', path='/entries:write', data=data)
File "/usr/local/lib/python2.7/dist-packages/google/cloud/_http.py", line 290, in api_request
headers=headers, target_object=_target_object)
File "/usr/local/lib/python2.7/dist-packages/google/cloud/_http.py", line 183, in _make_request
return self._do_request(method, url, headers, data, target_object)
File "/usr/local/lib/python2.7/dist-packages/google/cloud/_http.py", line 212, in _do_request
url=url, method=method, headers=headers, data=data)
File "/usr/local/lib/python2.7/dist-packages/google/auth/transport/requests.py", line 175, in request
self.credentials.before_request(
AttributeError: 'AuthorizedSession' object has no attribute 'credentials'
```
| Trying to repro right now.
@tcroiset I can't reproduce. Can you step into a debugger and inspect `self` / `pp self.__dict__` at the moment of failure? I usually just run code in IPython and then hop into the stacktrace with the `%debug` magic.
```
self:
<google.auth.transport.requests.AuthorizedSession object at 0x7fce6777bb90>
```
```
self.__dict__:
{
'cookies': < RequestsCookieJar[] > ,
'stream': False,
'hooks': {
'response': []
},
'auth': None,
'trust_env': True,
'headers': {
'Connection': 'keep-alive',
'Accept-Encoding': 'gzip, deflate',
'Accept': '*/*',
'User-Agent': 'python-requests/2.18.4'
},
'cert': None,
'params': {},
'prefetch': None,
'verify': True,
'proxies': {},
'adapters': OrderedDict([('https://', < requests.adapters.HTTPAdapter object at 0x7fce6777bd50 > ), ('http://', < requests.adapters.HTTPAdapter object at 0x7fce6771a090 > )]),
'max_redirects': 30
}
```
@tcroiset Thanks a lot!
@jonparrott Any ideas?
Can I get the output of `pip freeze` @tcroiset?
cachetools==2.0.1
certifi==2017.7.27.1
chardet==3.0.4
colorama==0.3.2
dill==0.2.7.1
enum34==1.1.6
future==0.16.0
futures==3.1.1
gapic-google-cloud-logging-v2==0.91.3
google-auth==1.0.2
google-cloud-core==0.27.1
google-cloud-logging==1.3.0
google-gax==0.15.14
googleapis-common-protos==1.5.2
grpcio==1.6.0
html5lib==0.999
httplib2==0.10.3
idna==2.6
mercurial==3.1.2
oauth2client==3.0.0
ply==3.8
proto-google-cloud-logging-v2==0.91.3
protobuf==3.4.0
pyasn1==0.3.4
pyasn1-modules==0.1.4
requests==2.18.4
rsa==3.4.2
six==1.10.0
urllib3==1.22
virtualenv==15.0.3
Hrm I can't repro either. Can you tell me a bit more info?
* Python version
* OS
* Which type of credentials you're using (`import google.auth; print(google.auth.default())`)
Here is the dockerfile I use
(It's based on an appengine image because in my full code, it's an appengine app with a worker and a monitor like https://cloud.google.com/python/getting-started/tutorial-app)
```
FROM gcr.io/google_appengine/python
RUN apt-get update && apt-get install -y emacs
ADD requirements.txt /app/
RUN pip install -r requirements.txt
ADD . /app
ADD MyServiceAccount.json /cred/
ENV GOOGLE_APPLICATION_CREDENTIALS "/cred/MyServiceAccount.json"
ENV GOOGLE_CLOUD_DISABLE_GRPC true
CMD python my.py
```
Hum, I just noticed that it is linked with GOOGLE_CLOUD_DISABLE_GRPC.
If I remove it, the error doesn't occur.
Ah, it's probably [this line](https://github.com/GoogleCloudPlatform/google-cloud-python/blob/master/logging/google/cloud/logging/handlers/transports/background_thread.py#L257) which is a relic from when httplib2 was used as it wasn't threadsafe. We can probably just remove that.
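The line being removed deep-copied the client because httplib2 connections were not thread-safe; with the requests-based transport, one shared client is enough. A stdlib-only sketch of the enqueue-on-caller, drain-on-worker shape the transport uses (no Google libraries; all names here are illustrative):

```python
import queue
import threading

class BackgroundTransport:
    """Enqueue records on the caller's thread; one worker drains them."""

    def __init__(self, sink):
        self._queue = queue.Queue()
        self._sink = sink  # shared object, only touched from the worker thread
        self._worker = threading.Thread(target=self._drain, daemon=True)
        self._worker.start()

    def send(self, record):
        self._queue.put(record)

    def _drain(self):
        while True:
            record = self._queue.get()
            if record is None:  # shutdown sentinel
                return
            self._sink.append(record)

    def close(self):
        self._queue.put(None)
        self._worker.join()

sink = []
transport = BackgroundTransport(sink)
transport.send('hello')
transport.send('world')
transport.close()
print(sink)  # ['hello', 'world']
```

Because only the worker thread ever uses the sink, the caller's client object never needs copying at all, which is exactly what the eventual fix does.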
@liyanhui1228 Can you take this?
Sure! I can take this. Sorry for the late reply, I'm looking into this right now. | 2017-09-13T23:39:02Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/google/cloud/logging/handlers/transports/background_thread.py", line 99, in _safely_commit_batch
batch.commit()
File "/usr/local/lib/python2.7/dist-packages/google/cloud/logging/logger.py", line 549, in commit
client.logging_api.write_entries(entries, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/google/cloud/logging/_http.py", line 163, in write_entries
self.api_request(method='POST', path='/entries:write', data=data)
File "/usr/local/lib/python2.7/dist-packages/google/cloud/_http.py", line 290, in api_request
headers=headers, target_object=_target_object)
File "/usr/local/lib/python2.7/dist-packages/google/cloud/_http.py", line 183, in _make_request
return self._do_request(method, url, headers, data, target_object)
File "/usr/local/lib/python2.7/dist-packages/google/cloud/_http.py", line 212, in _do_request
url=url, method=method, headers=headers, data=data)
File "/usr/local/lib/python2.7/dist-packages/google/auth/transport/requests.py", line 175, in request
self.credentials.before_request(
AttributeError: 'AuthorizedSession' object has no attribute 'credentials'
| 6,061 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-4015 | eb43849569556c6e47f11b8310864c5a280507f2 | diff --git a/spanner/google/cloud/spanner/streamed.py b/spanner/google/cloud/spanner/streamed.py
--- a/spanner/google/cloud/spanner/streamed.py
+++ b/spanner/google/cloud/spanner/streamed.py
@@ -298,13 +298,15 @@ def _merge_struct(lhs, rhs, type_):
_MERGE_BY_TYPE = {
+ type_pb2.ARRAY: _merge_array,
type_pb2.BOOL: _unmergeable,
- type_pb2.INT64: _merge_string,
+ type_pb2.BYTES: _merge_string,
+ type_pb2.DATE: _merge_string,
type_pb2.FLOAT64: _merge_float64,
+ type_pb2.INT64: _merge_string,
type_pb2.STRING: _merge_string,
- type_pb2.ARRAY: _merge_array,
type_pb2.STRUCT: _merge_struct,
- type_pb2.BYTES: _merge_string,
+ type_pb2.TIMESTAMP: _merge_string,
}
| Google cloud spanner client fails when reading large columns
Consider the following table:
```
CREATE TABLE big_arrays (
key INT64 NOT NULL,
a_big_array ARRAY<TIMESTAMP> NOT NULL,
) PRIMARY KEY (key)
```
It gets populated with 100 rows, each containing a large array of 1000 timestamps.
```
from google.cloud import spanner
import datetime
import random
spanner_client = spanner.Client(project='my-project')
instance = spanner_client.instance('my-instance')
db = instance.database('my-db')
def random_timestamp():
return datetime.datetime.utcfromtimestamp(random.uniform(1e9, 2e9))
def random_big_array(size=1000):
    return [random_timestamp() for i in range(size)]
batch_size = 10
batches = 10
for batch_index in range(batches):
with db.batch() as batch:
batch.insert_or_update(
table='big_arrays',
columns=['key', 'a_big_array'],
values = [(key, random_big_array()) for key in range(batch_index*batch_size, (batch_index+1)*batch_size)]
)
```
Trying to fetch these rows with `db.execute_sql('select * from big_arrays limit 100').consume_all()` results in following error and stacktrace:
```
Traceback (most recent call last):
File "Untitled5.py", line 60, in <module>
db.execute_sql("SELECT * FROM big_arrays limit 100").consume_all()
File "/Users/myname/Repos/transformations/venv/lib/python2.7/site-packages/google/cloud/spanner/streamed.py", line 159, in consume_all
self.consume_next()
File "/Users/myname/Repos/transformations/venv/lib/python2.7/site-packages/google/cloud/spanner/streamed.py", line 148, in consume_next
values[0] = self._merge_chunk(values[0])
File "/Users/myname/Repos/transformations/venv/lib/python2.7/site-packages/google/cloud/spanner/streamed.py", line 108, in _merge_chunk
merged = _merge_by_type(self._pending_chunk, value, field.type)
File "/Users/myname/Repos/transformations/venv/lib/python2.7/site-packages/google/cloud/spanner/streamed.py", line 272, in _merge_by_type
return merger(lhs, rhs, type_)
File "/Users/myname/Repos/transformations/venv/lib/python2.7/site-packages/google/cloud/spanner/streamed.py", line 233, in _merge_array
merged = _merge_by_type(last, first, element_type)
File "/Users/myname/Repos/transformations/venv/lib/python2.7/site-packages/google/cloud/spanner/streamed.py", line 271, in _merge_by_type
merger = _MERGE_BY_TYPE[type_.code]
KeyError: 4
```
Running the above query with the gcloud cli works as intended
`gcloud spanner databases execute-sql my-db --instance=my-instance --sql='select * from big_arrays limit 100' >/dev/null`
add merging methods for date and timestamp chunks
Fix for https://github.com/GoogleCloudPlatform/google-cloud-python/issues/3998
| Issue seems fixed by adding
```
_MERGE_BY_TYPE.update({
type_pb2.DATE: _merge_string,
type_pb2.TIMESTAMP: _merge_string,
})
```
to google.cloud.spanner.streamed
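The `KeyError: 4` happens because the dispatch dict maps type codes to merge functions and code 4 (`TIMESTAMP` in the proto enum) had no entry. A stdlib-only sketch of the same dispatch shape, with the missing entry added and a clearer error for genuinely unknown codes (type codes here are illustrative, loosely mirroring the enum):

```python
def merge_string(lhs, rhs):
    return lhs + rhs

def unmergeable(lhs, rhs):
    raise ValueError('cannot merge chunks of this type')

# Illustrative type codes, loosely mirroring Spanner's TypeCode enum.
BOOL, INT64, TIMESTAMP = 1, 2, 4

MERGE_BY_TYPE = {
    BOOL: unmergeable,
    INT64: merge_string,
    TIMESTAMP: merge_string,  # the entry this bug report found missing
}

def merge_by_type(lhs, rhs, type_code):
    try:
        merger = MERGE_BY_TYPE[type_code]
    except KeyError:
        raise NotImplementedError(
            'no merge strategy for type code %r' % (type_code,))
    return merger(lhs, rhs)

print(merge_by_type('2017-08-', '24', TIMESTAMP))  # 2017-08-24
```

Wrapping the lookup also turns any future gap in the table into an explicit "not implemented" error instead of a bare `KeyError` with just a number.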
cc @tseaver @lukesneeringer
Thanks for filing this. This is duplicate of #3981 .
@paulMissault Do you want to submit your proposed fix as a pull request. It looks reasonable to me.
@lukesneeringer @tseaver Do you agree that this looks like a quick fix? If so, let's get it done before Beta. If it looks much more involved, then let's discuss more.
Thanks for your pull request. It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).
:memo: **Please visit <https://cla.developers.google.com/> to sign.**
Once you've signed, please reply here (e.g. `I signed it!`) and we'll verify. Thanks.
---
- If you've already signed a CLA, it's possible we don't have your GitHub username or you're using a different email address. Check [your existing CLA data](https://cla.developers.google.com/clas) and verify that your [email is set on your git commits](https://help.github.com/articles/setting-your-email-in-git/).
- If your company signed a CLA, they designated a Point of Contact who decides which employees are authorized to participate. You may need to contact the Point of Contact for your company and ask to be added to the group of authorized contributors. If you don't know who your Point of Contact is, direct the project maintainer to go/cla#troubleshoot.
- In order to pass this check, please resolve this problem and have the pull request author add another comment and the bot will run again.
<!-- need_sender_cla -->
Hi @paulMissault,
Thanks so much for doing this. I am ready to merge it.
As the botty-bot notes, we do need you to sign a CLA (basically legally allowing us to use the code), or otherwise my desk might get swarmed with angry lawyers with pitchforks. Okay, not really, but I do need you to sign it before I can hit the big green button.
After that, happy to merge. Thanks! | 2017-09-21T13:49:30Z | [] | [] |
Traceback (most recent call last):
File "Untitled5.py", line 60, in <module>
db.execute_sql("SELECT * FROM big_arrays limit 100").consume_all()
File "/Users/myname/Repos/transformations/venv/lib/python2.7/site-packages/google/cloud/spanner/streamed.py", line 159, in consume_all
self.consume_next()
File "/Users/myname/Repos/transformations/venv/lib/python2.7/site-packages/google/cloud/spanner/streamed.py", line 148, in consume_next
values[0] = self._merge_chunk(values[0])
File "/Users/myname/Repos/transformations/venv/lib/python2.7/site-packages/google/cloud/spanner/streamed.py", line 108, in _merge_chunk
merged = _merge_by_type(self._pending_chunk, value, field.type)
File "/Users/myname/Repos/transformations/venv/lib/python2.7/site-packages/google/cloud/spanner/streamed.py", line 272, in _merge_by_type
return merger(lhs, rhs, type_)
File "/Users/myname/Repos/transformations/venv/lib/python2.7/site-packages/google/cloud/spanner/streamed.py", line 233, in _merge_array
merged = _merge_by_type(last, first, element_type)
File "/Users/myname/Repos/transformations/venv/lib/python2.7/site-packages/google/cloud/spanner/streamed.py", line 271, in _merge_by_type
merger = _MERGE_BY_TYPE[type_.code]
KeyError: 4
| 6,062 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-4381 | 69a9840955caefbd431ee459ae91632b9cf79a16 | diff --git a/storage/google/cloud/storage/bucket.py b/storage/google/cloud/storage/bucket.py
--- a/storage/google/cloud/storage/bucket.py
+++ b/storage/google/cloud/storage/bucket.py
@@ -251,7 +251,7 @@ def exists(self, client=None):
except NotFound:
return False
- def create(self, client=None):
+ def create(self, client=None, project=None):
"""Creates current bucket.
If the bucket already exists, will raise
@@ -265,12 +265,28 @@ def create(self, client=None):
``NoneType``
:param client: Optional. The client to use. If not passed, falls back
to the ``client`` stored on the current bucket.
+
+ :type project: str
+ :param project: (Optional) the project under which the bucket is to
+ be created. If not passed, uses the project set on
+ the client.
+ :raises ValueError: if :attr:`user_project` is set.
+ :raises ValueError: if ``project`` is None and client's
+ :attr:`project` is also None.
"""
if self.user_project is not None:
raise ValueError("Cannot create bucket with 'user_project' set.")
client = self._require_client(client)
- query_params = {'project': client.project}
+
+ if project is None:
+ project = client.project
+
+ if project is None:
+ raise ValueError(
+ "Client project not set: pass an explicit project.")
+
+ query_params = {'project': project}
properties = {key: self._properties[key] for key in self._changes}
properties['name'] = self.name
api_response = client._connection.api_request(
diff --git a/storage/google/cloud/storage/client.py b/storage/google/cloud/storage/client.py
--- a/storage/google/cloud/storage/client.py
+++ b/storage/google/cloud/storage/client.py
@@ -26,10 +26,13 @@
from google.cloud.storage.bucket import Bucket
+_marker = object()
+
+
class Client(ClientWithProject):
"""Client to bundle configuration needed for API requests.
- :type project: str
+ :type project: str or None
:param project: the project which the client acts on behalf of. Will be
passed when creating a topic. If not passed,
falls back to the default inferred from the environment.
@@ -55,10 +58,19 @@ class Client(ClientWithProject):
'https://www.googleapis.com/auth/devstorage.read_write')
"""The scopes required for authenticating as a Cloud Storage consumer."""
- def __init__(self, project=None, credentials=None, _http=None):
+ def __init__(self, project=_marker, credentials=None, _http=None):
self._base_connection = None
+ if project is None:
+ no_project = True
+ project = '<none>'
+ else:
+ no_project = False
+ if project is _marker:
+ project = None
super(Client, self).__init__(project=project, credentials=credentials,
_http=_http)
+ if no_project:
+ self.project = None
self._connection = Connection(self)
self._batch_stack = _LocalStack()
@@ -216,7 +228,7 @@ def lookup_bucket(self, bucket_name):
except NotFound:
return None
- def create_bucket(self, bucket_name, requester_pays=None):
+ def create_bucket(self, bucket_name, requester_pays=None, project=None):
"""Create a new bucket.
For example:
@@ -238,17 +250,22 @@ def create_bucket(self, bucket_name, requester_pays=None):
(Optional) Whether requester pays for API requests for this
bucket and its blobs.
+ :type project: str
+ :param project: (Optional) the project under which the bucket is to
+ be created. If not passed, uses the project set on
+ the client.
+
:rtype: :class:`google.cloud.storage.bucket.Bucket`
:returns: The newly created bucket.
"""
bucket = Bucket(self, name=bucket_name)
if requester_pays is not None:
bucket.requester_pays = requester_pays
- bucket.create(client=self)
+ bucket.create(client=self, project=project)
return bucket
def list_buckets(self, max_results=None, page_token=None, prefix=None,
- projection='noAcl', fields=None):
+ projection='noAcl', fields=None, project=None):
"""Get all buckets in the project associated to the client.
This will not populate the list of blobs available in each
@@ -284,11 +301,24 @@ def list_buckets(self, max_results=None, page_token=None, prefix=None,
response with just the next page token and the language of each
bucket returned: 'items/id,nextPageToken'
+ :type project: str
+ :param project: (Optional) the project whose buckets are to be listed.
+ If not passed, uses the project set on the client.
+
:rtype: :class:`~google.api_core.page_iterator.Iterator`
+ :raises ValueError: if both ``project`` is ``None`` and the client's
+ project is also ``None``.
:returns: Iterator of all :class:`~google.cloud.storage.bucket.Bucket`
belonging to this project.
"""
- extra_params = {'project': self.project}
+ if project is None:
+ project = self.project
+
+ if project is None:
+ raise ValueError(
+ "Client project not set: pass an explicit project.")
+
+ extra_params = {'project': project}
if prefix is not None:
extra_params['prefix'] = prefix
diff --git a/storage/google/cloud/storage/notification.py b/storage/google/cloud/storage/notification.py
--- a/storage/google/cloud/storage/notification.py
+++ b/storage/google/cloud/storage/notification.py
@@ -76,6 +76,11 @@ def __init__(self, bucket, topic_name,
if topic_project is None:
topic_project = bucket.client.project
+
+ if topic_project is None:
+ raise ValueError(
+ "Client project not set: pass an explicit topic_project.")
+
self._topic_project = topic_project
self._properties = {}
| Vision: Google Cloud Storage Client project=None bootstrapping problem
Hello,
In using the Google Cloud API's, I noticed a small discrepancy in the documentation's expected behavior, and the actual behavior.
The documentation and Google cloud examples code typically assumes that credentials should be created using a JSON key store saved on persistent storage. However, this isn't always an option, as with deployments using ephemeral storage. Luckily, a Credentials object can be created directly from a dictionary, which houses the same mapping that the JSON keystore would hold. This is the procedure that I use in my project (the specifics of which, is shown below).
The [Storage.Client documentation](https://googlecloudplatform.github.io/google-cloud-python/latest/storage/client.html#google.cloud.storage.client.Client) suggests that the `project` parameter is optional, and thus may be None. We found, however, that when it is not specified, an error is thrown, as a result of a sort of bootstrapping problem in which the Google Cloud APIs check to see if the project is None, and if so, attempt to create it (which is impossible, without first having the keys).
To elaborate:
1. When we make a call to Storage.Client(), we step into [storage/google/cloud/storage/client.py](https://github.com/GoogleCloudPlatform/google-cloud-python/blob/master/).
2. This class calls `super(Client, self).__init__(project=project, credentials=credentials, _http=_http)`, which takes us to [core/google/cloud/client.py](https://github.com/GoogleCloudPlatform/google-cloud-python/blob/master/core/google/cloud/client.py) .
3. This calls `_ClientProjectMixin.__init__(self, project=project)`, where we see the following call:
```
project = self._determine_default(project)
```
4. Following this call, we see:
```
if project is None:
_, project = google.auth.default()
```
So, there is an attempt to infer the project parameter based on the environment credentials. However, as discussed above, we are using a dictionary-created Credentials object as our credentials, so no such credentials exist in the environment. **This is a problem because the documentation claims that project is optional, when, in fact, it is not if the project is not stored in the environment.**
Note, however, that this seems to be a problem specific to Storage.Client(), and not other API clients. Trying this procedure (see the code below) for `google.cloud.vision.ImageAnnotatorClient` seems to work perfectly fine with just a credentials object, without the project argument.
----
1. **OS:** Ubuntu 16.04
2. **Python Version:** Python 2.7.10
3. **Google APIs:**
google-auth==1.1.1
google-cloud-core==0.27.1
google-cloud-storage==1.4.0
google-cloud-vision==0.27.0
google-gax==0.15.15
google-resumable-media==0.2.3
googleapis-common-protos==1.5.3
oauth2client==4.1.2
4. **Stacktrace:**
```
Traceback (most recent call last):
File "storage_client.py", line 23, in <module>
client = _get_client()
File "storage_client.py", line 18, in _get_client
client = storage.Client(credentials=creds)
File "/Users/admin/testproject/venv/lib/python2.7/site-packages/google/cloud/storage/client.py", line 59, in __init__
_http=_http)
File "/Users/admin/testproject/venv/lib/python2.7/site-packages/google/cloud/client.py", line 211, in __init__
_ClientProjectMixin.__init__(self, project=project)
File "/Users/admin/testproject/venv/lib/python2.7/site-packages/google/cloud/client.py", line 165, in __init__
project = self._determine_default(project)
File "/Users/admin/testproject/venv/lib/python2.7/site-packages/google/cloud/client.py", line 178, in _determine_default
return _determine_default_project(project)
File "/Users/admin/testproject/venv/lib/python2.7/site-packages/google/cloud/_helpers.py", line 179, in _determine_default_project
_, project = google.auth.default()
File "/Users/admin/testproject/venv/lib/python2.7/site-packages/google/auth/_default.py", line 286, in default
raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or
explicitly create credential and re-run the application. For more
information, please see
https://developers.google.com/accounts/docs/application-default-credentials.
```
5. **Steps to reproduce:**
- So, create an OAuth2 credentials object from a python dictionary (let's call it `keyfile_dict`) using `google.oauth2.service_account.Credentials.from_service_account_info(keyfile_dict)`.
- Create an instance of a storage client using `google.cloud.storage.Client(credentials=creds)`
- Although the documentation states that the "project" argument is optional, the API will in fact throw an error when no project argument is presented.
6. Code example
Assume that in the same directory, you create a file, `constants.py`, with the following fields defined from the appropriate fields of the service account key JSON.
constants.GCP_SCOPE
constants.GCP_TYPE
constants.GCP_CLIENT_EMAIL
constants.GCP_PRIVATE_KEY
constants.GCP_PRIVATE_KEY_ID
constants.GCP_CLIENT_ID
constants.GCP_TOKEN_URI
constants.GCP_PROJECT_ID
```
from google.cloud import storage
from google.oauth2 import service_account
import constants
# Return an Oauth2 credentials instance
def get_credentials(with_keyfile=False):
keyfile_dict = _get_keyfile_dict()
scope = [constants.GCP_SCOPE]
creds = service_account.Credentials.from_service_account_info(keyfile_dict).with_scopes(scope)
if not with_keyfile:
return creds
else:
# Return creds and the keyfile.
return creds, keyfile_dict
# Construct a dictionary of the required key:value pairs.
def _get_keyfile_dict():
keyfile_dict = {}
keyfile_dict['type'] = constants.GCP_TYPE
keyfile_dict['client_email'] = constants.GCP_CLIENT_EMAIL
keyfile_dict['private_key'] = constants.GCP_PRIVATE_KEY
keyfile_dict['private_key_id'] = constants.GCP_PRIVATE_KEY_ID
keyfile_dict['client_id'] = constants.GCP_CLIENT_ID
keyfile_dict['token_uri'] = constants.GCP_TOKEN_URI
keyfile_dict['project_id'] = constants.GCP_PROJECT_ID
return keyfile_dict
def _get_client():
creds, keyfile_dict = get_credentials(with_keyfile=True)
project = keyfile_dict['project_id']
#client = storage.Client(project=project, credentials=creds) # This WILL work because a project is passed in.
#client = vision.ImageAnnotatorClient(credentials=creds) #Note that the Vision client will work without a project argument
client = storage.Client(credentials=creds) # This will throw an error because project=None.
return client
# Instantiates a client
client = _get_client()
```
Let me know if you have any other questions. :)
| @jonparrott Is this just a documentation issue, or should the client be willing to infer the project from the passed `credentials`?
This is interesting, because storage effectively doesn't need a project to operate. Buckets are global resources.
Places which touch `client.project`:
- `list_buckets` is the only client method which uses the project: I guess we could add an optional `project` argument, falling back to the client's project. Likely we should raise an exception if neither is set: `project` is a [required parameter for 'buckets.list'](https://cloud.google.com/storage/docs/json_api/v1/buckets/list#project).
- `Bucket.create` is the only bucket method which uses it: it is [likewise required](https://cloud.google.com/storage/docs/json_api/v1/buckets/insert#project), so we could allow an optional `project` argument and fall back / fail in the same way. We would also need to plumb it through to `Client.create_bucket`.
- `BucketNotification.__init__` takes an optional `topic_project` parameter, and falls back to the client's project. We would need to have it raise if neither was set.
To implement this, we could have the default for the `project` argument be a marker, and then treat `None` specially (pass through a dummy to the `super()` version and then assign `None`).
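The marker idea in the previous comment is the standard sentinel pattern for telling "argument omitted" apart from "explicitly passed as None". A stdlib-only illustration (function and defaults are mine, not the library's API):

```python
_marker = object()  # unique sentinel: distinct from None and every user value

def resolve_project(project=_marker):
    """Return the effective project.

    - omitted       -> fall back to an inferred default
    - explicit None -> caller opts out of having a project
    - anything else -> used as-is
    """
    if project is _marker:
        return 'inferred-default'
    return project

print(resolve_project())              # inferred-default
print(resolve_project(None))          # None
print(resolve_project('my-project'))  # my-project
```

Identity comparison (`is _marker`) is what makes this safe: no caller-supplied value, including `None`, can collide with the sentinel.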
Sounds reasonable, @tseaver. | 2017-11-10T18:04:16Z | [] | [] |
Traceback (most recent call last):
File "storage_client.py", line 23, in <module>
client = _get_client()
File "storage_client.py", line 18, in _get_client
client = storage.Client(credentials=creds)
File "/Users/admin/testproject/venv/lib/python2.7/site-packages/google/cloud/storage/client.py", line 59, in __init__
_http=_http)
File "/Users/admin/testproject/venv/lib/python2.7/site-packages/google/cloud/client.py", line 211, in __init__
_ClientProjectMixin.__init__(self, project=project)
File "/Users/admin/testproject/venv/lib/python2.7/site-packages/google/cloud/client.py", line 165, in __init__
project = self._determine_default(project)
File "/Users/admin/testproject/venv/lib/python2.7/site-packages/google/cloud/client.py", line 178, in _determine_default
return _determine_default_project(project)
File "/Users/admin/testproject/venv/lib/python2.7/site-packages/google/cloud/_helpers.py", line 179, in _determine_default_project
_, project = google.auth.default()
File "/Users/admin/testproject/venv/lib/python2.7/site-packages/google/auth/_default.py", line 286, in default
raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or
| 6,095 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-4699 | d16f5c1358569b914e008543e4f3dee64ba66ecb | diff --git a/logging/google/cloud/logging/sink.py b/logging/google/cloud/logging/sink.py
--- a/logging/google/cloud/logging/sink.py
+++ b/logging/google/cloud/logging/sink.py
@@ -27,9 +27,8 @@ class Sink(object):
:param name: the name of the sink
:type filter_: str
- :param filter_: the advanced logs filter expression defining the entries
- exported by the sink. If not passed, the instance should
- already exist, to be refreshed via :meth:`reload`.
+ :param filter_: (optional) the advanced logs filter expression defining
+ the entries exported by the sink.
:type destination: str
:param destination: destination URI for the entries exported by the sink.
@@ -91,8 +90,8 @@ def from_api_repr(cls, resource, client):
from the client.
"""
sink_name = resource['name']
- filter_ = resource['filter']
destination = resource['destination']
+ filter_ = resource.get('filter')
return cls(sink_name, filter_, destination, client=client)
def _require_client(self, client):
@@ -163,8 +162,8 @@ def reload(self, client=None):
"""
client = self._require_client(client)
data = client.sinks_api.sink_get(self.project, self.name)
- self.filter_ = data['filter']
self.destination = data['destination']
+ self.filter_ = data.get('filter')
def update(self, client=None):
"""API call: update sink configuration via a PUT request
| Logging: sink reload throws exception if the sink has no filter
A KeyError exception for 'filter' is thrown when calling [sink].reload() if there is no filter. The API doesn't return a 'filter' key when retrieving the sink if its null.
```
Python 3.5.2
google-cloud-core (0.28.0)
google-cloud-logging (1.4.0)
```
## Simplified code example
```
logging_client = logging.Client(project_id)
sink = logging_client.sink(sink_name)
sink.reload()
```
## Raw API response for sink request (via gcloud)
```
$ gcloud logging sinks list --project xxxx --log-http
=======================
==== request start ====
uri: https://logging.googleapis.com/v2/projects/xxxx/sinks?alt=json
method: GET
[...]
-- body start --
{
"sinks": [
{
"name": "xxxx",
"destination": "storage.googleapis.com/xxxx",
"outputVersionFormat": "V2",
"writerIdentity": "serviceAccount:xxxx"
}
]
}
-- body end --
[...]
NAME DESTINATION FILTER
xxxx storage.googleapis.com/xxxx (empty filter)
```
## Problematic code
/google/cloud/logging/sink.py Line 157
`self.filter_ = data['filter']`
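A minimal sketch of why that line fails and how a `dict.get` fallback avoids it (the `resource` dict below mirrors the API response shown above, where `filter` is simply absent):

```python
# The backend omits 'filter' when the sink has none, so indexing raises
# KeyError while dict.get returns None instead.
resource = {'name': 'xxxx', 'destination': 'storage.googleapis.com/xxxx'}

filter_ = resource.get('filter')       # None instead of a KeyError
destination = resource['destination']  # required field, so indexing is fine

print(filter_, destination)  # None storage.googleapis.com/xxxx
```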
## Traceback
```
Traceback (most recent call last):
File "/[...]/python3.5/site-packages/xxxx", line 50, in _get_sink
sink.reload()
File "[...]/python3.5/site-packages/google/cloud/logging/sink.py", line 157, in reload
self.filter_ = data['filter']
KeyError: 'filter'
```
| @jceresini Thanks for the report! I think maybe the `filter` argument used to be required on the back-end, and so we never saw this case. | 2018-01-04T20:44:24Z | [] | [] |
Traceback (most recent call last):
File "/[...]/python3.5/site-packages/xxxx", line 50, in _get_sink
sink.reload()
File "[...]/python3.5/site-packages/google/cloud/logging/sink.py", line 157, in reload
self.filter_ = data['filter']
KeyError: 'filter'
| 6,118 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-4749 | f1266b25da0129210c8a16328211d97321af3d11 | diff --git a/spanner/google/cloud/spanner_v1/database.py b/spanner/google/cloud/spanner_v1/database.py
--- a/spanner/google/cloud/spanner_v1/database.py
+++ b/spanner/google/cloud/spanner_v1/database.py
@@ -245,6 +245,9 @@ def update_ddl(self, ddl_statements):
See
https://cloud.google.com/spanner/reference/rpc/google.spanner.admin.database.v1#google.spanner.admin.database.v1.DatabaseAdmin.UpdateDatabase
+ :type ddl_statements: Sequence[str]
+ :param ddl_statements: a list of DDL statements to use on this database
+
:rtype: :class:`google.api_core.operation.Operation`
:returns: an operation instance
:raises NotFound: if the database does not exist
| SPANNER Python API Client - Using database.update_ddl to create/update/drop TABLES
I tried to execute a simple DDL script in Python (3.6) to create a table. But I don't know whether update_ddl is meant only for manipulating the database schema, or whether it will also let me manipulate database objects like TABLES, INDEXES, etc. Running regular SQL (inserting, updating, deleting, querying) all worked like a charm.
Here is a simple example of DDL to create TABLES:
//
def create_tables(instance_id, database_id):
spanner_client = spanner.Client()
instance = spanner_client.instance(instance_id)
database = instance.database(database_id)
# instance = spanner.Client().instance(instance_id)
operation = database.update_ddl(database_id, ddl_statements=(
"""CREATE TABLE junk (
junk_id INT64 NOT NULL,
name STRING(MAX)
) PRIMARY KEY (junk_id)"""
)
status = database.create()
print('Waiting for operation to complete...')
status.result()
//
**ERROR**:
Traceback (most recent call last):
File "spannerPython.py", line 77, in <module>
if __name__ == "__main__":main()
File "spannerPython.py", line 71, in main
create_tables(instance_id, database_id)
File "spannerPython.py", line 17, in create_tables
) PRIMARY KEY (junk_id)"""
TypeError: update_ddl() got multiple values for argument 'ddl_statements'
This is the current environment information I have installed:
**OS**:
System Version: macOS 10.13.2 (17C205)
Kernel Version: Darwin 17.3.0
**Python**
Version: 3.6.3
**google-cloud-spanner**
Version: 0.30.0
Summary: Python Client for Cloud Spanner
Home-page: https://github.com/GoogleCloudPlatform/google-cloud-python
Author: Google Cloud Platform
Author-email: googleapis-publisher@google.com
License: Apache 2.0
Location: /Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages
Requires: google-api-core, requests, google-cloud-core, grpc-google-iam-v1, google-auth
**pip freeze google**
gapic-google-cloud-vision-v1==0.90.3
google-api-core==0.1.4
google-api-python-client==1.6.4
google-auth==1.3.0
google-auth-httplib2==0.0.3
google-cloud-core==0.28.0
google-cloud-spanner==0.30.0
google-cloud-storage==1.6.0
google-cloud-vision==0.29.0
google-gax==0.15.16
google-resumable-media==0.3.1
googleapis-common-protos==1.5.3
| Hello, Could you try putting the ddl statements in a list? I'm afk right
now so I can't test it. But it might be worth a try.
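For illustration, a simplified stand-in reproduces the reported TypeError and shows the working call shape (the real method takes the statement sequence as its first argument; the helper below is not the actual library code):

```python
def update_ddl(ddl_statements):
    """Simplified stand-in for Database.update_ddl (self omitted)."""
    return list(ddl_statements)


try:
    # Passing the database id positionally *and* ddl_statements by keyword
    # is what triggers "got multiple values for argument 'ddl_statements'".
    update_ddl('my-database', ddl_statements=['CREATE TABLE ...'])
except TypeError as exc:
    print(exc)  # update_ddl() got multiple values for argument 'ddl_statements'

print(update_ddl(['CREATE TABLE ...']))  # the working call shape
```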
Fixed the problem, thanks...
//
def create_database(instance_id, database_id):
"""Creates a database and tables for sample data."""
spanner_client = spanner.Client()
instance = spanner_client.instance(instance_id)
database = instance.database(database_id)
operation = database.update_ddl([
"""CREATE TABLE junk (
junk_id INT64 NOT NULL,
name STRING(144),
) PRIMARY KEY(junk_id)"""])
print('Waiting for operation to complete...')
operation.result()
print('Created tables on database {} - instance {}'.format(
database_id, instance_id))
//
Do we need to better document the parameter types here:
https://github.com/GoogleCloudPlatform/google-cloud-python/blob/master/spanner/google/cloud/spanner_v1/database.py#L240
| 2018-01-12T18:49:52Z | [] | [] |
Traceback (most recent call last):
File "spannerPython.py", line 77, in <module>
if __name__ == "__main__":main()
File "spannerPython.py", line 71, in main
create_tables(instance_id, database_id)
File "spannerPython.py", line 17, in create_tables
) PRIMARY KEY (junk_id)"""
TypeError: update_ddl() got multiple values for argument 'ddl_statements'
| 6,124 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-480 | 626adca707bf10f5e236247a8dca7ce219a542ec | diff --git a/gcloud/datastore/__init__.py b/gcloud/datastore/__init__.py
--- a/gcloud/datastore/__init__.py
+++ b/gcloud/datastore/__init__.py
@@ -146,18 +146,6 @@ def _require_dataset():
return _implicit_environ.DATASET
-def get_entity(key):
- """Retrieves entity from implicit dataset, along with its attributes.
-
- :type key: :class:`gcloud.datastore.key.Key`
- :param key: The name of the item to retrieve.
-
- :rtype: :class:`gcloud.datastore.entity.Entity` or ``None``
- :returns: The requested entity, or ``None`` if there was no match found.
- """
- return _require_dataset().get_entity(key)
-
-
def get_entities(keys):
"""Retrieves entities from implied dataset, along with their attributes.
diff --git a/gcloud/datastore/dataset.py b/gcloud/datastore/dataset.py
--- a/gcloud/datastore/dataset.py
+++ b/gcloud/datastore/dataset.py
@@ -72,21 +72,6 @@ def id(self):
return self._id
- def get_entity(self, key):
- """Retrieves entity from the dataset, along with its attributes.
-
- :type key: :class:`gcloud.datastore.key.Key`
- :param key: The key of the entity to be retrieved.
-
- :rtype: :class:`gcloud.datastore.entity.Entity` or `NoneType`
- :returns: The requested entity, or ``None`` if there was no
- match found.
- """
- entities = self.get_entities([key])
-
- if entities:
- return entities[0]
-
def get_entities(self, keys, missing=None, deferred=None):
"""Retrieves entities from the dataset, along with their attributes.
diff --git a/gcloud/datastore/entity.py b/gcloud/datastore/entity.py
--- a/gcloud/datastore/entity.py
+++ b/gcloud/datastore/entity.py
@@ -213,8 +213,8 @@ def reload(self):
exist only locally.
"""
key = self._must_key
- dataset = self._must_dataset
- entity = dataset.get_entity(key.to_protobuf())
+ connection = self._must_dataset.connection()
+ entity = key.get(connection=connection)
if entity:
self.update(entity)
diff --git a/gcloud/datastore/key.py b/gcloud/datastore/key.py
--- a/gcloud/datastore/key.py
+++ b/gcloud/datastore/key.py
@@ -227,6 +227,33 @@ def to_protobuf(self):
return key
+ def get(self, connection=None):
+        """Retrieve entity corresponding to the current key.
+
+ :type connection: :class:`gcloud.datastore.connection.Connection`
+        :param connection: Optional connection used to connect to datastore.
+
+ :rtype: :class:`gcloud.datastore.entity.Entity` or `NoneType`
+ :returns: The requested entity, or ``None`` if there was no
+ match found.
+ :raises: `ValueError` if the current key is partial.
+ """
+ # Temporary import hack until Dataset is removed in #477.
+ from gcloud.datastore.dataset import Dataset
+
+ if self.is_partial:
+ raise ValueError('Can only retrieve complete keys.')
+
+ connection = connection or _implicit_environ.CONNECTION
+ dataset = Dataset(self.dataset_id, connection=connection)
+ entities = dataset.get_entities([self])
+
+ if entities:
+ result = entities[0]
+ # We assume that the backend has not changed the key.
+ result.key(self)
+ return result
+
@property
def is_partial(self):
"""Boolean indicating if the key has an ID (or name).
diff --git a/regression/datastore.py b/regression/datastore.py
--- a/regression/datastore.py
+++ b/regression/datastore.py
@@ -29,6 +29,7 @@
DATASET_ID = os.getenv('GCLOUD_TESTS_DATASET_ID')
datastore.set_default_dataset(dataset_id=DATASET_ID)
+datastore.set_default_connection()
class TestDatastore(unittest2.TestCase):
@@ -97,11 +98,9 @@ def _generic_test_post(self, name=None, key_id=None):
self.assertEqual(entity.key().name, name)
if key_id is not None:
self.assertEqual(entity.key().id, key_id)
- retrieved_entity = datastore.get_entity(entity.key())
+ retrieved_entity = entity.key().get()
# Check the keys are the same.
- self.assertEqual(retrieved_entity.key().path, entity.key().path)
- self.assertEqual(retrieved_entity.key().namespace,
- entity.key().namespace)
+ self.assertTrue(retrieved_entity.key() is entity.key())
# Check the data is the same.
retrieved_dict = dict(retrieved_entity.items())
@@ -352,13 +351,13 @@ def test_transaction(self):
entity['url'] = u'www.google.com'
with Transaction():
- retrieved_entity = datastore.get_entity(key)
+ retrieved_entity = key.get()
if retrieved_entity is None:
entity.save()
self.case_entities_to_delete.append(entity)
# This will always return after the transaction.
- retrieved_entity = datastore.get_entity(key)
+ retrieved_entity = key.get()
retrieved_dict = dict(retrieved_entity.items())
entity_dict = dict(entity.items())
self.assertEqual(retrieved_dict, entity_dict)
| Entity.reload does not work
We should make sure to fix the tests as well as the [code](https://github.com/GoogleCloudPlatform/gcloud-python/blob/b343acf4ade6dfc659f71a981eb51b62d22e33a0/gcloud/datastore/entity.py#L215):
``` python
key = self._must_key
dataset = self._must_dataset
entity = dataset.get_entity(key.to_protobuf())
```
(the `key` should be re-loaded, not the key protobuf).
Example failure:
``` python
>>> entity.reload()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "gcloud/datastore/entity.py", line 215, in reload
entity = dataset.get_entity(key.to_protobuf())
File "gcloud/datastore/dataset.py", line 139, in get_entity
key = Key.from_path(*key_or_path)
TypeError: from_path() argument after * must be a sequence, not Key
```
| 2015-01-02T20:37:50Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "gcloud/datastore/entity.py", line 215, in reload
entity = dataset.get_entity(key.to_protobuf())
File "gcloud/datastore/dataset.py", line 139, in get_entity
key = Key.from_path(*key_or_path)
TypeError: from_path() argument after * must be a sequence, not Key
| 6,128 |
||||
googleapis/google-cloud-python | googleapis__google-cloud-python-4819 | a2b64a8f4bf44062d2af30c95c48eebd4eb7671c | diff --git a/runtimeconfig/google/cloud/runtimeconfig/variable.py b/runtimeconfig/google/cloud/runtimeconfig/variable.py
--- a/runtimeconfig/google/cloud/runtimeconfig/variable.py
+++ b/runtimeconfig/google/cloud/runtimeconfig/variable.py
@@ -36,8 +36,11 @@
"""
import base64
+import datetime
-from google.cloud._helpers import _rfc3339_to_datetime
+import pytz
+
+from google.api_core import datetime_helpers
from google.cloud.exceptions import NotFound
from google.cloud.runtimeconfig._helpers import variable_name_from_full_name
@@ -151,13 +154,28 @@ def update_time(self):
See
https://cloud.google.com/deployment-manager/runtime-configurator/reference/rest/v1beta1/projects.configs.variables
- :rtype: :class:`datetime.datetime` or ``NoneType``
- :returns: Datetime object parsed from RFC3339 valid timestamp, or
- ``None`` if the property is not set locally.
+ Returns:
+ :class:`~api_core.datetime_helpers.DatetimeWithNanoseconds`,
+ :class:`datetime.datetime` or ``NoneType``:
+ Datetime object parsed from RFC3339 valid timestamp, or
+ ``None`` if the property is not set locally.
+
+ Raises:
+ ValueError: if value is not a valid RFC3339 timestamp
"""
value = self._properties.get('updateTime')
if value is not None:
- value = _rfc3339_to_datetime(value)
+ try:
+ value = datetime.datetime.strptime(
+ value, datetime_helpers._RFC3339_MICROS)
+ except ValueError:
+ DatetimeNS = datetime_helpers.DatetimeWithNanoseconds
+ value = DatetimeNS.from_rfc3339(value)
+ naive = (
+ value.tzinfo is None
+ or value.tzinfo.utcoffset(value) is None)
+ if naive:
+ value = pytz.utc.localize(value)
return value
def _require_client(self, client):
| Runtimeconfig: Update time fails to parse
OS: Linux (Debian 4.9.65)
Python: 3.5.3 / 2.7.13
Version: google-cloud-runtimeconfig==0.28.0, google-cloud==0.32.0
Steps to reproduce:
1. Using gcloud, set a runtimeconfig variable.
2. In Python, load that variable
3. Try to access its update_time property.
```pycon
>>> from google.cloud import runtimeconfig
>>> client = runtimeconfig.Client()
>>> config = client.config('testconfig')
>>> var = config.get_variable('foo')
>>> var.update_time
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/redacted/lib/python3.5/site-packages/google/cloud/runtimeconfig/variable.py", line 160, in update_time
value = _rfc3339_to_datetime(value)
File "/redacted/lib/python3.5/site-packages/google/cloud/_helpers.py", line 274, in _rfc3339_to_datetime
dt_str, _RFC3339_MICROS).replace(tzinfo=UTC)
File "/usr/lib/python3.5/_strptime.py", line 510, in _strptime_datetime
tt, fraction = _strptime(data_string, format)
File "/usr/lib/python3.5/_strptime.py", line 343, in _strptime
(data_string, format))
ValueError: time data '2018-01-22T21:39:44.095040522Z' does not match format '%Y-%m-%dT%H:%M:%S.%fZ'
>>> var._properties
{'text': '43', 'updateTime': '2018-01-22T21:39:44.095040522Z'}
```
Observation: The `%f` format accepts microseconds, not fractions of arbitrary precision. That means six digits at most, but I see nine in the data.
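A hedged sketch of one possible workaround, assuming the only problem is the extra fractional digits: truncate the fraction to six digits before handing the value to `strptime`. This discards sub-microsecond precision, and the helper name is made up:

```python
import datetime
import re


def parse_rfc3339_ns(stamp):
    """Parse an RFC 3339 timestamp, truncating nanoseconds to microseconds."""
    # Keep at most 6 fractional digits so the '%f' directive can handle it.
    trimmed = re.sub(r'\.(\d{6})\d*Z$', r'.\1Z', stamp)
    return datetime.datetime.strptime(trimmed, '%Y-%m-%dT%H:%M:%S.%fZ')


print(parse_rfc3339_ns('2018-01-22T21:39:44.095040522Z'))
# 2018-01-22 21:39:44.095040
```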
| Interesting. RuntimeConfig must have started recording nanoseconds somewhat recently.
@jonparrott It seems [Python only supports microsecond precision, not nanoseconds](https://stackoverflow.com/a/10612166/101923).
I'm guessing we should modify the parsing to discard nanoseconds when present. | 2018-01-31T23:35:03Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/redacted/lib/python3.5/site-packages/google/cloud/runtimeconfig/variable.py", line 160, in update_time
value = _rfc3339_to_datetime(value)
File "/redacted/lib/python3.5/site-packages/google/cloud/_helpers.py", line 274, in _rfc3339_to_datetime
dt_str, _RFC3339_MICROS).replace(tzinfo=UTC)
File "/usr/lib/python3.5/_strptime.py", line 510, in _strptime_datetime
tt, fraction = _strptime(data_string, format)
File "/usr/lib/python3.5/_strptime.py", line 343, in _strptime
(data_string, format))
ValueError: time data '2018-01-22T21:39:44.095040522Z' does not match format '%Y-%m-%dT%H:%M:%S.%fZ'
| 6,130 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-5097 | 838144c71233b0f0b1f24c1dde6d799f9b7e6617 | diff --git a/pubsub/google/cloud/pubsub_v1/subscriber/_consumer.py b/pubsub/google/cloud/pubsub_v1/subscriber/_consumer.py
--- a/pubsub/google/cloud/pubsub_v1/subscriber/_consumer.py
+++ b/pubsub/google/cloud/pubsub_v1/subscriber/_consumer.py
@@ -210,6 +210,12 @@ def __iter__(self):
# RPC is still active, keep waiting for queue items.
continue
+ # A call to consumer.close() signaled us to stop generating
+ # requests.
+ if item == _helper_threads.STOP:
+ _LOGGER.debug('Cleanly exiting request generator.')
+ return
+
if self._should_exit():
# We have an item, but the RPC is closed. We should put the
# item back on the queue so that the next RPC can consume it.
@@ -416,6 +422,7 @@ def _stop_no_join(self):
self.resume() # Make sure we aren't paused.
self._stopped.set()
_LOGGER.debug('Stopping helper thread %s', self._consumer_thread.name)
+ # Signal the request generator RPC to exit cleanly.
self.send_request(_helper_threads.STOP)
thread = self._consumer_thread
self._consumer_thread = None
| PubSub - Error message on subscription.close()
Python version: 2.7.14
API Client version: 0.33.0
While closing a subscription, the following message is printed to stderr but no exception is thrown:
```
ERROR:root:Exception serializing message!
Traceback (most recent call last):
File "/miniconda/lib/python2.7/site-packages/grpc/_common.py", line 87, in _transform
return transformer(message)
TypeError: descriptor 'SerializeToString' requires a 'google.protobuf.pyext._message.CMessage' object but received a 'UUID'
```
Recreated by creating a topic+subscription and running the following script:
```
import google.cloud.pubsub
sub_client = google.cloud.pubsub.SubscriberClient()
sub_name = sub_client.subscription_path('[PROJECT]', '[SUBSCRIPTION]')
sub = sub_client.subscribe(sub_name)
sub.open(lambda x: x.ack())
sub.close()
```
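The fix in the patch above boils down to recognizing a stop sentinel inside the request generator instead of letting it reach gRPC serialization. A minimal, self-contained sketch of that pattern (names here are illustrative; the real client uses a UUID object as its sentinel):

```python
import queue

STOP = object()  # sentinel; never meant to be serialized as a request


def request_generator(q):
    """Yield queued requests until the STOP sentinel is seen."""
    while True:
        item = q.get()
        if item is STOP:
            return  # exit cleanly instead of yielding the sentinel
        yield item


q = queue.Queue()
q.put('request-1')
q.put(STOP)
print(list(request_generator(q)))  # ['request-1']
```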
| Hrm, I haven't been able to reproduce this. Does this happen consistently for you?
@jonparrott Yes, both with a PubSub emulator and non-emulator, but only with version 0.33.0. When we roll back to 0.32.1 the problem disappears.
Ah okay, I got it reproduced. Will investigate. | 2018-03-22T18:29:26Z | [] | [] |
Traceback (most recent call last):
File "/miniconda/lib/python2.7/site-packages/grpc/_common.py", line 87, in _transform
return transformer(message)
TypeError: descriptor 'SerializeToString' requires a 'google.protobuf.pyext._message.CMessage' object but received a 'UUID'
| 6,159 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-5181 | c17c4e280821e7d1d9f4478d491fef2dc9a6a429 | diff --git a/storage/google/cloud/storage/blob.py b/storage/google/cloud/storage/blob.py
--- a/storage/google/cloud/storage/blob.py
+++ b/storage/google/cloud/storage/blob.py
@@ -1183,8 +1183,13 @@ def create_resumable_upload_session(
def get_iam_policy(self, client=None):
"""Retrieve the IAM policy for the object.
- See
- https://cloud.google.com/storage/docs/json_api/v1/objects/getIamPolicy
+ .. note:
+
+ Blob- / object-level IAM support does not yet exist and methods
+ currently call an internal ACL backend not providing any utility
+ beyond the blob's :attr:`acl` at this time. The API may be enhanced
+ in the future and is currently undocumented. Use :attr:`acl` for
+ managing object access control.
If :attr:`user_project` is set on the bucket, bills the API request
to that project.
@@ -1215,8 +1220,13 @@ def get_iam_policy(self, client=None):
def set_iam_policy(self, policy, client=None):
"""Update the IAM policy for the bucket.
- See
- https://cloud.google.com/storage/docs/json_api/v1/objects/setIamPolicy
+ .. note:
+
+ Blob- / object-level IAM support does not yet exist and methods
+ currently call an internal ACL backend not providing any utility
+ beyond the blob's :attr:`acl` at this time. The API may be enhanced
+ in the future and is currently undocumented. Use :attr:`acl` for
+ managing object access control.
If :attr:`user_project` is set on the bucket, bills the API request
to that project.
@@ -1253,8 +1263,13 @@ def set_iam_policy(self, policy, client=None):
def test_iam_permissions(self, permissions, client=None):
"""API call: test permissions
- See
- https://cloud.google.com/storage/docs/json_api/v1/objects/testIamPermissions
+ .. note:
+
+ Blob- / object-level IAM support does not yet exist and methods
+ currently call an internal ACL backend not providing any utility
+ beyond the blob's :attr:`acl` at this time. The API may be enhanced
+ in the future and is currently undocumented. Use :attr:`acl` for
+ managing object access control.
If :attr:`user_project` is set on the bucket, bills the API request
to that project.
| Blob IAM support apparently invalid
While writing system tests for the `requester_pays` feature, I triggered the following:
```python
Traceback (most recent call last):
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/storage/tests/system.py", line 372, in test_blob_acl_iam_w_user_project
blob.set_iam_policy(policy)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/storage/google/cloud/storage/blob.py", line 1148, in set_iam_policy
_target_object=None)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/storage/.nox/sys-3-6/lib/python3.6/site-packages/google/cloud/_http.py", line 293, in api_request
raise exceptions.from_http_response(response)
google.api.core.exceptions.BadRequest: 400 PUT https://www.googleapis.com/storage/v1/b/new_1506705016579/o/SmallFile/iam?userProject=citric-celerity-697: roles/storage.objectViewer is not a valid role for projects/_/buckets/new_1506705016579/objects/SmallFile#0.
```
Note that the earlier call to `Blob.get_iam_policy` does succeed, returning:
```python
(Pdb) pp policy.to_api_repr()
{'bindings': [{'members': ['projectEditor:some-project-742',
'projectOwner:some-project-742',
'serviceAccount:1065521786570-19reuv03qbdp37du41inh9gtd1s35g1j@developer.gserviceaccount.com'],
'role': 'roles/storage.legacyObjectOwner'},
{'members': ['projectViewer:some-project-742'],
'role': 'roles/storage.legacyObjectReader'},
{'members': ['allUsers'], 'role': 'roles/storage.objectViewer'}],
'etag': 'CAM='}
```
We don't have existing system tests for `Blob.set_iam_policy`, but the [Storage IAM docs](https://cloud.google.com/storage/docs/access-control/using-iam-permissions) don't define any IAM operations for blobs, only for buckets and projects. Indeed, they say:
> To learn about controlling access to individual objects in your buckets, see [Access Control Lists](https://cloud.google.com/storage/docs/access-control/lists).
The [API documentation for Objects](https://cloud.google.com/storage/docs/json_api/v1/objects) also doesn't (any longer?) show `getIamPolicy`, `setIamPolicy`, or `testIamPermissions`.
@lukesneeringer can you loop somebody in to clarify?
| @frankyn confirmed to me offline:
> There is no Object level IAM only Object Level ACL.
It turns out that `Blob.set_iam_policy` can only assign the following "legacy" roles:
- `roles/storage.legacyObjectReader`
- `roles/storage.legacyObjectOwner`
IAM object-level methods should not be deprecated. We should document how to use these methods to assign legacy roles instead.
@frankyn ISTM that the back-end docs need to be updated correspondingly (and first, so that we track them without guesswork).
Good point. I'm working on resolving this in Storage docs. I'll update this issue.
@frankyn ping?
I'm out of office until Monday! I'll pick back up then. Apologies for the
delay.
On Jan 4, 2018 11:37 AM, "Tres Seaver" <notifications@github.com> wrote:
> @frankyn <https://github.com/frankyn> ping?
>
> —
> You are receiving this because you were mentioned.
> Reply to this email directly, view it on GitHub
> <https://github.com/GoogleCloudPlatform/google-cloud-python/issues/4087#issuecomment-355362540>,
> or mute the thread
> <https://github.com/notifications/unsubscribe-auth/AB0bwLK3FIgA9YM7FBKKDqZqVkh518DDks5tHRpVgaJpZM4PpDhF>
> .
>
@tseaver docs were blocked by an internal bug and I will check-in again this week. Apologies for the delay.
@frankyn I still don't see API documentation for `objects.getIamPermissions` et. al.
Will update soon. I need to sync with a few stakeholders on the GCS team before continuing there exist additional open questions before introducing the docs back into the world. Setting up meeting for mid-next week given some are OOO. I will have a better update then. Thank you for patience.
Sent a document for GCS team review to determine consensus for what to do here.
@frankyn ping?
Hey @tseaver, responding through hangouts. | 2018-04-11T17:43:40Z | [] | [] |
Traceback (most recent call last):
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/storage/tests/system.py", line 372, in test_blob_acl_iam_w_user_project
blob.set_iam_policy(policy)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/storage/google/cloud/storage/blob.py", line 1148, in set_iam_policy
_target_object=None)
File "/home/tseaver/projects/agendaless/Google/src/google-cloud-python/storage/.nox/sys-3-6/lib/python3.6/site-packages/google/cloud/_http.py", line 293, in api_request
raise exceptions.from_http_response(response)
google.api.core.exceptions.BadRequest: 400 PUT https://www.googleapis.com/storage/v1/b/new_1506705016579/o/SmallFile/iam?userProject=citric-celerity-697: roles/storage.objectViewer is not a valid role for projects/_/buckets/new_1506705016579/objects/SmallFile#0.
| 6,163 |
|||
googleapis/google-cloud-python | googleapis__google-cloud-python-5295 | 4c5e421adc147c816c2f919d8c08822b6a129050 | diff --git a/api_core/nox.py b/api_core/nox.py
--- a/api_core/nox.py
+++ b/api_core/nox.py
@@ -52,7 +52,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/bigquery/nox.py b/bigquery/nox.py
--- a/bigquery/nox.py
+++ b/bigquery/nox.py
@@ -65,7 +65,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/bigquery_datatransfer/nox.py b/bigquery_datatransfer/nox.py
--- a/bigquery_datatransfer/nox.py
+++ b/bigquery_datatransfer/nox.py
@@ -53,7 +53,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/bigtable/nox.py b/bigtable/nox.py
--- a/bigtable/nox.py
+++ b/bigtable/nox.py
@@ -53,7 +53,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/container/nox.py b/container/nox.py
--- a/container/nox.py
+++ b/container/nox.py
@@ -53,7 +53,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/core/nox.py b/core/nox.py
--- a/core/nox.py
+++ b/core/nox.py
@@ -58,7 +58,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/dataproc/nox.py b/dataproc/nox.py
--- a/dataproc/nox.py
+++ b/dataproc/nox.py
@@ -53,7 +53,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/datastore/nox.py b/datastore/nox.py
--- a/datastore/nox.py
+++ b/datastore/nox.py
@@ -54,7 +54,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/dlp/nox.py b/dlp/nox.py
--- a/dlp/nox.py
+++ b/dlp/nox.py
@@ -50,7 +50,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/dns/nox.py b/dns/nox.py
--- a/dns/nox.py
+++ b/dns/nox.py
@@ -53,7 +53,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/error_reporting/nox.py b/error_reporting/nox.py
--- a/error_reporting/nox.py
+++ b/error_reporting/nox.py
@@ -54,7 +54,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/firestore/nox.py b/firestore/nox.py
--- a/firestore/nox.py
+++ b/firestore/nox.py
@@ -56,7 +56,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/language/nox.py b/language/nox.py
--- a/language/nox.py
+++ b/language/nox.py
@@ -53,7 +53,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/logging/nox.py b/logging/nox.py
--- a/logging/nox.py
+++ b/logging/nox.py
@@ -72,7 +72,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/monitoring/nox.py b/monitoring/nox.py
--- a/monitoring/nox.py
+++ b/monitoring/nox.py
@@ -58,7 +58,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/oslogin/nox.py b/oslogin/nox.py
--- a/oslogin/nox.py
+++ b/oslogin/nox.py
@@ -24,7 +24,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/pubsub/google/cloud/pubsub_v1/subscriber/scheduler.py b/pubsub/google/cloud/pubsub_v1/subscriber/scheduler.py
--- a/pubsub/google/cloud/pubsub_v1/subscriber/scheduler.py
+++ b/pubsub/google/cloud/pubsub_v1/subscriber/scheduler.py
@@ -118,5 +118,9 @@ def shutdown(self):
# Drop all pending item from the executor. Without this, the executor
# will block until all pending items are complete, which is
# undesirable.
- self._executor._work_queue.queue.clear()
+ try:
+ while True:
+ self._executor._work_queue.get(block=False)
+ except queue.Empty:
+ pass
self._executor.shutdown()
diff --git a/pubsub/nox.py b/pubsub/nox.py
--- a/pubsub/nox.py
+++ b/pubsub/nox.py
@@ -53,7 +53,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/resource_manager/nox.py b/resource_manager/nox.py
--- a/resource_manager/nox.py
+++ b/resource_manager/nox.py
@@ -53,7 +53,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/runtimeconfig/nox.py b/runtimeconfig/nox.py
--- a/runtimeconfig/nox.py
+++ b/runtimeconfig/nox.py
@@ -53,7 +53,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/speech/nox.py b/speech/nox.py
--- a/speech/nox.py
+++ b/speech/nox.py
@@ -51,7 +51,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/storage/nox.py b/storage/nox.py
--- a/storage/nox.py
+++ b/storage/nox.py
@@ -54,7 +54,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/texttospeech/nox.py b/texttospeech/nox.py
--- a/texttospeech/nox.py
+++ b/texttospeech/nox.py
@@ -24,7 +24,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/trace/nox.py b/trace/nox.py
--- a/trace/nox.py
+++ b/trace/nox.py
@@ -53,7 +53,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/translate/nox.py b/translate/nox.py
--- a/translate/nox.py
+++ b/translate/nox.py
@@ -48,7 +48,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/videointelligence/nox.py b/videointelligence/nox.py
--- a/videointelligence/nox.py
+++ b/videointelligence/nox.py
@@ -35,7 +35,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/vision/nox.py b/vision/nox.py
--- a/vision/nox.py
+++ b/vision/nox.py
@@ -53,7 +53,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
diff --git a/websecurityscanner/nox.py b/websecurityscanner/nox.py
--- a/websecurityscanner/nox.py
+++ b/websecurityscanner/nox.py
@@ -51,7 +51,7 @@ def default(session):
@nox.session
-@nox.parametrize('py', ['2.7', '3.4', '3.5', '3.6'])
+@nox.parametrize('py', ['2.7', '3.5', '3.6', '3.7'])
def unit(session, py):
"""Run the unit test suite."""
| Scheduler needs to shut down in a way compatible with both the SimpleQueue and Queue types used for the worker queue.
Currently one test is failing for the scheduler. The issue is that Python 3.7 changed the executor's `_work_queue` to be a SimpleQueue.
Traceback (most recent call last):
File "/Users/crwilcox/workspace/google-cloud-python/pubsub/tests/unit/pubsub_v1/subscriber/test_scheduler.py", line 51, in test_schedule
scheduler_.shutdown()
File "/Users/crwilcox/workspace/google-cloud-python/pubsub/google/cloud/pubsub_v1/subscriber/scheduler.py", line 121, in shutdown
self._executor._work_queue.
AttributeError: '_queue.SimpleQueue' object has no attribute 'queue'
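The failure above comes from Python 3.7's `ThreadPoolExecutor` keeping its pending work in a `queue.SimpleQueue`, which has no `.queue` deque to clear. A portable way to drop pending items, as in the patch above, is to drain through the public `get()` API instead of touching internals (a minimal standalone sketch; `drain` is an illustrative helper name):

```python
import queue


def drain(q):
    """Remove every pending item from *q* without touching internals.

    Clearing ``q.queue`` only works for ``queue.Queue``; draining via
    ``get(block=False)`` also works for ``queue.SimpleQueue`` (3.7+).
    """
    try:
        while True:
            q.get(block=False)
    except queue.Empty:
        pass


work_queue = queue.Queue()
for item in ("a", "b", "c"):
    work_queue.put(item)
drain(work_queue)
print(work_queue.empty())  # True
```

The same `drain` call works unchanged on a `queue.SimpleQueue`, which is what makes the patched `shutdown()` compatible with both queue types.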
| 2018-05-03T20:21:28Z | [] | [] |
Traceback (most recent call last):
File "/Users/crwilcox/workspace/google-cloud-python/pubsub/tests/unit/pubsub_v1/subscriber/test_scheduler.py", line 51, in test_schedule
scheduler_.shutdown()
File "/Users/crwilcox/workspace/google-cloud-python/pubsub/google/cloud/pubsub_v1/subscriber/scheduler.py", line 121, in shutdown
self._executor._work_queue.
AttributeError: '_queue.SimpleQueue' object has no attribute 'queue'
| 6,171 |
||||
googleapis/google-cloud-python | googleapis__google-cloud-python-5374 | f79717decde5c1aec3e42e69f8bc1f0512797a73 | diff --git a/pubsub/google/cloud/pubsub_v1/subscriber/_protocol/streaming_pull_manager.py b/pubsub/google/cloud/pubsub_v1/subscriber/_protocol/streaming_pull_manager.py
--- a/pubsub/google/cloud/pubsub_v1/subscriber/_protocol/streaming_pull_manager.py
+++ b/pubsub/google/cloud/pubsub_v1/subscriber/_protocol/streaming_pull_manager.py
@@ -241,22 +241,26 @@ def open(self, callback):
self._callback = functools.partial(_wrap_callback_errors, callback)
- # Start the thread to pass the requests.
- self._dispatcher = dispatcher.Dispatcher(self, self._scheduler.queue)
- self._dispatcher.start()
-
- # Start consuming messages.
+ # Create the RPC
self._rpc = bidi.ResumableBidiRpc(
start_rpc=self._client.api.streaming_pull,
initial_request=self._get_initial_request,
should_recover=self._should_recover)
self._rpc.add_done_callback(self._on_rpc_done)
+
+ # Create references to threads
+ self._dispatcher = dispatcher.Dispatcher(self, self._scheduler.queue)
self._consumer = bidi.BackgroundConsumer(
self._rpc, self._on_response)
+ self._leaser = leaser.Leaser(self)
+
+ # Start the thread to pass the requests.
+ self._dispatcher.start()
+
+ # Start consuming messages.
self._consumer.start()
# Start the lease maintainer thread.
- self._leaser = leaser.Leaser(self)
self._leaser.start()
def close(self, reason=None):
| Exception in subscribe_experimental - google cloud pubsub 0.34.0
Hi, getting an exception in 0.34.0:
```
[2018-05-13 12:30:14,744 - ERROR - Thread-17 - _channel] Exception iterating requests!
Traceback (most recent call last):
File "/Users/alonshomrat/Library/Python/2.7/lib/python/site-packages/grpc/_channel.py", line 187, in consume_request_iterator
request = next(request_iterator)
File "/Users/alonshomrat/Library/Python/2.7/lib/python/site-packages/google/cloud/pubsub_v1/subscriber/_protocol/bidi.py", line 100, in __iter__
yield self._initial_request()
File "/Users/alonshomrat/Library/Python/2.7/lib/python/site-packages/google/cloud/pubsub_v1/subscriber/_protocol/streaming_pull_manager.py", line 313, in _get_initial_request
lease_ids = self._leaser.ack_ids
AttributeError: 'NoneType' object has no attribute 'ack_ids'
```
I think it's the CleanupThread.
Created a script to reproduce:
1. Create topic
2. Insert message to it.
3. Create subscription.
4. Start subscribe.
5. Get exception (CleanupThread in _channel -> grpc package).
```
# Create a topic
# Insert data to the topic
# Create subscription and connect it to the topic
# Start subscriber
# Get exception
from google.cloud.pubsub_v1 import PublisherClient, SubscriberClient
publisher_obj = PublisherClient()
subscriber_obj = SubscriberClient()
topics_path = [u'projects/your-project-name/topics/topic_general']
subscribers_path = [u'projects/your-project-name/subscriptions/instance_general']
def create_topics():
for topic in topics_path:
try:
publisher_obj.create_topic(topic)
except Exception as e:
print("Topic {}: {}. Notice: don't forget to run pubsub emulator before".format(topic, e.message))
def publish_messages(data, message_type):
response = publisher_obj.publish(topics_path[0], data=data, type=message_type)
return response
def create_subscriptions():
subscriptions = []
for subscribe, topic in zip(subscribers_path, topics_path):
subscriptions.append(subscriber_obj.create_subscription(subscribe, topic))
return subscriptions
def start_subscribe():
subscription_path = SubscriberClient().subscription_path("your-project-name", "instance_general")
_ = SubscriberClient().subscribe_experimental(subscription_path, callback=lambda x: x)
def pubsub_error():
create_topics()
a = publish_messages(data='{"key3": "value3", "key2": "value2", "key1": "value1"}', message_type="/webhook/webhook_type")
create_subscriptions()
start_subscribe()
if __name__ == "__main__":
pubsub_error()
```
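The traceback points at an initialization-order hazard: the consumer thread is started before `self._leaser` is assigned, so the first outgoing request can observe `None`. The patch above fixes this by creating every collaborator before starting any thread. A minimal sketch of that pattern (class and attribute names here are illustrative, not the real pub/sub internals):

```python
import threading


class Leaser:
    ack_ids = ()


class Manager:
    def __init__(self):
        self._leaser = None
        self.initial_request = None

    def _consume(self):
        # The background thread dereferences the leaser right away; if
        # open() started it before creating the leaser, this line would
        # raise AttributeError: 'NoneType' object has no attribute 'ack_ids'.
        self.initial_request = self._leaser.ack_ids

    def open(self):
        # Fix: build all references first, then start the threads.
        self._leaser = Leaser()
        consumer = threading.Thread(target=self._consume)
        consumer.start()
        consumer.join()


manager = Manager()
manager.open()
print(manager.initial_request)  # ()
```

Ordering construction before `start()` is the whole fix: once no thread can run until every attribute it reads exists, the `NoneType` race disappears.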
Rest of the information:
```
$ pip show google-cloud
Name: google-cloud
Version: 0.32.0
Summary: API Client library for Google Cloud
Home-page: https://github.com/GoogleCloudPlatform/google-cloud-python
Author: Google Cloud Platform
Author-email: googleapis-publisher@google.com
License: Apache 2.0
Location: /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages
Requires: google-cloud-videointelligence, google-cloud-trace, google-cloud-bigquery-datatransfer, google-cloud-bigquery, google-cloud-language, google-cloud-datastore, google-cloud-core, google-cloud-storage, google-cloud-vision, google-cloud-logging, google-cloud-pubsub, google-cloud-container, google-cloud-monitoring, google-cloud-runtimeconfig, google-cloud-error-reporting, google-cloud-dns, google-cloud-firestore, google-api-core, google-cloud-resource-manager, google-cloud-translate, google-cloud-spanner, google-cloud-speech, google-cloud-bigtable
Required-by:
$ pip freeze
analytics==0.6.5
appengine==1.8.0.2
asn1crypto==0.22.0
astroid==1.5.2
backports-abc==0.5
backports.functools-lru-cache==1.3
beautifulsoup4==4.6.0
bitstring==3.1.5
boto==2.48.0
bumpversion==0.5.3
cachetools==2.0.1
certifi==2018.1.18
cffi==1.10.0
chardet==3.0.4
click==6.7
cloud==2.8.5
cognitive-luis==0.1
crypto==1.4.1
cryptography==2.0.3
cycler==0.10.0
dill==0.2.7.1
enum==0.4.6
enum34==1.1.6
expiringdict==1.1.3
ez-setup==0.9
fakeredis==0.9.0
Flask==0.12.2
funcsigs==1.0.2
future==0.16.0
futures==3.2.0
gapic-google-cloud-datastore-v1==0.15.3
gapic-google-cloud-error-reporting-v1beta1==0.15.3
gapic-google-cloud-functions-v1beta2==0.15.3
gapic-google-cloud-language-v1==0.15.3
gapic-google-cloud-logging-v2==0.91.3
gapic-google-cloud-pubsub-v1==0.15.4
gapic-google-logging-v2==0.9.3
gapic-google-pubsub-v1==0.11.1
gax-google-logging-v2==0.8.3
gax-google-pubsub-v1==0.8.3
gcloud==0.18.3
gcs-oauth2-boto-plugin==1.14
google==2.0.1
google-api-core==0.1.4
google-api-python-client==1.6.5
google-api-wrapper==2.0.0a1
google-auth==1.4.1
google-auth-httplib2==0.0.3
google-cloud==0.32.0
google-cloud-bigquery==0.28.0
google-cloud-bigquery-datatransfer==0.1.1
google-cloud-bigtable==0.28.1
google-cloud-container==0.1.0
google-cloud-core==0.28.1
google-cloud-datastore==1.4.0
google-cloud-dns==0.28.0
google-cloud-error-reporting==0.28.0
google-cloud-firestore==0.28.0
google-cloud-language==1.0.1
google-cloud-logging==1.4.0
google-cloud-monitoring==0.28.0
google-cloud-pubsub==0.34.0
google-cloud-resource-manager==0.28.0
google-cloud-runtimeconfig==0.28.1
google-cloud-spanner==0.29.0
google-cloud-speech==0.30.0
google-cloud-storage==1.6.0
google-cloud-trace==0.17.0
google-cloud-translate==1.3.0
google-cloud-videointelligence==1.0.0
google-cloud-vision==0.29.0
google-gax==0.15.16
google-resumable-media==0.3.1
googleapis-common-protos==1.5.3
googlemaps==2.5.1
grpc-google-iam-v1==0.11.4
grpc-google-logging-v2==0.8.1
grpc-google-pubsub-v1==0.11.1
grpcio==1.9.1
gunicorn==19.7.1
httplib2==0.10.3
icalendar==3.11.4
idna==2.6
intervaltree==2.1.0
ipaddress==1.0.18
isort==4.2.5
itsdangerous==0.24
Jinja2==2.9.6
keyring==8.7
keyrings.alt==1.3
lazy-object-proxy==1.3.1
luis==2.0.2
lxml==3.7.2
MarkupSafe==1.0
matplotlib==1.5.3
mixpanel==4.3.2
mixpanel-py-async==0.1.0
mock==2.0.0
Naked==0.1.31
numpy==1.13.1
nydus==0.11.0
oauth2client==3.0.0
olefile==0.44
parsedatetime==2.4
pbr==3.1.1
pep8==1.7.0
Pillow==4.0.0
ply==3.8
proto-google-cloud-datastore-v1==0.90.4
proto-google-cloud-error-reporting-v1beta1==0.15.3
proto-google-cloud-functions-v1beta2==0.15.3
proto-google-cloud-language-v1==0.15.3
proto-google-cloud-logging-v2==0.91.3
proto-google-cloud-pubsub-v1==0.15.4
protobuf==3.5.1
psutil==5.4.3
pubsub==0.1.2
py==1.4.34
pyasn1==0.4.2
pyasn1-modules==0.2.1
pycodestyle==2.3.1
pycparser==2.18
pycrypto==2.6.1
pycurl==7.43.0
pyicloud==0.9.1
Pympler==0.5
pyOpenSSL==17.3.0
pyparsing==2.2.0
pytest==3.1.3
python-dateutil==1.5
python-memcached==1.59
python-telegram-bot==6.1b0
pytz==2018.3
PyYAML==3.12
redis==2.10.5
requests==2.18.4
retry-decorator==1.1.0
rsa==3.4.2
shellescape==3.4.1
singledispatch==3.4.0.3
six==1.10.0
SocksiPy-branch==1.1
sortedcontainers==1.5.7
titlecase==0.10.0
tornado==4.5.1
tornadohttpclient==1.1.3
twx.botapi==3.1.1
tzlocal==1.4
uritemplate==3.0.0
urllib3==1.22
urllib3-mock==0.3.3
virtualenv==15.1.0
webapp2==2.3
WebOb==1.7.3
Werkzeug==0.14.1
wrapt==1.10.10
$ python --version
Python 2.7.13
```
| 2018-05-23T17:56:09Z | [] | [] |
Traceback (most recent call last):
File "/Users/alonshomrat/Library/Python/2.7/lib/python/site-packages/grpc/_channel.py", line 187, in consume_request_iterator
request = next(request_iterator)
File "/Users/alonshomrat/Library/Python/2.7/lib/python/site-packages/google/cloud/pubsub_v1/subscriber/_protocol/bidi.py", line 100, in __iter__
yield self._initial_request()
File "/Users/alonshomrat/Library/Python/2.7/lib/python/site-packages/google/cloud/pubsub_v1/subscriber/_protocol/streaming_pull_manager.py", line 313, in _get_initial_request
lease_ids = self._leaser.ack_ids
AttributeError: 'NoneType' object has no attribute 'ack_ids'
| 6,177 |
||||
googleapis/google-cloud-python | googleapis__google-cloud-python-5826 | d6fe5e17840c891455e1454bd1b43eef2190118b | diff --git a/pubsub/google/cloud/pubsub_v1/_gapic.py b/pubsub/google/cloud/pubsub_v1/_gapic.py
--- a/pubsub/google/cloud/pubsub_v1/_gapic.py
+++ b/pubsub/google/cloud/pubsub_v1/_gapic.py
@@ -32,6 +32,8 @@ def wrap(wrapped_fx):
# Similarly, for instance methods, we need to send self.api rather
# than self, since that is where the actual methods were declared.
instance_method = True
+
+ # If this is a bound method it's a classmethod.
self = getattr(wrapped_fx, '__self__', None)
if issubclass(type(self), type):
instance_method = False
@@ -41,8 +43,9 @@ def wrap(wrapped_fx):
if instance_method:
fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw) # noqa
return functools.wraps(wrapped_fx)(fx)
- fx = lambda self, *a, **kw: wrapped_fx(*a, **kw) # noqa
- return functools.wraps(wrapped_fx)(fx)
+
+ fx = lambda *a, **kw: wrapped_fx(*a, **kw) # noqa
+ return staticmethod(functools.wraps(wrapped_fx)(fx))
def actual_decorator(cls):
# Reflectively iterate over most of the methods on the source class
| PubSub: Got unbound method error when calling the class method from_service_account_json
I am using the service account for authentication. By following the example in [here](https://googlecloudplatform.github.io/google-cloud-python/latest/core/auth.html#service-accounts)
, I create the following code:
```
from google.cloud import pubsub
logging.debug(dir(pubsub.PublisherClient))
pubsub_publisher_client = pubsub.PublisherClient.from_service_account_json(key_path)
```
The output:
```
DEBUG:root:['__class__', '__delattr__', '__dict__', '__doc__', '__format__', '__getattribute__', '__hash__', '__init__', '__module__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', '__weakref__', 'batch', 'create_topic', 'delete_topic', 'from_service_account_file', 'from_service_account_json', 'get_iam_policy', 'get_topic', 'list_topic_subscriptions', 'list_topics', 'project_path', 'publish', 'set_iam_policy', 'target', 'test_iam_permissions', 'topic_path', 'update_topic']
Traceback (most recent call last):
File "main.py", line 81, in <module>
pubsub_publisher_client = pubsub.PublisherClient.from_service_account_json(key_path)
TypeError: unbound method from_service_account_file() must be called with PublisherClient instance as first argument (got str instance instead)
```
However, when I use similar code to create datastore client with the service account JSON file, it works. E.g.
```
from google.cloud import datastore
datastore_client = datastore.Client.from_service_account_json(key_path)
```
After checking the source code, it seems the decorator in `pubsub/google/cloud/pubsub_v1/_gapic.py` fails to create the class method properly.
| Weird, @lukesneeringer @andreamlin can you take a look?
The bug is almost certainly in [`google.cloud.pubsub_v1._gapic.add_methods`](https://github.com/GoogleCloudPlatform/google-cloud-python/blob/75de6a53c9feee1bc37ee4bfd1369b3010403194/pubsub/google/cloud/pubsub_v1/_gapic.py#L35), which is not detecting that `from_service_account_file` (aliased to `from_service_account_json`) is a classmethod. | 2018-08-21T18:25:49Z | [] | [] |
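The fix in the patch above distinguishes bound classmethods (whose `__self__` is the class itself) from plain functions, and re-exposes the former as staticmethods so the wrapper class can call them without an instance. A self-contained sketch of that detection and wrapping; `Api`, `Client`, and the method names are invented for illustration, not the real pub/sub API:

```python
import functools


class Api:
    @classmethod
    def from_config(cls, cfg):
        return "%s:%s" % (cls.__name__, cfg)

    def publish(self, msg):
        return "published " + msg


def wrap(fx):
    if isinstance(getattr(fx, "__self__", None), type):
        # A classmethod looked up on its class is bound, and __self__
        # is the class itself. Forward calls unchanged and expose the
        # wrapper as a staticmethod so Python never injects `self`.
        forward = lambda *a, **kw: fx(*a, **kw)  # noqa: E731
        return staticmethod(functools.wraps(fx)(forward))
    # Plain function: dispatch to the wrapped API instance.
    forward = lambda self, *a, **kw: fx(self.api, *a, **kw)  # noqa: E731
    return functools.wraps(fx)(forward)


def add_methods(source, names):
    def decorate(cls):
        for name in names:
            setattr(cls, name, wrap(getattr(source, name)))
        return cls
    return decorate


@add_methods(Api, ("from_config", "publish"))
class Client:
    def __init__(self):
        self.api = Api()


print(Client.from_config("key.json"))  # Api:key.json -- no instance needed
print(Client().publish("hi"))          # published hi
```

Without the `staticmethod` wrapping, calling `Client.from_config(...)` on the class would fail exactly as in the report, because Python would try to bind the wrapper as an instance method.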
Traceback (most recent call last):
File "main.py", line 81, in <module>
pubsub_publisher_client = pubsub.PublisherClient.from_service_account_json(key_path)
TypeError: unbound method from_service_account_file() must be called with PublisherClient instance as first argument (got str instance instead)
| 6,233 |