This sample Python function demonstrates how to submit a Monitoring Query Language (MQL) query to the OCI Monitoring service, extract a collection of historical metrics, and push the results to Object Storage as a JSON file.
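As a rough sketch of what such a query involves, the snippet below assembles an MQL expression and a UTC time window using only the standard library. The function name, defaults, and returned dict shape are illustrative, not the actual code in [func.py](./func.py); the real function hands equivalent values to the Monitoring service via the OCI Python SDK.

```python
from datetime import datetime, timedelta, timezone

def build_metrics_query(metric="CpuUtilization", namespace="oci_computeagent",
                        interval="1m", statistic="mean", hours_back=1):
    """Assemble an MQL expression and UTC time window of the kind this
    sample submits to the Monitoring service (defaults are illustrative)."""
    end_time = datetime.now(timezone.utc).replace(microsecond=0)
    start_time = end_time - timedelta(hours=hours_back)
    # MQL expression grammar: metric[interval].statistic()
    return {
        "namespace": namespace,
        "query": f"{metric}[{interval}].{statistic}()",
        "start_time": start_time.isoformat(),
        "end_time": end_time.isoformat(),
    }

print(build_metrics_query()["query"])  # CpuUtilization[1m].mean()
```

With the defaults above, this yields the per-minute mean CPU utilization over the preceding hour, matching the sample's default behaviour.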
This sample is intended to help you get up and running quickly by using default values for most parameters. Only one parameter is required: **compartmentId**, which specifies the compartment in which to create the Object Storage bucket (if it does not already exist). Supply values for the optional parameters to select target resources, choose specific metrics to collect, define time ranges, and so on, per your requirements. Advanced users may also replace the export function in the code to push metrics to a downstream application API.
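The defaults-plus-overrides behaviour described above can be sketched as follows. The parameter names and default values here are hypothetical stand-ins (except `metrics-export`, the documented default bucket name); the real logic lives in [func.py](./func.py).

```python
import json

# Hypothetical defaults mirroring the behaviour described above;
# the authoritative names and values are in func.py.
DEFAULTS = {
    "bucketName": "metrics-export",
    "metric": "CpuUtilization",
    "intervalMinutes": 1,
    "rangeHours": 1,
}

def resolve_params(body: bytes) -> dict:
    """Overlay the caller's JSON payload on the defaults,
    requiring only compartmentId."""
    supplied = json.loads(body) if body else {}
    if "compartmentId" not in supplied:
        raise ValueError("compartmentId is the only required parameter")
    return {**DEFAULTS, **supplied}

print(resolve_params(b'{"compartmentId": "ocid1.compartment.oc1..aaaaaxxxxx"}'))
```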
The full list of *optional* parameters and their default values is presented below in the **Deploy** and **Test** sections. But first, let's cover the prerequisites and the function development environment setup.
As you make your way through this tutorial, look out for the action icon. Whenever you see it, it's time for you to perform an action.
## Prerequisites
Before you deploy this sample function, make sure you have run steps A, B, and C of the [Oracle Functions Quick Start Guide for Cloud Shell](https://www.oracle.com/webfolder/technetwork/tutorials/infographics/oci_functions_cloudshell_quickview/functions_quickview_top/functions_quickview/index.html):
* A - Set up your tenancy
* B - Create application
* C - Set up your Cloud Shell dev environment
## List Applications
Assuming you have successfully completed the prerequisites, you should see your application in the list of applications:
```
fn ls apps
```
## Create or Update your Dynamic Group
In order to use other OCI Services, your function must be part of a dynamic group. For information on how to create a dynamic group, refer to the [documentation](https://docs.cloud.oracle.com/iaas/Content/Identity/Tasks/managingdynamicgroups.htm).

When specifying the *Matching Rules*, we suggest matching all functions in a compartment with:
```
ALL {resource.type = 'fnfunc', resource.compartment.id = 'ocid1.compartment.oc1..aaaaaxxxxx'}
```
## Create or Update IAM Policies
Create a new policy that allows the dynamic group to read metrics and manage object storage.
Your policy should look something like this:
```
Allow dynamic group <group-name> to read metrics in compartment <compartment-name>
Allow dynamic group <group-name> to manage buckets in compartment <compartment-name>
Allow dynamic group <group-name> to manage objects in compartment <compartment-name>
```
For more information on how to create policies, go [here](https://docs.cloud.oracle.com/iaas/Content/Identity/Concepts/policysyntax.htm).
## Review and customize the function
Review the following files in the current folder:
* the code of the function, [func.py](./func.py)
* its dependencies, [requirements.txt](./requirements.txt)
* the function metadata, [func.yaml](./func.yaml)
## Deploy the function
In Cloud Shell, run the `fn deploy` command to build *this* function and its dependencies as a Docker image, push the image to the specified Docker registry, and deploy *this* function to Oracle Functions in the application created earlier:
```
fn -v deploy --app <app-name>
```
For example,
```
fn -v deploy --app myapp
```
### Test
You may test the function with default values by invoking it with only the mandatory parameter (compartmentId). This creates an Object Storage bucket named metrics-export (if it does not already exist) and uploads a JSON file containing per-minute average CPU utilization over the preceding hour for all VMs in the specified compartment.

There are several optional parameters for customizing the query and time range. In addition to compartmentId, include any combination of them in the JSON payload.
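As a sketch of that payload format: it is a plain JSON object, built here in Python. The OCID value is a placeholder, and any optional parameter names would come from the list above.

```python
import json

# Only compartmentId is required; the value below is a placeholder OCID.
payload = {"compartmentId": "ocid1.compartment.oc1..aaaaaxxxxx"}

# Optional parameters (names per the list above) are added as extra keys.

print(json.dumps(payload))
```

In Cloud Shell, the resulting JSON can be piped to the function, e.g. `echo '{"compartmentId": "ocid1.compartment.oc1..aaaaaxxxxx"}' | fn invoke <app-name> <function-name>`, where `<app-name>` and `<function-name>` are placeholders for your deployment.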