The following examples demonstrate how easy it is to get up and running in the production environment using the Sitelink3D REST API. Each section below lists the service APIs used to accomplish certain tasks and illustrates how these services can be combined to implement different pieces of functionality. By the end of these short scenarios, we will have created a site, connected a machine to it using Site Discovery, and published machine positions to the cloud to simulate the machine working.
The following code snippets are written such that they can be copied, pasted and run "as is" against the production system. Each snippet is designed to run after the previous. As such, setup is only performed once and variables may be used in subsequent snippets. All code snippets can be copied into a single Python file and executed together. They appear separated here to cleanly segregate the purpose of each block and to highlight the simplicity of achieving certain tasks in Sitelink3D.
For context and convenience, the code snippets provided here, along with the supporting Python library files, are available for download as an archive. Extracting this archive to your local machine provides all the code and supporting files required to run the examples. Feel free to jump straight to the first example of creating a site, or read on for a little more detail before you begin coding.
The examples use the following files:
make_site.py: creates a site for you to use. It uses the settings in settings.json, and the resulting site details are stored in a file called site.json. See the "Creating a Site" section below.
extract_site_file.js: creates a site.json file based on an existing site. See "Using an Existing Site" below.
mfk_send_replicates.py: pushes machine data to the cloud.
mfk_live_reader.py: reads and displays this data.
file_example.py: demonstrates the file uploading, downloading and listing features of the API.
In order to work with Sitelink3D programmatically, developers first need two pieces of data provided here. The first is a UUID identifying your Legal Entity. Legal Entities are defined in the Swagger documentation, but for now let's think of this as our organization identifier; it's what enables Sitelink3D to categorize data hierarchically in the cloud and provide billing, reporting and auditing functionality for your company. For the purpose of these examples, however, the Legal Entity is a shared resource, which means that anyone can see the sites and data that you produce, and vice versa. The following Legal Entity is provided as a sandbox in the production environment.
3cc1d2ca-ca6a-11e7-ba27-a44cc8cb5ed0
Secondly, Sitelink3D API calls require a security token in the form of a JWT (JSON Web Token). These tokens allow precise control over resources and are fundamental to how Sitelink3D achieves seamless B2B operation and authentication-free usage between organizations, business units and other domains in a secure manner. Furthermore, you can create your own Sitelink3D tokens with any level of granularity for the resources you own, providing precise control over delegation of any asset owned by you and your sites. Security within Sitelink3D is discussed in detail elsewhere; for now it is sufficient to say that the following token allows unrestricted activity for the production Legal Entity defined above. It is referenced in the code snippets below.
eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.eyJhY3QiOnsiKiI6WyIqIl19LCJleHAiOjMxMDM0MDM4NDYsImlhdCI6MTUyNjYwMzU0NiwiaXNzIjoiM2NjMWQyY2EtY2E2YS0xMWU3LWJhMjctYTQ0Y2M4Y2I1ZWQwIiwianRpIjoiNzIwYTZkY2ItYzBkNC00NjdiLTllOWQtZTBkYjNlODVjNGU5Iiwic2NwIjoiKiIsInN1YiI6InVybjpYLXRvcGNvbjpsZTozY2MxZDJjYS1jYTZhLTExZTctYmEyNy1hNDRjYzhjYjVlZDAifQ.fyEz93kyhkpmzHbcKkbe8MGT2VT9NJ978xtis4hoWSttzXXtRWPZ5PGD_VwrKumWifp4UTrPzXHe3_MJTwb8bkOneQBlSkKz4BhvGeNo_240EV79lB7oaHaRo7E_uCr6h10Bv5wKcLKNmCzSEtW3hpXDj1vHqEC5a5f43Dwd8x9UUB_oBKxVpd5PqtlxXDY_duUu0eVX_FC0u6FEokRoaXyxh00smfDIshOUUkVF8OQ3gEd6jqK4fNNcmK2THel54OXj5H67-Nhd0Uxs-EuXTS2wyxBSt4vKt7Uhq0Heu8GPYfpkMuwUPp3mqStbSPRM0PN30NiMwuTLc-BccuMgkQ
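Although verifying a JWT's signature requires the issuer's public key, its claims are just base64url-encoded JSON and can be inspected directly. The following sketch is standard JWT handling rather than anything Sitelink3D-specific; it prints the claims of the token above, including the expiry (exp), issuer (iss) and scope (scp):
import base64
import json

token = "<paste the full JWT from above>"

def decode_jwt_claims(jwt):
    # a JWT is three base64url segments: header.payload.signature;
    # decode only the payload (claims) and skip signature verification
    payload_b64 = jwt.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

print json.dumps(decode_jwt_claims(token), indent=4)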
The file make_site.py will create a site for you to use. It uses the settings in settings.json; the resulting site details are stored in a file called site.json in the current directory.
Creating a site requires an owner_uuid. If settings.json does not already contain one, a new owner_uuid will be created and saved back into settings.json. Any further sites you create will be owned by this UUID. If you later wish to create a new owner_uuid, simply edit the settings.json file and remove the corresponding entry.
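A minimal sketch of that behaviour, assuming settings.json is a flat JSON object with an owner_uuid key (the archive's actual layout may differ):
import json
import os
import uuid

settings_json = {}
if os.path.exists("settings.json"):
    with open("settings.json") as f:
        settings_json = json.load(f)

# create and persist an owner_uuid on first run, as described above;
# delete this key from settings.json to start over with a new owner
if "owner_uuid" not in settings_json:
    settings_json["owner_uuid"] = str(uuid.uuid1())
    with open("settings.json", "w") as f:
        json.dump(settings_json, f, indent=4)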
Let's walk through the steps to programmatically create a site in the cloud that will be capable of accepting data.
Here we import some packages, define a session and specify our JWT security token that we'll use subsequently.
Note that in our sample code we have put the settings in settings.json to make things a bit less cluttered.
import json
import requests
import uuid
import os
root_path = os.path.dirname(os.path.realpath(__file__))
security_token = "eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.eyJhY3QiOnsiKiI6WyIqIl19LCJleHAiOjMwOTIxNjQ4MzcsImlhdCI6MTUxNTM2NDUzNywiaXNzIjoiM2NjMWQyY2EtY2E2YS0xMWU3LWJhMjctYTQ0Y2M4Y2I1ZWQwIiwianRpIjoiNTVhNzJhMzUtZDE0NS00OWJkLTk4NmItMDJiY2IzYmIyNDlhIiwic2NwIjoiKiIsInN1YiI6InVybjpYLXRvcGNvbjpsZTozY2MxZDJjYS1jYTZhLTExZTctYmEyNy1hNDRjYzhjYjVlZDAifQ.DFg3bKQ6JTHj3TLIVcHo6KPGE7KSTYP459pXvIZapES75oLDkixyj5T17gGVjQ1_TLKIIz-OHVrrDvPYS5KZWE6mLFsNhH7DxRTQ-PTerj-7VomdErxF3td5eXK7VvSWozEJRzfnoBIiHTYUs6smzgNgkUgNH-xUnOmm5FlK7EGoiUIBQQnb6MKA8vgyoH-IK4T2OAoCkcCFvz6mWnShoet5iaeM24w5Y3k8VLy9ZO1Q5YD2MqZi9ekdOXtNAVznJwKzx7bsIevfKABO2he545CWcUaYzFKGRwndT9i0Z8C2BnOvXLQ8IPN6_G55EG_9Mfp4UuVcmmrFm_5exkOmPQ"
server = "api.code.topcon.com"
session = requests.Session()
# verify TLS connections against the CA bundle shipped in the example archive
session.verify = os.path.join(root_path, "fixtures", "gd_bundle-g2-g1.crt")
Next, we use the Site Owner micro service to create a site owner. An owner typically represents a business unit within an organization (Legal Entity). For the purpose of this example, we could consider the owner to be a team. We'll need this owner in order to create a site.
Note that the payload refers to the Legal Entity UUID that was discussed in the Before You Start section. We cache the UUID representing our owner as we'll need to refer to it later.
owner_uuid = uuid.uuid1()
payload_json = {
"legal_entity_uuid": "3cc1d2ca-ca6a-11e7-ba27-a44cc8cb5ed0",
"owner_uuid": str(owner_uuid),
"owner_email": "test@email.com"
}
header_json = {
"X-Topcon-Auth" : security_token,
"Content-Type" : "application/json",
}
response = session.post("https://{}/siteowner/v1/owners".format(server), headers=header_json, data=json.dumps(payload_json))
response.raise_for_status()
print "create owner returned {0}\n{1}".format(response.status_code, json.dumps(response.json(),indent=4))
Lastly, we create a payload representing the site and post it to the Site Owner micro service. We now have an owner and a site in the production system.
payload_json = {
"site_uuid": str(uuid.uuid1()),
"name": "my construction site",
"dc": "us",
"region": "medium"
}
response = session.post("https://{0}/siteowner/v1/owners/{1}/create_site".format(server,owner_uuid), headers=header_json, data=json.dumps(payload_json))
site_details_json = response.json() # we will need this information for other examples.
print "create site returned {0}\n{1}".format(response.status_code, json.dumps(site_details_json,indent=4))
The script extract_site_file.js will create a site.json file based on an existing site. We intend to provide a simple widget to get these details from our web page; until that is in place, run extract_site_file.js and place its output in site.json. On Linux with nodejs installed: ./extract_site_file.js > site.json
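However site.json was produced, the remaining snippets only need the details it contains. If you are working against an existing site rather than the one created above, load the file in place of the create_site response (this sketch assumes site.json holds the same document returned by create_site, including the identifier field used throughout):
with open("site.json") as f:
    site_details_json = json.load(f)
print "using site {0}".format(site_details_json["identifier"])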
Now that we have created a site, we want clients to be able to connect to it. This is achieved with the Site Discovery micro service.
In essence, we simply associate metadata with the site such that clients can be alerted to its presence by their phone or other GPS-enabled device. To achieve this, we submit both descriptive information about the site and a definition of a geo-fenced region to the RDM service. To simulate "arriving at site", we then query the Site Discovery service for the sites available at a particular location. Note that although we are querying the Site Discovery service in our example application, no user input is required.
This demonstrates Sitelink3D's "zero barrier to work" design goal: an operator can be automatically issued a greeting and instructions simply by physically arriving at the geographical region we've defined in code. Importantly, we'll see in this demonstration that we can configure our site to be discovered by those external to our company if we so desire. This interoperability is one of Sitelink3D's key features: it allows any piece of software that also uses the Sitelink3D API to locate the site information we wish to expose, regardless of vendor, platform or technology.
The following code exemplifies the data used to greet Site Discovery clients when they arrive on location. The Site Discovery API allows software to present contact information, including a name, email and phone number, so that workers require no foreknowledge when showing up to work. Here we simply display the last site returned to us, but the code is easily modified to display all overlapping sites at a location and let the user choose which one is relevant for their work (see the sketch after the snippet below).
import time
import base64
marker_json = {"lat":-27.4699,"lon":153.0252}
payload_json = {
"_at" : int(round(time.time() * 1000)),
"_id" : "site",
"_rev" : str(uuid.uuid1()),
"_type": "sl::site",
"_v" : 3,
"job_code" : "code",
"marker" : marker_json,
"name" : "the ABC construction site",
"timezone" : "Australia/Brisbane",
"_extra" : {
"sl::site::site_discovery" : {
"contact" : {
"phone" : "1234567890",
"email" : "foreman@mysite.com",
"name" : "site foreman, John Smith"
},
"discoverable" : True
}
}
}
time.sleep(1)
# RDM events are posted as base64-encoded JSON documents
data_encoded_json = { "data_b64" : base64.b64encode(json.dumps(payload_json)) }
response = session.post("https://{0}/rdm_log/v1/site/{1}/domain/{2}/events".format(server,site_details_json["identifier"],"sitelink"), headers=header_json, data=json.dumps(data_encoded_json))
print "post RDM site information returned {0}\n{1}".format(response.status_code, json.dumps(response.json(),indent=4))
time.sleep(0.5)
# a closed polygon (first vertex repeated last) defining the discoverable area
geo_fence = [
[-27.9,152.9],
[-27.1,152.9],
[-27.1,154.0],
[-27.9,154.0],
[-27.9,152.9]
]
region_id = "DiscoveryRegion"
payload_json = {
"_at" : int(round(time.time() * 1000)),
"_id" : region_id,
"_rev" : str(uuid.uuid1()),
"_type" : "sl::region",
"_v" : 1,
"coordinate_system" : "wgs84",
"is_site_discovery" : True,
"name" : "my region",
"color" : "#172d39",
"opacity" : 50,
"vertices" : {
"storage_type" : "data",
"data" : geo_fence
}
}
data_encoded_json = { "data_b64" : base64.b64encode(json.dumps(payload_json)) }
response = session.post("https://{0}/rdm_log/v1/site/{1}/domain/{2}/events".format(server,site_details_json["identifier"],"sitelink"), headers=header_json, data=json.dumps(data_encoded_json))
print "post RDM region information returned {0}\n{1}".format(response.status_code, json.dumps(response.json(),indent=4))
time.sleep(0.5)
response = session.get("https://{0}/discovery/v1/refs?latitude={1}&longitude={2}".format(server,marker_json["lat"], marker_json["lon"]), headers=header_json)
print "get sites by location returned {0}\n{1}".format(response.status_code, json.dumps(response.json(),indent=4))
# tell the user that they've arrived at a site
site_discovery_json = response.json()[-1]
print "Welcome to {0}. Please call {1} on {2} or email {3} for site induction details.".format(site_discovery_json["name"], site_discovery_json["contact"]["name"], site_discovery_json["contact"]["phone"], site_discovery_json["contact"]["email"])
Connecting to a site amounts to nothing more than querying a site identifier along with a PIN in order to retrieve a JWT that can subsequently be used to post site data. That JWT provides permissions covering, for example, certain file, MFK, RDM and reporting operations.
The PIN is configured by posting an sl::auth_code object to RDM as follows.
import hashlib

site_pin = "123456"
auth_code_json = {
"_id": str(uuid.uuid1()),
"_at": int(round(time.time() * 1000)),
"_rev": str(uuid.uuid1()),
"_type": "sl::auth_code",
"name": "code-name",
"pin_sha256": hashlib.sha256(site_pin).hexdigest()
}
data_encoded_json = { "data_b64" : base64.b64encode(json.dumps(auth_code_json)) }
response = session.post("https://{0}/rdm_log/v1/site/{1}/domain/{2}/events".format(server,site_details_json["identifier"],"sitelink"), headers=header_json, data=json.dumps(data_encoded_json))
print "post RDM auth code information returned {0}\n{1}".format(response.status_code, json.dumps(response.json(),indent=4))
time.sleep(1)
response = session.get("https://{0}/discovery/v1/sites/byref/{1}?pin={2}&device_id=device_id".format(server,site_details_json["identifier"],site_pin))
print json.dumps(response.json(), indent=4)
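The response above carries the site-scoped JWT described at the start of this section. Its exact layout is not documented here, so the sketch below simply inspects the returned document; the "token" key name is an assumption you should verify against the real response:
site_by_ref_json = response.json()
print "site-by-ref keys: {0}".format(", ".join(sorted(site_by_ref_json.keys())))

site_token = site_by_ref_json.get("token")  # key name is an assumption
if site_token:
    # a site-scoped token is used exactly like the master token above
    site_header_json = { "X-Topcon-Auth" : site_token, "Content-Type" : "application/json" }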
With our site identifier already provided by the Site Discovery service above, it is now trivial to start pushing information about our machine to the site. The information we can provide is potentially voluminous, but for the purpose of this example we will submit just enough data to enable a suitably authorized client to view our machine's shape, position and orientation. To achieve this, we open a WebSocket connection to the Data Logger service. It is important to construct the Data Logger channel path correctly.
Detailed information on the channel path can be found in the Data Logger micro service documentation. For now, it is sufficient to know that we will use the update stream for higher-frequency payloads and the context stream for less common traffic when building our channels. We'll also use the site identifier and the site region to construct the path, all of which is available to us thanks to the preceding steps. Note that the subsequent snippets use modules (ResourceConfiguration, Asset, AssetContext and the Replicate payloads) that ship in the downloadable archive.
To start, let's construct the Data Logger paths that we'll be sending data to. We also write the current site identifier to a file so that another script can access it and read the positions we'll be posting.
import websocket
import ssl

site_id = site_details_json["identifier"]

# persist the site identifier so that mfk_live_reader.py can find it
# (the file name is our convention here, not part of the API)
with open(os.path.join(root_path, "site_id.txt"), "w") as f:
    f.write(site_id)

data_logger_live_update_url = "wss://{0}:443/data_logger/v1/publish_live/us/update/{1}.medium".format(server,site_id)
data_logger_live_context_url = "wss://{0}:443/data_logger/v1/publish_live/us/context/{1}.medium".format(server,site_id)
print "connecting to web socket at {0}".format(data_logger_live_context_url)
ws_context = websocket.WebSocket(sslopt={"cert_reqs": ssl.CERT_NONE})
ws_context.connect(data_logger_live_context_url)
Next we construct and send a Resource Configuration message to the Data Logger to define what our machine looks like. For this demonstration, we'll use a generic excavator. It is important to send current timestamps to the Data Logger, as stale data in MFK "at" fields can result in machine data being effectively invisible to MFK Live subscribers.
# ResourceConfiguration ships in the downloadable archive
rc_json = ResourceConfiguration.rc_json
rc_data = json.loads(base64.b64decode(rc_json["data"]["data_part_b64"]))
rc_uuid = str(uuid.uuid4())
print "using new uuid {0}".format(rc_uuid)
rc_data["uuid"] = rc_uuid
rc_json["data"]["uuid"] = rc_uuid
rc_json_str = json.dumps(rc_data)
rc_json["data"]["data_part_b64"] = base64.b64encode(rc_json_str)
rc_json["data"]["total_size"] = len(rc_json_str)
time.sleep(5)
print "now ms : {0}".format(int(round(time.time() * 1000)))
rc_json["at"] = int(round(time.time() * 1000))
rc_payload = json.dumps(rc_json, indent=4)
ws_context.send(rc_payload)
time.sleep(5)
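The millisecond timestamp expression recurs in every payload. A tiny local helper (purely a convenience for your own code, not part of the API) keeps it readable; the remaining snippets retain the explicit expression so they still run verbatim:
def now_ms():
    # current wall-clock time in milliseconds, as expected by MFK "at" fields
    return int(round(time.time() * 1000))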
With our Resource Configuration sent to the Data Logger, we now construct and send an Asset payload. Normally, two of these are sent: one representing the physical machine (our particular excavator in this example) and one representing the HMI device running on that machine, which could be a GX-60 for example. In reality, we are simply sending public keys to Sitelink3D that represent these assets. These keys allow subsequent signed updates from our two assets to be identified and trusted.
Asset.mfk_asset_json["at"] = int(round(time.time() * 1000))
ws_context.send(json.dumps(Asset.mfk_asset_json))
time.sleep(5)
The preceding step referenced our ability to use Asset keys to facilitate trusted updates. In this step, we exploit this ability by sending what's called an Asset Context. Our Asset Context payload will reference the collection of Assets that we've already defined. This is the last piece of context required for us to ultimately produce machine position data. Position data will reference the Asset Context we will create here and in doing so, will associate it with the device and machine Assets collectively.
AssetContext.mfk_ac_json["at"] = int(round(time.time() * 1000))
ws_context.send(json.dumps(AssetContext.mfk_ac_json))
time.sleep(5)
Lastly, we will open a second WebSocket to the Data Logger service that we will use to send position information. As already mentioned, this will use the update stream and require the transmission of Replicate messages.
print "connecting to web socket at {0}".format(data_logger_live_update_url)
ws_update = websocket.WebSocket(sslopt={"cert_reqs": ssl.CERT_NONE})
ws_update.connect(data_logger_live_update_url)
Replicate_NW.mfk_replicate_json["data"]["rc_uuid"] = rc_uuid
Replicate_NE.mfk_replicate_json["data"]["rc_uuid"] = rc_uuid
Replicate_SE.mfk_replicate_json["data"]["rc_uuid"] = rc_uuid
Replicate_SW.mfk_replicate_json["data"]["rc_uuid"] = rc_uuid
while True:
    time.sleep(3)
    Replicate_NW.mfk_replicate_json["at"] = int(round(time.time() * 1000))
    ws_update.send(json.dumps(Replicate_NW.mfk_replicate_json))
    time.sleep(3)
    Replicate_NE.mfk_replicate_json["at"] = int(round(time.time() * 1000))
    ws_update.send(json.dumps(Replicate_NE.mfk_replicate_json))
    time.sleep(3)
    Replicate_SE.mfk_replicate_json["at"] = int(round(time.time() * 1000))
    ws_update.send(json.dumps(Replicate_SE.mfk_replicate_json))
    time.sleep(3)
    Replicate_SW.mfk_replicate_json["at"] = int(round(time.time() * 1000))
    ws_update.send(json.dumps(Replicate_SW.mfk_replicate_json))
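The loop above runs until the process is killed. If you prefer a clean shutdown, one option (a local convenience, not an API requirement) is to catch Ctrl+C and close both sockets:
try:
    while True:
        # ... the four replicate sends shown above ...
        time.sleep(3)
except KeyboardInterrupt:
    ws_update.close()
    ws_context.close()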
To complete this example, let's run through the minimum steps required to retrieve the location updates we started transmitting in the previous step. This data can be displayed in any manner desired, including in the cabin of another machine or on a web page in an office. We achieve this by reading from a WebSocket. This code is available in the file mfk_live_reader.py.
Connect a WebSocket to MFK Live for the site of interest and send authorization allowing us access to site data.
# when run standalone, mfk_live_reader.py recovers the site identifier written earlier
with open(os.path.join(root_path, "site_id.txt")) as f:
    site_id = f.read().strip()
mfk_live_url = "wss://{0}/mfk_live/v1/subscribe/{1}".format(server,site_id)
print "connecting to web socket at {0}".format(mfk_live_url)
ws = websocket.WebSocket(sslopt={"cert_reqs": ssl.CERT_NONE})
ws.connect(mfk_live_url)
print "sending authz"
authz = {
"X-Topcon-Auth": [security_token]
}
ws.send(json.dumps(authz))
Lastly, we seed our resource configuration and simply read the live machine data and process or display it as desired.
# mfk is a helper library from the downloadable archive; the first message
# received on the socket carries the resource configuration
rc = mfk.ResourceConfiguration(json.loads(ws.recv()))
while True:
    msg_json = json.loads(ws.recv())
    print json.dumps(msg_json,indent=4)
    rc.apply_manifest(msg_json['data']['manifest'])
The file file_example.py provides a composite demonstration of the file uploading, downloading and listing features of the Sitelink3D API. The individual functions are demonstrated in the following files:
make_folder.py
file_upload.py
file_listing.py
file_download.py