Upload point cloud

Info

With large point clouds, we don't recommend using the Upload 3D Models feature in Cognite Data Fusion (CDF), due to long upload times and potential file transfer failures.

The approach described here removes the need to compress individual point cloud files into a single zip file before uploading. You only need to specify the folder or directory containing the files.

Prerequisites

Install Python and the Cognite Python SDK in your environment before you get started. To download and install Python, see Download Python.

To install the Cognite Python SDK, use:

pip install cognite-sdk

If you have multiple Python versions installed, specify the version with:

py -<version> -m pip install cognite-sdk

Replace <version> with the specific Python version.
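
For example, to install the SDK for Python 3.11 (adjust the version number to match your installation):

py -3.11 -m pip install cognite-sdk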

Create and implement

Copy the code into a local Python file. The script creates a 3D model, uploads all the files from the specified directory, and generates a JSON file containing the IDs of the uploaded point cloud files. You can then use the model ID and the JSON file's file ID to create a new point cloud revision. The script automatically opens a browser pop-up for authentication.

NOTE: The upload process can take a significant amount of time. Don't interrupt it or shut down the computer until it's completed.

Use the Python script below to implement the solution.

from cognite.client import global_config, ClientConfig, CogniteClient
from cognite.client.credentials import OAuthInteractive
import json

# Note: Input the details for your environment
# ******************************************************

TENANT_ID = '<TENANT_ID>'
CLIENT_ID = '<CLIENT_ID>'
CDF_CLUSTER = '<CDF_CLUSTER>'
COGNITE_PROJECT = '<PROJECT NAME>'
MODEL_NAME = "<MODEL NAME>"
DIRECTORY_NAME = "<Specify the directory containing all point cloud files>"
AUTHORITY_URL = f"https://login.microsoftonline.com/{TENANT_ID}"

# ******************************************************

BASE_URL = f"https://{CDF_CLUSTER}.cognitedata.com"

# Interactive OAuth login with Microsoft Entra ID (opens a browser pop-up)
tokenProvider = OAuthInteractive(
    authority_url=AUTHORITY_URL,
    client_id=CLIENT_ID,
    scopes=[f'{BASE_URL}/.default']
)

clientConfig = ClientConfig(
    client_name="upload point cloud script",
    project=COGNITE_PROJECT,
    credentials=tokenProvider,
    base_url=BASE_URL,
    timeout=300,
)

global_config.default_client_config = clientConfig
client = CogniteClient()

# Create model
print("Create model ....")
modelIdKey = "CogniteModelId"
model = client.three_d.models.create(name=MODEL_NAME)

# A directory with point cloud files will be uploaded as individual files
print(f"Uploading files from {DIRECTORY_NAME} .... (Uploading is in progress; it can take hours)")
directory_upload_result = client.files.upload(DIRECTORY_NAME, overwrite=True, recursive=True, metadata={modelIdKey: model.id})
print(f"Number of files uploaded: {len(directory_upload_result)}")

# Create and upload file_id list json file
file_ids = [file.id for file in directory_upload_result]
file_content = {"dataType": "PointCloud", "fileIds": file_ids}
file_name = MODEL_NAME + ".json"
print(f"Uploading {file_name} ....")
with open(file_name, 'w') as outfile:
    json.dump(file_content, outfile)

file_upload_result = client.files.upload(file_name, overwrite=True, metadata={modelIdKey: model.id})

# Print the result
print("Summary:")
print(f"Model id : {model.id}")
print(f"File id : {file_upload_result.id}")
print(f"Model name: {MODEL_NAME}")
print(f"File name : {file_name}")