Die Analysis

Now we will run a die analysis on the power envelopes we uploaded earlier.

As before, make sure you have the following environment variables set or added to a .env file:

GDSFACTORY_HUB_API_URL="https://{org}.gdsfactoryhub.com"
GDSFACTORY_HUB_QUERY_URL="https://query.{org}.gdsfactoryhub.com"
GDSFACTORY_HUB_KEY="<your-gdsfactoryplus-api-key>"
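
If you keep these variables in a .env file and they are not picked up automatically, you can load them explicitly before creating the client. A minimal sketch, assuming the python-dotenv package is installed:

from dotenv import load_dotenv

load_dotenv()  # copies the variables from the .env file into os.environ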
import getpass

from tqdm.notebook import tqdm

import gdsfactoryhub as gfh
from gdsfactoryhub import FunctionTargetModel
project_id = f"spirals-{getpass.getuser()}"  # same project the power envelopes were uploaded to
client = gfh.create_client_from_env(project_id=project_id)  # reads the GDSFACTORY_HUB_* environment variables
api = client.api()
query = client.query()
utils = client.utils()

Die Analysis

You can trigger a die analysis for the 300, 500, and 800 nm wide waveguides.

from gdsfactoryhub.functions.die import propagation_loss

propagation_loss.run?
die_pkeys = [d["pk"] for d in query.dies().execute().data]  # primary keys of all dies in the project
propagation_loss.run(die_pkey=die_pkeys[0])
{'output': {'propagation_loss': np.float64(0.08355123220684546),
  'insertion_loss': np.float64(-0.2923000835106616)},
 'summary_plot': <Figure size 640x480 with 1 Axes>,
 'die_pkey': '0030e48f-6697-4f6e-9f1d-a642f6dabc6c'}

[summary plot]
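
Since run returns a plain dictionary, you can read the metrics directly from it. A minimal sketch based on the output above (the key names follow the dict shown; check the function's docstring for units):

result = propagation_loss.run(die_pkey=die_pkeys[0])
prop_loss = result["output"]["propagation_loss"]  # per-die propagation loss
insertion_loss = result["output"]["insertion_loss"]
print(f"propagation loss: {prop_loss:.4f}, insertion loss: {insertion_loss:.4f}")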

Let's aggregate the data from the different dies to extract the die propagation loss. First we upload the analysis function to the hub; if it already exists there, the API reports a "Duplicate function" error, which gfh.suppress_api_error catches so the notebook keeps running.

with gfh.suppress_api_error():  # the function may already be uploaded; ignore the "Duplicate function" error
    result = api.upload_function(
        function_id="propagation-loss",
        target_model="die",
        file=gfh.get_module_path(propagation_loss),
        test_target_model_pk=die_pkeys[0],
        test_kwargs={},
    )
Duplicate function
results = []
for die_pk in (pb := tqdm(die_pkeys)):
    pb.set_postfix(die_pk=die_pk)
    result = api.start_analysis(  # start_analysis triggers the analysis task, but does not wait for it to finish.
        analysis_id=f"propagation-loss-{die_pk}",
        function_id="propagation-loss",
        target_model=FunctionTargetModel.DIE,
        target_model_pk=die_pk,
        kwargs={},
    )
    results.append(result)
  0%|          | 0/21 [00:00<?, ?it/s]

Let's wait for the analyses to complete and have a look at the last successful one:

analysis_pks = [r["pk"] for r in results]
utils.analyses().wait_for_completion(pks=analysis_pks)  # block until all analyses have finished
analyses = query.analyses().in_("pk", analysis_pks).execute().data
successful_analyses = [a for a in analyses if a["status"] == "COMPLETED"]
analysis = successful_analyses[-1]
img = api.download_plot(analysis["summary_plot"]["path"])
img.resize((530, 400))
Waiting for analyses:   0%|          | 0/21 [00:00<?, ?it/s]

[summary plot for the last successful die analysis]
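
To actually aggregate the per-die results, you can collect the propagation loss from each completed analysis and summarize it. This is a sketch; it assumes each completed analysis record exposes the function output under an "output" key with the same fields as the example result above:

import numpy as np

losses = [a["output"]["propagation_loss"] for a in successful_analyses]  # one value per die
print(f"propagation loss over {len(losses)} dies: {np.mean(losses):.4f} +/- {np.std(losses):.4f}")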
