Updated full workflow inference code in docs
bmunday3 committed Oct 26, 2023
1 parent 473ea58 commit 657f3ed
Showing 2 changed files with 4 additions and 182 deletions.
7 changes: 4 additions & 3 deletions docs/docs/getting-started/full-workflow.md
@@ -132,16 +132,17 @@ Next, open a Python file (new or existing) and paste the following inference code:
```python
from chassis.client import OMIClient
from chassis.guides import DigitsSampleData

# Instantiate OMI Client connection to model running on localhost:45000
client = OMIClient("localhost", 45000)
# Call and view results of status RPC
status = await client.status()
print(f"Status: {status}")
# Submit inference with quickstart sample data
res = await client.run(DigitsSampleData)
# Parse results from output item
result = res.outputs[0].output["results.json"]
# View results
print(f"Result: {result}")
```

Execute this code to perform an inference against your running container.
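
The snippet above uses top-level `await`, which only runs as-is in an async-aware environment such as a Jupyter notebook or `python -m asyncio`. Below is a minimal sketch of the same calls as a standalone script, assuming `OMIClient` can be driven from a standard `asyncio` event loop and that `results.json` holds a JSON payload; both are assumptions, not something the diff confirms.

```python
import asyncio
import json

from chassis.client import OMIClient
from chassis.guides import DigitsSampleData


async def main():
    # Connect to the model container running on localhost:45000
    client = OMIClient("localhost", 45000)

    # Check that the model is ready before sending data
    status = await client.status()
    print(f"Status: {status}")

    # Submit the quickstart digits sample for inference
    res = await client.run(DigitsSampleData)
    result = res.outputs[0].output["results.json"]

    # Assumption: results.json contains a JSON document; pretty-print it
    print(json.dumps(json.loads(result), indent=2))


if __name__ == "__main__":
    asyncio.run(main())
```

Wrapping the calls in a single `async def main()` keeps the client, status check, and inference on one event loop, which mirrors how the notebook version executes them sequentially.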
179 changes: 0 additions & 179 deletions packages/chassisml/README.md

This file was deleted.
