
Commit f7f1ffb

Remove set-input, get-output (#77)
As discussed in #43, there is no requirement to set up tensors prior to calling `compute`, nor to retrieve them separately afterwards. As of #59, passing tensors around is cheap (they are now resources), so no data copy is necessary if we adopt this change. This change removes the `set-input` and `get-output` functions, moving all tensor-passing into `compute`. Closes #43.
1 parent 82e9d89 commit f7f1ffb
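
For illustration (not part of this commit), here is a minimal, self-contained Rust sketch of the new calling convention. All names below are stand-ins for wit-bindgen-generated bindings, not actual wasi-nn APIs:

    // Stand-in types; real bindings would provide the actual resources.
    struct Tensor;                       // stands in for the `tensor` resource
    #[derive(Debug)]
    struct Error;                        // stands in for the `error` resource
    type NamedTensor = (String, Tensor); // `named-tensor` = tuple<string, tensor>

    struct GraphExecutionContext;
    impl GraphExecutionContext {
        // Mirrors: compute: func(inputs: list<named-tensor>) -> result<list<named-tensor>, error>
        fn compute(&self, inputs: Vec<NamedTensor>) -> Result<Vec<NamedTensor>, Error> {
            // A real backend would run inference here; echo the inputs for the sketch.
            Ok(inputs)
        }
    }

    fn main() -> Result<(), Error> {
        let ctx = GraphExecutionContext;
        // Old sequence: ctx.set_input("input", t)?; ctx.compute()?; ctx.get_output("output")?;
        // New sequence: one call carries the named inputs in and the named outputs back.
        let outputs = ctx.compute(vec![("input".to_string(), Tensor)])?;
        for (name, _tensor) in &outputs {
            println!("output tensor: {name}");
        }
        Ok(())
    }

Because tensors are resources, handing them to `compute` transfers handles rather than copying tensor data.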

File tree (2 files changed: +17 -43 lines)

  ml.md
  wit/wasi-nn.wit

ml.md

Lines changed: 11 additions & 31 deletions
@@ -169,49 +169,29 @@ e.g., cannot access a hardware feature requested
 #### <a name="tensor"></a>`type tensor`
 [`tensor`](#tensor)
 <p>
-#### <a name="tensor_data"></a>`type tensor-data`
-[`tensor-data`](#tensor_data)
-<p>
-#### <a name="graph_execution_context"></a>`resource graph-execution-context`
+#### <a name="named_tensor"></a>`tuple named-tensor`
+<p>Identify a tensor by name; this is necessary to associate tensors to
+graph inputs and outputs.</p>
+<h5>Tuple Fields</h5>
+<ul>
+<li><a name="named_tensor.0"></a><code>0</code>: <code>string</code></li>
+<li><a name="named_tensor.1"></a><code>1</code>: own&lt;<a href="#tensor"><a href="#tensor"><code>tensor</code></a></a>&gt;</li>
+</ul>
+<h4><a name="graph_execution_context"></a><code>resource graph-execution-context</code></h4>
 <p>Bind a <a href="#graph"><code>graph</code></a> to the input and output tensors for an inference.</p>
 <h2>TODO: this may no longer be necessary in WIT
 (https://github.com/WebAssembly/wasi-nn/issues/43)</h2>
 <h3>Functions</h3>
-<h4><a name="method_graph_execution_context_set_input"></a><code>[method]graph-execution-context.set-input: func</code></h4>
-<p>Define the inputs to use for inference.</p>
-<h5>Params</h5>
-<ul>
-<li><a name="method_graph_execution_context_set_input.self"></a><code>self</code>: borrow&lt;<a href="#graph_execution_context"><a href="#graph_execution_context"><code>graph-execution-context</code></a></a>&gt;</li>
-<li><a name="method_graph_execution_context_set_input.name"></a><code>name</code>: <code>string</code></li>
-<li><a name="method_graph_execution_context_set_input.tensor"></a><a href="#tensor"><code>tensor</code></a>: own&lt;<a href="#tensor"><a href="#tensor"><code>tensor</code></a></a>&gt;</li>
-</ul>
-<h5>Return values</h5>
-<ul>
-<li><a name="method_graph_execution_context_set_input.0"></a> result&lt;_, own&lt;<a href="#error"><a href="#error"><code>error</code></a></a>&gt;&gt;</li>
-</ul>
 <h4><a name="method_graph_execution_context_compute"></a><code>[method]graph-execution-context.compute: func</code></h4>
 <p>Compute the inference on the given inputs.</p>
-<p>Note the expected sequence of calls: <code>set-input</code>, <code>compute</code>, <code>get-output</code>. TODO: this
-expectation could be removed as a part of
-https://github.com/WebAssembly/wasi-nn/issues/43.</p>
 <h5>Params</h5>
 <ul>
 <li><a name="method_graph_execution_context_compute.self"></a><code>self</code>: borrow&lt;<a href="#graph_execution_context"><a href="#graph_execution_context"><code>graph-execution-context</code></a></a>&gt;</li>
+<li><a name="method_graph_execution_context_compute.inputs"></a><code>inputs</code>: list&lt;<a href="#named_tensor"><a href="#named_tensor"><code>named-tensor</code></a></a>&gt;</li>
 </ul>
 <h5>Return values</h5>
 <ul>
-<li><a name="method_graph_execution_context_compute.0"></a> result&lt;_, own&lt;<a href="#error"><a href="#error"><code>error</code></a></a>&gt;&gt;</li>
-</ul>
-<h4><a name="method_graph_execution_context_get_output"></a><code>[method]graph-execution-context.get-output: func</code></h4>
-<p>Extract the outputs after inference.</p>
-<h5>Params</h5>
-<ul>
-<li><a name="method_graph_execution_context_get_output.self"></a><code>self</code>: borrow&lt;<a href="#graph_execution_context"><a href="#graph_execution_context"><code>graph-execution-context</code></a></a>&gt;</li>
-<li><a name="method_graph_execution_context_get_output.name"></a><code>name</code>: <code>string</code></li>
-</ul>
-<h5>Return values</h5>
-<ul>
-<li><a name="method_graph_execution_context_get_output.0"></a> result&lt;own&lt;<a href="#tensor"><a href="#tensor"><code>tensor</code></a></a>&gt;, own&lt;<a href="#error"><a href="#error"><code>error</code></a></a>&gt;&gt;</li>
+<li><a name="method_graph_execution_context_compute.0"></a> result&lt;list&lt;<a href="#named_tensor"><a href="#named_tensor"><code>named-tensor</code></a></a>&gt;, own&lt;<a href="#error"><a href="#error"><code>error</code></a></a>&gt;&gt;</li>
 </ul>
 <h2><a name="wasi_nn_graph_0_2_0_rc_2024_08_19"></a>Import interface wasi:nn/graph@0.2.0-rc-2024-08-19</h2>
 <p>A <a href="#graph"><code>graph</code></a> is a loaded instance of a specific ML model (e.g., MobileNet) for a specific ML

wit/wasi-nn.wit

Lines changed: 6 additions & 12 deletions
@@ -110,25 +110,19 @@ interface graph {
 /// `graph` to input tensors before `compute`-ing an inference:
 interface inference {
   use errors.{error};
-  use tensor.{tensor, tensor-data};
+  use tensor.{tensor};
+
+  /// Identify a tensor by name; this is necessary to associate tensors to
+  /// graph inputs and outputs.
+  type named-tensor = tuple<string, tensor>;

   /// Bind a `graph` to the input and output tensors for an inference.
   ///
   /// TODO: this may no longer be necessary in WIT
   /// (https://github.com/WebAssembly/wasi-nn/issues/43)
   resource graph-execution-context {
-    /// Define the inputs to use for inference.
-    set-input: func(name: string, tensor: tensor) -> result<_, error>;
-
     /// Compute the inference on the given inputs.
-    ///
-    /// Note the expected sequence of calls: `set-input`, `compute`, `get-output`. TODO: this
-    /// expectation could be removed as a part of
-    /// https://github.com/WebAssembly/wasi-nn/issues/43.
-    compute: func() -> result<_, error>;
-
-    /// Extract the outputs after inference.
-    get-output: func(name: string) -> result<tensor, error>;
+    compute: func(inputs: list<named-tensor>) -> result<list<named-tensor>, error>;
   }
 }

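Since outputs now come back as a list of `named-tensor` pairs rather than via `get-output(name)`, a caller wanting one specific output can filter the returned list by name. A small Rust sketch, using the same kind of stand-in types as above (names are illustrative, not part of this commit):

    // Replaces the old `get-output(name)` lookup: find one named output in the
    // list returned by `compute`. `Tensor` is a stand-in for the tensor resource.
    struct Tensor;

    fn take_output(outputs: Vec<(String, Tensor)>, name: &str) -> Option<Tensor> {
        // Move the matching tensor out of the list; a name not found yields `None`.
        outputs.into_iter()
            .find(|(n, _)| n == name)
            .map(|(_, tensor)| tensor)
    }

    fn main() {
        let outputs = vec![("logits".to_string(), Tensor)];
        assert!(take_output(outputs, "logits").is_some());
    }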