+
+ Examples of how scatterElements works in different slicing schemes.
+
+
+ // input of shape [4,3]:
+ // [[ 0, 1, 2],
+ // [10, 11, 12],
+ // [20, 21, 22],
+ // [30, 31, 32]]
+ // indices of shape [2,3]:
+ // [[3, 1, 1],
+ // [2, 0, 3]]
+ // updates of shape [2,3]:
+ // [[-1, -2, -3],
+ // [-4, -5, -6]]
+ // axis = 0 (default)
+ // output of shape [4,3]:
+ // [[ 0, -5, 2],
+ // [10, -2, -3],
+ // [-4, 21, 22],
+ // [-1, 31, -6]]
+
+ const input1 = builder.constant(
+ {dataType: 'float32', shape: [4, 3]},
+ new Float32Array([0, 1, 2, 10, 11, 12, 20, 21, 22, 30, 31, 32]));
-
- The specific sampling algorithms are based on those widely used in existing Machine Learning frameworks. For example, when performing {{MLInterpolationMode/linear}} resampling from the following *[4, 4]* input tensor (considering only spatial dimensions):
+ const indices1 = builder.constant(
+ {dataType: 'uint32', shape: [2, 3]},
+ new Uint32Array([3, 1, 1, 2, 0, 3]));
- ```
- [ 0 1 2 3 ]
- [ 0 1 2 3 ]
- [ 12 13 14 15 ]
- [ 12 13 14 15 ]
- ```
+ const updates1 = builder.constant(
+ {dataType: 'float32', shape: [2, 3]},
+ new Float32Array([-1, -2, -3, -4, -5, -6]));
- For an *[8, 8]* output tensor, the expected values are:
+ const output1 = builder.scatterElements(input1, indices1, updates1);
- ```
- [ 0 0.25 0.75 1.25 1.75 2.25 2.75 3 ]
- [ 0 0.25 0.75 1.25 1.75 2.25 2.75 3 ]
- [ 0 0.25 0.75 1.25 1.75 2.25 2.75 3 ]
- [ 3 3.25 3.75 4.25 4.75 5.25 5.75 6 ]
- [ 9 9.25 9.75 10.25 10.75 11.25 11.75 12 ]
- [ 12 12.25 12.75 13.25 13.75 14.25 14.75 15 ]
- [ 12 12.25 12.75 13.25 13.75 14.25 14.75 15 ]
- [ 12 12.25 12.75 13.25 13.75 14.25 14.75 15 ]
- ```
+ // input of shape [4,3]:
+ // [[ 0, 1, 2],
+ // [10, 11, 12],
+ // [20, 21, 22],
+ // [30, 31, 32]]
+ // indices of shape [4,1]:
+ // [[2],
+ // [1],
+ // [0],
+ // [2]],
+ // updates of shape [4,1]:
+ // [[-1],
+ // [-2],
+ // [-3],
+ // [-4]],
+ // axis = 1
+ // output of shape [4,3]:
+ // [[ 0, 1, -1],
+ // [10, -2, 12],
+ // [-3, 21, 22],
+ // [30, 31, -4]]
- This has the convenient properties that the sampling is evenly distributed, symmetric, robust to image mirroring, and the corner values are aligned.
+ const indices2 = builder.constant(
+ {dataType: 'uint32', shape: [4, 1]},
+ new Uint32Array([2, 1, 0, 2]));
+
+ const updates2 =
+ builder.constant({dataType: 'float32', shape: [4, 1]},
+ new Float32Array([-1, -2, -3, -4]));
+
+ const output2 = builder.scatterElements(input1, indices2, updates2, {axis: 1});
+
+ // input of shape [4,2,2]:
+ // [[[ 0, 1],
+ // [ 10, 11]],
+ // [[100, 101],
+ // [110, 111]],
+ // [[200, 201],
+ // [210, 211]],
+ // [[300, 301],
+ // [310, 311]]]
+ // indices of shape [1,2,2]:
+ // [[[0, 2],
+ // [1, 3]]],
+ // updates of shape [1,2,2]:
+ // [[[-1, -2],
+ // [-3, -4]]],
+ // axis = 0
+ // output of shape [4,2,2]:
+ // [[[ -1, 1],
+ // [ 10, 11]],
+ // [[100, 101],
+ // [ -3, 111]],
+ // [[200, -2],
+ // [210, 211]],
+ // [[300, 301],
+ // [310, -4]]]
+
+ const input3 = builder.constant(
+ {dataType: 'float32', shape: [4, 2, 2]},
+ new Float32Array([0, 1, 10, 11, 100, 101, 110, 111, 200, 201, 210, 211, 300, 301, 310, 311]));
+
+ const indices3 = builder.constant(
+ {dataType: 'uint32', shape: [1, 2, 2]},
+ new Uint32Array([0, 2, 1, 3]));
+
+ const updates3 =
+ builder.constant({dataType: 'float32', shape: [1, 2, 2]},
+ new Float32Array([-1, -2, -3, -4]));
+
+ const output3 = builder.scatterElements(input3, indices3, updates3, {axis: 0});
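+ The element-wise scatter semantics illustrated above can be sketched as a plain-JS helper over 2-D nested arrays (a hypothetical reference, not part of the WebNN API):

```javascript
// Minimal 2-D sketch of scatterElements semantics. For axis 0:
// output[indices[i][j]][j] = updates[i][j]; for axis 1:
// output[i][indices[i][j]] = updates[i][j].
function scatterElements2D(input, indices, updates, axis = 0) {
  const output = input.map(row => row.slice());  // copy the input
  for (let i = 0; i < indices.length; ++i) {
    for (let j = 0; j < indices[i].length; ++j) {
      if (axis === 0) output[indices[i][j]][j] = updates[i][j];
      else output[i][indices[i][j]] = updates[i][j];
    }
  }
  return output;
}

// Reproduces the first example above (axis = 0):
const out = scatterElements2D(
    [[0, 1, 2], [10, 11, 12], [20, 21, 22], [30, 31, 32]],
    [[3, 1, 1], [2, 0, 3]],
    [[-1, -2, -3], [-4, -5, -6]]);
// out is [[0, -5, 2], [10, -2, -3], [-4, 21, 22], [-1, 31, -6]]
```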
+
+
-### reshape ### {#api-mlgraphbuilder-reshape-method}
-Alter the shape of a tensor to a new shape. Reshape does not copy or change the content of the tensor. It just changes the tensor's logical shape for the subsequent operations.
+### scatterND ### {#api-mlgraphbuilder-scatternd}
+Scatter slices of values from the update tensor atop a copy of the input tensor according to the indices.
+
-
+
+
**Arguments:**
- - input: an {{MLOperand}}. The input tensor.
- - newShape: [=sequence=]<{{unsigned long}}>. The shape of the output tensor.
- The number of elements implied by {{MLGraphBuilder/reshape(input, newShape, options)/newShape}} must be the same as the
- number of elements in the input tensor.
- - options: an {{MLOperatorOptions}}. Specifies the optional parameters of the operation.
+ - input: an {{MLOperand}}. The input N-D tensor to initialize the output with.
+ - indices: an {{MLOperand}}. The indices array contains entire coordinates into the output tensor, with the rightmost dimension holding the number of dimensions per coordinate. So an indices tensor of shape [10,1] holds 10 single-axis indices, and a shape of [4,3] holds 4 indices of 3D coordinates. The values must be of type {{MLOperandDataType/"int32"}}, {{MLOperandDataType/"uint32"}}, or {{MLOperandDataType/"int64"}}, and each must be in the range -N (inclusive) to N (exclusive) where N is the size of the corresponding output dimension, and a negative index means indexing from the end of the corresponding dimension.
+ - updates: an {{MLOperand}}. New values to replace atop the input.
+ - options: an optional {{MLScatterOptions}}. The optional parameters of the operation.
- **Returns:** an {{MLOperand}}. The output tensor. The values of the output
- tensor are the same as values of the input tensor. The shape of the output
- tensor is specified by {{MLGraphBuilder/reshape(input, newShape, options)/newShape}}.
+ **Returns:** an {{MLOperand}}. The output N-D tensor with the same [=MLOperand/rank=] and [=MLOperand/shape=] as {{MLGraphBuilder/scatterND(input, indices, updates, options)/input}}, with slices at the given {{MLGraphBuilder/scatterND(input, indices, updates, options)/indices}} replaced by the corresponding values from {{MLGraphBuilder/scatterND(input, indices, updates, options)/updates}}.
-
- Constraints for {{MLGraphBuilder/reshape()}}
+
+ Constraints for {{MLGraphBuilder/scatterND()}}
operand |
@@ -7301,43 +8515,187 @@ partial dictionary MLOpSupportLimits {
{{input}} |
[=/any data type|any=] |
- [=/any rank|N=] |
+ 1 to [=/any rank|N=] |
+
+
+ {{indices}} |
+ {{MLOperandDataType/"int32"}}, {{MLOperandDataType/"uint32"}}, {{MLOperandDataType/"int64"}} |
+ 1 to [=/any rank|N=] |
+
+
+ {{updates}} |
+ [=/same type as|same as=] {{input}} |
+ {{input}}'s [=MLOperand/rank=] + {{indices}}'s [=MLOperand/rank=] - {{indices}}'s [=MLOperand/shape=][-1] - 1 |
*output* |
[=/same type as|same as=] {{input}} |
- {{newShape}}'s [=list/size=] |
+ 1 to [=/any rank|N=] |
-{{MLOpSupportLimits}} has the following member for {{MLGraphBuilder/reshape()}}:
+{{MLOpSupportLimits}} has the following members for {{MLGraphBuilder/scatterND()}}:
- : reshape
- :: Support limits for operator {{MLGraphBuilder/reshape()}}.
+ : scatterND
+ :: Support limits for operator {{MLGraphBuilder/scatterND()}}.
+
+ The {{MLGraphBuilder/scatterND(input, indices, updates, options)/indices}} parameter to {{MLGraphBuilder/scatterND()}} cannot be clamped to the allowed range when the graph is built because the inputs are not known until execution. Implementations can introduce {{MLGraphBuilder/clamp()}} operations in the compiled graph if the specified clamping behavior is not provided by the underlying platform. Similarly, if the underlying platform does not support negative indices, the implementation can introduce operations in the compiled graph to transform a negative index from the end of the dimension into a positive index.
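+ The index normalization described in this note can be sketched as follows (a hypothetical helper, not part of the WebNN API): wrap a negative index from the end of the dimension, then clamp into range.

```javascript
// Normalize one index against a dimension of size dimensionSize:
// wrap negative indices from the end, then clamp to [0, dimensionSize - 1].
function normalizeIndex(index, dimensionSize) {
  if (index < 0) index += dimensionSize;
  return Math.min(Math.max(index, 0), dimensionSize - 1);
}

normalizeIndex(-1, 4);  // → 3 (last element)
normalizeIndex(7, 4);   // → 3 (clamped)
```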
+
+
- The reshape(|input|, |newShape|, |options|) method steps are:
+ The scatterND(|input|, |indices|, |updates|, |options|) method steps are:
1. If [=this=] [=MLGraphBuilder/can not build=], then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
- 1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. Let |outputShape| be an empty array of {{unsigned long}}.
- 1. If |newShape|'s [=list/size=] is 0, set |outputShape| to an empty [=/list=] for a scalar.
- 1. If any [=list/item=] in |newShape| is not a [=valid dimension=], then [=exception/throw=] a {{TypeError}}.
- 1. Let |inputElementCount| be the product of all [=list/items=] in |input|'s [=MLOperand/shape=]. Empty dimensions yield an |inputElementCount| of 1.
- 1. If product of all values in |newShape| is not equal to |inputElementCount|, then [=exception/throw=] a {{TypeError}}.
- 1. Let |desc| be a copy of |input|.{{MLOperand/[[descriptor]]}}.
- 1. Set |desc|.{{MLOperandDescriptor/shape}} to |newShape|.
+ 1. If [=MLGraphBuilder/validating operand=] with [=this=] and any of |input|, |indices|, and |updates| returns false, then [=exception/throw=] a {{TypeError}}.
+ 1. If |indices|'s [=MLOperand/dataType=] is not one of the [=/allowed data types=] (according to [this table](#constraints-scatternd)), then [=exception/throw=] a {{TypeError}}.
+ 1. If |updates|'s [=MLOperand/dataType=] is not equal to |input|'s [=MLOperand/dataType=], then [=exception/throw=] a {{TypeError}}.
+ 1. If the [=MLOperand/rank=] of any of |input|, |indices|, or |updates| is not its [=/allowed rank=], then [=exception/throw=] a {{TypeError}}.
+ 1. Let |inputShape| be |input|'s [=MLOperand/shape=] and |inputRank| be |input|'s [=MLOperand/rank=].
+ 1. Let |indicesShape| be |indices|'s [=MLOperand/shape=] and |indicesRank| be |indices|'s [=MLOperand/rank=].
+ 1. Let |indexableSize| be |indicesRank| - 1.
+ 1. Let |coordinateSize| be |indicesShape|[|indexableSize|].
+ 1. If |coordinateSize| is greater than |inputRank|, then [=exception/throw=] a {{TypeError}}.
+ 1. Let |expectedUpdatesShape| be an empty list.
+ 1. [=list/For each=] |index| in [=the range=] 0 to |indexableSize|, exclusive:
+ 1. [=list/Append=] |indicesShape|[|index|] to |expectedUpdatesShape|.
+ 1. [=list/For each=] |index| in [=the range=] |coordinateSize| to |inputRank|, exclusive:
+ 1. [=list/Append=] |inputShape|[|index|] to |expectedUpdatesShape|.
+ 1. If |updates|'s [=MLOperand/shape=] is not [=list/equal=] to |expectedUpdatesShape|, then [=exception/throw=] a {{TypeError}}.
+ 1. Let |outputShape| be a copy of |input|'s [=MLOperand/shape=].
+ 1. Let |outputDesc| be the result of [=creating an MLOperandDescriptor=] given |input|'s [=MLOperand/dataType=] and |outputShape|.
1. *Make graph connections:*
- 1. Let |output| be the result of [=creating an MLOperand=] given [=this=] and |desc|.
- 1. Let |operator| be an [=operator=] for the "reshape" operation, given |options|.
+ 1. Let |output| be the result of [=creating an MLOperand=] given |outputDesc|.
+ 1. Let |operator| be an [=operator=] for the "scatterND" operation, given |input|, |indices|, |updates|, and |options|.
1. Set |output|.{{MLOperand/[[operator]]}} to |operator|.
- 1. Set |operator|'s [=operator/input=] to |input|.
+ 1. Set |operator|'s [=operator/inputs=] to |input|, |indices|, and |updates|.
1. Set |operator|'s [=operator/output=] to |output|.
1. Return |output|.
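+ The expected-updates-shape computation in the steps above can be sketched as a plain-JS helper (hypothetical, assuming shapes are arrays of positive integers):

```javascript
// The updates shape is the indices shape minus its last dimension,
// concatenated with the input dimensions not covered by each coordinate.
function scatterNDExpectedUpdatesShape(inputShape, indicesShape) {
  const indexableSize = indicesShape.length - 1;
  const coordinateSize = indicesShape[indexableSize];
  if (coordinateSize > inputShape.length)
    throw new TypeError('coordinate size exceeds input rank');
  return indicesShape.slice(0, indexableSize)
      .concat(inputShape.slice(coordinateSize));
}

// e.g. a [3,2] input with [2,1] indices expects [2,2] updates:
scatterNDExpectedUpdatesShape([3, 2], [2, 1]);  // → [2, 2]
```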
+
+
+
+ Examples of how scatterND works in different slicing schemes.
+
+
+ // input of shape [8]:
+ // [0, 1, 2, 3, 4, 5, 6, 7]
+ // indices of shape [4, 1]:
+ // [[4],
+ // [3],
+ // [1],
+ // [7]]
+ // updates of shape [4]:
+ // [-1, -2, -3, -4]
+ // output of shape [8]:
+ // [0, -3, 2, -2, -1, 5, 6, -4]
+
+ const input1 = builder.constant(
+ {dataType: 'float32', shape: [8]},
+ new Float32Array([0, 1, 2, 3, 4, 5, 6, 7]));
+
+ const indices1 = builder.constant(
+ {dataType: 'uint32', shape: [4, 1]},
+ new Uint32Array([4, 3, 1, 7]));
+
+ const updates1 = builder.constant(
+ {dataType: 'float32', shape: [4]},
+ new Float32Array([-1, -2, -3, -4]));
+
+ const output1 = builder.scatterND(input1, indices1, updates1);
+
+ // input of shape [2,2]:
+ // [[0, 1],
+ // [2, 3]]
+ // indices of shape [2,2]:
+ // [[0, 0],
+ // [1, 1]]
+ // updates of shape [2]:
+ // [-1, -2]
+ // output of shape [2,2]:
+ // [[-1, 1], <= -1 written to output coordinate [0, 0]
+ // [ 2, -2]] <= -2 written to output coordinate [1, 1]
+
+ const input2 = builder.constant(
+ {dataType: 'float32', shape: [2, 2]},
+ new Float32Array([0, 1, 2, 3]));
+
+ const indices2 = builder.constant(
+ {dataType: 'uint32', shape: [2, 2]},
+ new Uint32Array([0, 0, 1, 1]));
+
+ const updates2 = builder.constant(
+ {dataType: 'float32', shape: [2]},
+ new Float32Array([-1, -2]));
+
+ const output2 = builder.scatterND(input2, indices2, updates2);
+
+ // input of shape [3,2]:
+ // [[0, 1],
+ // [2, 3],
+ // [4, 5]]
+ // indices of shape [2,1]:
+ // [[2],
+ // [0]]
+ // updates of shape [2,2]:
+ // [[-1, -2],
+ // [-3, -4]]
+ // output of shape [3,2]:
+ // [[-3, -4], <= [-3, -4] written to output coordinates [0, *]
+ // [ 2, 3],
+ // [-1, -2]] <= [-1, -2] written to output coordinates [2, *]
+
+ const input3 = builder.constant(
+ {dataType: 'float32', shape: [3, 2]},
+ new Float32Array([0, 1, 2, 3, 4, 5]));
+
+ const indices3 = builder.constant(
+ {dataType: 'uint32', shape: [2, 1]},
+ new Uint32Array([2, 0]));
+
+ const updates3 = builder.constant(
+ {dataType: 'float32', shape: [2, 2]},
+ new Float32Array([-1, -2, -3, -4]));
+
+ const output3 = builder.scatterND(input3, indices3, updates3);
+
+ // input of shape [2,2,2]:
+ // [[[0, 1],
+ // [2, 3]],
+ // [[4, 5],
+ // [6, 7]]]
+ // indices of shape [2,2]:
+ // [[0, 1],
+ // [1, 0]]
+ // updates of shape [2,2]:
+ // [[-1, -2],
+ // [-3, -4]]
+ // output of shape [2,2,2]:
+ // [[[ 0, 1],
+ // [-1, -2]], <= [-1, -2] written to output coordinates [0, 1, *]
+ // [[-3, -4], <= [-3, -4] written to output coordinates [1, 0, *]
+ // [ 6, 7]]]
+
+ const input4 = builder.constant(
+ {dataType: 'float32', shape: [2, 2, 2]},
+ new Float32Array([0, 1, 2, 3, 4, 5, 6, 7]));
+
+ const indices4 = builder.constant(
+ {dataType: 'uint32', shape: [2, 2]},
+ new Uint32Array([0, 1, 1, 0]));
+
+ const updates4 = builder.constant(
+ {dataType: 'float32', shape: [2, 2]},
+ new Float32Array([-1, -2, -3, -4]));
+
+ const output4 = builder.scatterND(input4, indices4, updates4);
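+ The slice-wise scatter semantics in these examples can be sketched as a plain-JS helper over flat arrays (hypothetical reference, not part of the WebNN API): each coordinate selects an offset in the output, and a contiguous slice of updates is written there.

```javascript
// indicesRows: an array of coordinates (each a prefix of an output coordinate);
// updatesRows: a matching array of flat slices to write at those coordinates.
function scatterNDRef(input, inputShape, indicesRows, updatesRows) {
  const output = input.slice();
  // Element stride of each input dimension (row-major layout).
  const strides = inputShape.map((_, d) =>
      inputShape.slice(d + 1).reduce((a, b) => a * b, 1));
  for (let i = 0; i < indicesRows.length; ++i) {
    const offset = indicesRows[i].reduce(
        (acc, idx, d) => acc + idx * strides[d], 0);
    for (let j = 0; j < updatesRows[i].length; ++j)
      output[offset + j] = updatesRows[i][j];
  }
  return output;
}

// The first example above: 1-element coordinates into a [8] input.
scatterNDRef([0, 1, 2, 3, 4, 5, 6, 7], [8],
             [[4], [3], [1], [7]], [[-1], [-2], [-3], [-4]]);
// → [0, -3, 2, -2, -1, 5, 6, -4]
```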
+
+
+
+
### sigmoid ### {#api-mlgraphbuilder-sigmoid-method}
Compute the sigmoid function of the input tensor. The calculation follows the expression `1 / (exp(-x) + 1)`.
@@ -7422,23 +8780,39 @@ partial dictionary MLOpSupportLimits {
### slice ### {#api-mlgraphbuilder-slice}
Produce a slice of the input tensor.
+
+{{MLSliceOptions}} has the following members:
+
+ : strides
+ ::
+ The stride to step over each input along each axis.
+ The length of the strides array must equal the [=MLOperand/rank=] of the input tensor.
+ The default is an array of length [=MLOperand/rank=] consisting of all 1's.
+ e.g. [1,1,1] for a 3-D tensor.
+ Strides must be greater than zero.
+
+
**Arguments:**
- input: an {{MLOperand}}. The input tensor.
- - starts: [=sequence=]<{{unsigned long}}>. The starting index to slice of each input dimension, of length N where N is the [=MLOperand/rank=] of the input tensor. For each dimension *d* of {{MLGraphBuilder/slice(input, starts, sizes, options)/input}}, {{MLGraphBuilder/slice(input, starts, sizes, options)/starts}}[*d*] indicates the starting index to slice in that dimension. The starting index must be in the range [0, input size - 1] in that dimension.
+ - starts: a [=sequence=]<{{unsigned long}}>. The starting index to slice of each input dimension, of length N where N is the [=MLOperand/rank=] of the input tensor. For each dimension *d* of {{MLGraphBuilder/slice(input, starts, sizes, options)/input}}, {{MLGraphBuilder/slice(input, starts, sizes, options)/starts}}[*d*] indicates the starting index to slice in that dimension. The starting index must be in the range [0, input size - 1] in that dimension.
- sizes: a [=sequence=]<{{unsigned long}}>. The number of elements to slice of each input dimension, of length N where N is the [=MLOperand/rank=] of the input tensor. For each dimension *d* of {{MLGraphBuilder/slice(input, starts, sizes, options)/input}}, {{MLGraphBuilder/slice(input, starts, sizes, options)/sizes}}[*d*] indicates the number of elements to slice in that dimension. The size must not be 0 and must satisfy the constraint `starting index + size <= input size` in that dimension.
- - options: an {{MLOperatorOptions}}. Specifies the optional parameters of the operation.
+ - options: an {{MLSliceOptions}}. Specifies the optional parameters of the operation.
**Returns:** an {{MLOperand}}. The output tensor of the same rank as the input tensor with tensor values stripped to the specified starting and ending indices in each dimension.
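+ The strided output-shape computation defined by the method steps below can be sketched as a hypothetical plain-JS helper (the stride of each axis divides its slice size, rounding up):

```javascript
// Compute the output shape of a strided slice; strides defaults to all 1's.
function sliceOutputShape(inputShape, starts, sizes, strides) {
  strides = strides || inputShape.map(() => 1);
  return inputShape.map((inputSize, d) => {
    if (sizes[d] === 0) throw new TypeError('zero-size slice');
    if (strides[d] < 1) throw new TypeError('strides must be at least 1');
    if (starts[d] > inputSize || starts[d] + sizes[d] > inputSize)
      throw new TypeError('slice out of bounds');
    // Equivalent to ceil(sizes[d] / strides[d]):
    return Math.floor(sizes[d] / strides[d]) +
        (sizes[d] % strides[d] !== 0 ? 1 : 0);
  });
}

// A [4,7] slice of a [4,8] input with stride 2 along the last axis:
sliceOutputShape([4, 8], [0, 1], [4, 7], [1, 2]);  // → [4, 4]
```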
@@ -7478,15 +8852,29 @@ partial dictionary MLOpSupportLimits {
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
1. If any of |sizes|'s [=list/items=] are 0, then [=exception/throw=] a {{TypeError}}.
1. If |starts|'s [=list/size=] and |sizes|'s [=list/size=] are not both equal to |input|'s [=MLOperand/rank=], then [=exception/throw=] a {{TypeError}}.
- 1. [=list/For each=] |index| in [=the range=] 0 to |input|'s [=MLOperand/rank=], exclusive:
- 1. If |sizes|[|index|] is 0, then [=exception/throw=] a {{TypeError}}.
+ 1. Let |strides| be a new [=/list=].
+ 1. If |options|.{{MLSliceOptions/strides}} [=map/exists=]:
+ 1. Set |strides| to |options|.{{MLSliceOptions/strides}}.
+ 1. If |strides|'s [=list/size=] is not equal to |input|'s [=MLOperand/rank=], then [=exception/throw=] a {{TypeError}}.
+ 1. Let |inputShape| be |input|'s [=MLOperand/shape=] and |inputRank| be |input|'s [=MLOperand/rank=].
+ 1. Let |outputShape| be a new [=/list=].
+ 1. [=list/For each=] |index| in [=the range=] 0 to |inputRank|, exclusive:
+ 1. Let |inputSize| be |inputShape|[|index|].
+ 1. Let |inputSliceSize| be |sizes|[|index|].
+ 1. Let |stride| be |strides|[|index|] if |strides| is not [=list/empty=], or 1 otherwise.
+ 1. If |inputSliceSize| is 0, then [=exception/throw=] a {{TypeError}}.
Issue(391): If 0-size dimensions are allowed, revise these steps.
- 1. If |starts|[|index|] is greater than or equal to |input|'s [=MLOperand/shape=][|index|], then [=exception/throw=] a {{TypeError}}.
- 1. If |starts|[|index|] + |sizes|[|index|] is greater than |input|'s [=MLOperand/shape=][|index|], then [=exception/throw=] a {{TypeError}}.
+ 1. If |stride| is less than 1, then [=exception/throw=] a {{TypeError}}.
+ 1. If |starts|[|index|] is greater than |inputSize|, then [=exception/throw=] a {{TypeError}}.
+ 1. If |starts|[|index|] + |inputSliceSize| is greater than |inputSize|, then [=exception/throw=] a {{TypeError}}.
+ 1. Let |outputSizeRoundingExcess| be 1 if |inputSliceSize| % |stride| != 0, or 0 otherwise.
+ 1. Let |outputSize| be floor(|inputSliceSize| / |stride|) + |outputSizeRoundingExcess|.
+ 1. [=list/Append=] |outputSize| to |outputShape|.
+ 1. Let |outputDesc| be the result of [=creating an MLOperandDescriptor=] given |input|'s [=MLOperand/dataType=] and |outputShape|.
1. *Make graph connections:*
- 1. Let |output| be the result of [=copying an MLOperand=] given |input|.
+ 1. Let |output| be the result of [=creating an MLOperand=] given |outputDesc|.
1. Let |operator| be an [=operator=] for the "slice" operation, given |starts|, |sizes|, and |options|.
1. Set |output|.{{MLOperand/[[operator]]}} to |operator|.
1. Set |operator|'s [=operator/input=] to |input|.
@@ -7961,6 +9349,81 @@ partial dictionary MLOpSupportLimits {
+### tile ### {#api-mlgraphbuilder-tile}
+Repeat a tensor the given number of times along each dimension.
+
+
+
+
+ **Arguments:**
+ - input: an {{MLOperand}}. The input N-D tensor.
+ - repetitions: a [=sequence=]<{{unsigned long}}>. A count per dimension of how many times to repeat that dimension. The [=list/size=] must match the {{MLGraphBuilder/tile(input, repetitions, options)/input}}'s [=MLOperand/rank=], using 1's for any axis that should retain the same size.
+ - options: an optional {{MLOperatorOptions}}. The optional parameters of the operation.
+
+ **Returns:** an {{MLOperand}}. The tiled N-D tensor.
+
+
+
+ Constraints for {{MLGraphBuilder/tile()}}
+
+
+ operand |
+ [=/allowed data types=] |
+ [=/allowed ranks=] |
+
+
+
+ {{input}} |
+ [=/any data type|any=] |
+ [=/any rank|N=] |
+
+
+ *output* |
+ [=/same type as|same as=] {{input}} |
+ [=/same rank as|same as=] {{input}} |
+
+
+
+{{MLOpSupportLimits}} has the following members for {{MLGraphBuilder/tile()}}:
+
+ : tile
+ :: Support limits for operator {{MLGraphBuilder/tile()}}.
+
+
+
+
+ The tile(|input|, |repetitions|, |options|) method steps are:
+
+ 1. If [=this=] [=MLGraphBuilder/can not build=], then [=exception/throw=] an "{{InvalidStateError}}" {{DOMException}}.
+ 1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
+ 1. If |repetitions|'s [=list/size=] is not equal to |input|'s [=MLOperand/rank=], then [=exception/throw=] a {{TypeError}}.
+ 1. If any of |repetitions|'s [=list/items=] is 0, then [=exception/throw=] a {{TypeError}}.
+
+ Issue(391): If 0-size dimensions are allowed, revise these steps.
+
+ 1. Let |outputShape| be a copy of |input|'s [=MLOperand/shape=].
+ 1. [=list/For each=] |index| in [=the range=] 0 to |outputShape|'s [=list/size=], exclusive:
+ 1. Set |outputShape|[|index|] to |outputShape|[|index|] * |repetitions|[|index|].
+ 1. Let |outputDescriptor| be the result of [=creating an MLOperandDescriptor=] given |input|'s [=MLOperand/dataType=] and |outputShape|.
+ 1. *Make graph connections:*
+ 1. Let |output| be the result of [=creating an MLOperand=] given |outputDescriptor|.
+ 1. Let |operator| be an [=operator=] for the "tile" operation, given |repetitions| and |options|.
+ 1. Set |output|.{{MLOperand/[[operator]]}} to |operator|.
+ 1. Set |operator|'s [=operator/input=] to |input|.
+ 1. Set |operator|'s [=operator/output=] to |output|.
+ 1. Return |output|.
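+ The output-shape computation in the steps above can be sketched as a hypothetical plain-JS helper:

```javascript
// Each output dimension is the input dimension times its repetition count.
function tileOutputShape(inputShape, repetitions) {
  if (repetitions.length !== inputShape.length)
    throw new TypeError('repetitions size must match input rank');
  if (repetitions.includes(0))
    throw new TypeError('repetitions must be nonzero');
  return inputShape.map((size, d) => size * repetitions[d]);
}

// Repeating a [2,3] tensor twice along the first axis:
tileOutputShape([2, 3], [2, 1]);  // → [4, 3]
```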
+
+
### transpose ### {#api-mlgraphbuilder-transpose}
Permute the dimensions of the input tensor according to {{MLTransposeOptions/permutation}}.
@@ -8132,7 +9595,8 @@ partial dictionary MLOpSupportLimits {
// [9, 4, 8],
// [2, 6, 3]]
const input = builder.constant(
- {shape: [3, 3]}, new Float32Array([7, 1, 2, 9, 4, 8, 2, 6, 3]));
+ {dataType: 'float32', shape: [3, 3]},
+ new Float32Array([7, 1, 2, 9, 4, 8, 2, 6, 3]));
// upper triangular matrix:
// [[7, 1, 2],
@@ -8320,6 +9784,8 @@ The shapes of the input tensors must be compatible. A tensor is [=unidirectional
Two tensors are [=bidirectionally broadcastable=] if they can be mutually "stretched" (repeated) across their various dimensions, starting from the last dimension. For example, a *[5,1]* tensor can be bidirectionally broadcast with a *[1,6]* tensor by repeating the first tensor 6 times in the last dimension and the second tensor 5 times in preceding dimension. The result of the operation will be a *[5,6]* tensor. Bidirectional broadcasting is convenient for element-wise operations.
+A tensor is [=blockwise broadcastable=] if all of its dimensions can be upsampled by integer multiples to the target tensor's shape. For example, a *[4,5]* tensor can be blockwise broadcast up to a *[16,10]* tensor because each target dimension is an exact multiple (16 % 4 = 0, 10 % 5 = 0), repeating every element 4 times in the first dimension and every element 2 times in the last dimension (e.g. values *[1,2,3,4,5]* in the last dimension would be repeated to *[1,1,2,2,3,3,4,4,5,5]*). However, a *[4,5]* tensor would be incompatible with a *[9,3]* tensor since both dimensions have a nonzero remainder (9 % 4 = 1, 3 % 5 = 3). Blockwise broadcasting is useful for sharing common values in larger blocks to save memory. Both tensors are expected to have the same rank, and the output shape is simply the target tensor's shape, to which the smaller one is upsampled.
+
Some operations allow broadcasting with special semantics. For example, {{MLGraphBuilder/matmul()}} treats the last two dimensions of the input tensors as the rows and columns of the matrices, and the number of columns in the first matrix must be equal to the number of rows in the second matrix. The matrix multiplication is bidirectionally broadcast across any additional dimensions, treating the input tensors as stacks of matrices to multiply.
@@ -8372,6 +9838,22 @@ To bidirectionally broadcast the sha
|shapeA| is bidirectionally broadcastable to |shapeB| if [=bidirectionally broadcasting=] |shapeA| and |shapeB| does not result in failure.
+
+
+To blockwise broadcast the shapes |shapeFrom| and |shapeTo|, perform the following steps. |shapeFrom| and |shapeTo| are [=/lists=] of positive integers, representing the dimensions of tensors, and the steps return true or false.
+
+
+1. If |shapeFrom|'s [=list/size=] is not equal to |shapeTo|'s [=list/size=], then return false.
+1. [=list/For each=] |index| in [=the range=] 0 to |shapeTo|'s [=list/size=], exclusive:
+ 1. If |shapeTo|[|index|] is not exactly divisible by |shapeFrom|[|index|], then return false.
+1. Return true.
+
+
+
+
+|shapeFrom| is blockwise broadcastable to |shapeTo| if [=blockwise broadcasting=] |shapeFrom| and |shapeTo| returns true.
+
+
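+ The blockwise check above can be sketched as a hypothetical plain-JS helper:

```javascript
// True when every target dimension is an exact integer multiple of the
// corresponding source dimension and the ranks match.
function isBlockwiseBroadcastable(shapeFrom, shapeTo) {
  if (shapeFrom.length !== shapeTo.length) return false;
  return shapeTo.every((dim, d) => dim % shapeFrom[d] === 0);
}

isBlockwiseBroadcastable([4, 5], [16, 10]);  // → true
isBlockwiseBroadcastable([4, 5], [9, 3]);    // → false
```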
## Casting ## {#algorithms-casting}
Explicit numeric casting is used in algorithms where parameters passed as {{MLNumber}} or {{double}} need to be converted to match the {{MLOperandDataType}} of input or output {{MLOperand}}s.
@@ -8581,8 +10063,8 @@ Operations present in other neural network inference APIs can often be emulated
function flatten(builder, input, axis) {
if (axis > input.shape.length)
return input;
- const before = axis.slice(0, axis).reduce((a, b) => a * b);
- const after = axis.slice(axis, input.shape.length).reduce((a, b) => a * b);
+ const before = input.shape.slice(0, axis).reduce((a, b) => a * b, 1);
+ const after = input.shape.slice(axis, input.shape.length).reduce((a, b) => a * b, 1);
return builder.reshape(input, [before, after]);
}
@@ -9139,6 +10621,12 @@ Thanks to Feng Dai for his continuous contributions that keep web-platform-tests
"Thomas Scialom"
],
"date": "July 2023"
+ },
+ "Prefix-Sum": {
+ "href": "https://en.wikipedia.org/wiki/Prefix_sum",
+ "title": "Prefix Sum",
+ "authors": ["The Wikipedia community"],
+ "date": "January 2025"
}
}