AWS State Machine Execution Input is an array


How do I force the input of a state machine to be an object rather than an array?

I have a queue which contains jobs where the payload looks like

{ "domain": "stackoverflow.com" }

Using EventBridge Pipes, I have a Pipe where the source is the queue and the target is a Step Functions state machine. The state machine receives each job from the queue as expected. With the default settings, the state machine receives

{ "body": "{\"domain\": \"stackoverflow.com\"}", ... }
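As an aside, since the `body` field in that default payload is a JSON string rather than an object, it can also be parsed inside the state machine itself with the `States.StringToJson` intrinsic. A minimal sketch, assuming the message object shown above is the execution input and using arbitrary state names:

```json
{
  "ParseJob": {
    "Type": "Pass",
    "Parameters": {
      "job.$": "States.StringToJson($.body)"
    },
    "Next": "NextState"
  }
}
```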

When setting the Pipe Target Input Transformer to

{ "domain": "<$.body.domain>" }
// Using <$.body> on its own causes an execution error, despite showing the correct output in the tester in the Pipes UI
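For context, that template lives in the Pipe's target parameters. Roughly this shape in the EventBridge Pipes API (a sketch; only the `InputTemplate` value is taken from above, the surrounding structure follows the `UpdatePipe` request format):

```json
{
  "TargetParameters": {
    "InputTemplate": "{ \"domain\": \"<$.body.domain>\" }"
  }
}
```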

The Step function receives

[ { "domain": "stackoverflow.com" } ]

Herein lies my question.

How do I prevent a step function receiving an array as input? (Why is it an array in the first place?)

Some additional notes/context:

  • Use of execution input throughout the state machine - There are many nested steps throughout the state machine, almost all of which require the original input. If it arrives in the wrong format, I either have to use "InputPath": "$[0]" on every step, or pass the domain down between each function. InputPath isn't the worst option, as I'll likely have to use it anyway.
  • Readability/understanding - It means additional code in my Lambda to accept and handle an array as input, which makes no logical sense given that a single object is required and there will never be multiple items.
  • Console testing - probably moot as this doesn't matter often, but while testing I'm manually triggering the state machine from the console with test data. Twice already I've sent {} instead of [] because my brain is wired to expect a single object as input.
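To illustrate the extra handler code the second point describes: a minimal sketch of a defensive Lambda handler that unwraps the single-element batch array Pipes delivers while still accepting a plain object (Python is an assumption; the function name `handler` and the return value are arbitrary for illustration):

```python
def handler(event, context=None):
    """Normalize the input: unwrap a single-element array into the job object.

    EventBridge Pipes delivers the batch to the target as an array even when
    it contains only one message, so this accepts both shapes.
    """
    if isinstance(event, list):
        if len(event) != 1:
            raise ValueError(f"expected a single job, got {len(event)} items")
        event = event[0]
    return event["domain"]

# e.g. handler([{"domain": "stackoverflow.com"}]) returns "stackoverflow.com"
```

This is exactly the kind of shape-juggling boilerplate the question is trying to avoid in the first place.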