Command Routing is a concept inspired by a similar technology in the Windows Presentation Foundation stack. In essence, command routing is about separating executable code from its call site. It also forms an excellent abstraction for game engines to deal with input and gestures.
Traditional methods such as event-based input handling or polling from a queue have a few shortcomings. They clutter a considerable amount of your code base with state machines and hard-coded constants, enough to make a design change a non-trivial problem to recover from. And to make things worse, consider multi-platform development, where some windowing toolkits don't even support half of the keys they should (Caps Lock and GLUT, anyone?).
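To make the criticism concrete, here is a minimal sketch of the traditional polling style. IsKeyDown, g_Keys and Camera are illustrative stand-ins, not taken from any particular windowing toolkit; the point is where the key constants end up living.

#include <cstdint>

// Hypothetical key-state table and query helper, standing in for whatever
// the windowing toolkit exposes.
static bool g_Keys[256] = {};
bool IsKeyDown( std::uint8_t keyCode ) { return g_Keys[keyCode]; }

struct Camera
{
    void MoveForward()  { /* advance along the view vector */ }
    void MoveBackward() { /* retreat along the view vector */ }
};

// Per-frame polling: the key constants are hard-coded at the call site, so
// every rebinding or design change means hunting down and editing gameplay
// code rather than touching a single binding table.
void UpdateCamera( Camera& camera )
{
    if( IsKeyDown( 'W' ) ) camera.MoveForward();
    if( IsKeyDown( 'S' ) ) camera.MoveBackward();
}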
When we look closer at the abstractions of the two, we notice that traditional dispatching has a conflict in its identity concerning parallel execution. Traditional dispatching is inherently chronological, because changes in the state of an input device are chronological. This automatically implies that any parallel execution of code is 'user-mode scheduled' and thus cannot be part of the abstraction. Command Routing circumvents this problem: we do not consider changes in the state of input devices to be the actual events, but rather triggers for the dispatched events, which are themselves input-device agnostic.
// Execution handler: reacts to the routed command, not to the raw key that
// triggered it, so the same handler works for any binding.
void __stdcall MyCommand_Forward( RoutedCommandService::BaseEventArg* eventArgs, FirstPersonCameraController* __restrict thisPtr )
{
    if( eventArgs->eventType == RoutedCommandService::KeyDown )
    {
        // Only drive the camera that is currently primary.
        if( thisPtr->m_Camera->IsPrimary() )
        {
            thisPtr->state[0] = CameraActionStates_Forward;
        }
    }
    else if( eventArgs->eventType == RoutedCommandService::KeyUp )
    {
        thisPtr->state[0] = CameraActionStates_None;
    }
}
//Setup mandatory command registration
RoutedCommandService::RegisterCommand("CAMERASTRAFEFORWARD");
RoutedCommandService::RegisterCommand("CAMERASTRAFEBACKWARD");
RoutedCommandService::RegisterCommand("CAMERASTRAFELEFT");
RoutedCommandService::RegisterCommand("CAMERASTRAFERIGHT");
//Setup key binding
RoutedCommandService::RegisterInputBinding("CAMERASTRAFEFORWARD", aurora::Key::W, RoutedCommandService::Pressed | RoutedCommandService::Released);
RoutedCommandService::RegisterInputBinding("CAMERASTRAFEBACKWARD", aurora::Key::S, RoutedCommandService::Pressed | RoutedCommandService::Released);
RoutedCommandService::RegisterInputBinding("CAMERASTRAFELEFT", aurora::Key::A, RoutedCommandService::Pressed | RoutedCommandService::Released);
RoutedCommandService::RegisterInputBinding("CAMERASTRAFERIGHT", aurora::Key::D, RoutedCommandService::Pressed | RoutedCommandService::Released);
//Setup execution handler
RoutedCommandService::RegisterCommandBinding("CAMERASTRAFEFORWARD", MyCommand_Forward, this );
RoutedCommandService::RegisterCommandBinding("CAMERASTRAFEBACKWARD", MyCommand_Backward, this );
RoutedCommandService::RegisterCommandBinding("CAMERASTRAFELEFT", MyCommand_StrafeLeft, this );
RoutedCommandService::RegisterCommandBinding("CAMERASTRAFERIGHT", MyCommand_StrafeRight, this );
Above is a small code sample used in my personal code base. First we register a command. After that we attach one or more input bindings; input bindings are essentially patterns that need to be matched for the command to be triggered. And lastly we register our execution handler.
An interesting note is that Command Routing, from an abstraction point of view, doesn't limit itself to input device state changes as source triggers; it permits any trigger as a stimulus. Some of the more complicated scenarios I have considered include events based on the hit count of a key or on a history of pressed keys, joystick gestures, and possibly exotic hardware such as the Kinect.
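As a sketch of what such a trigger could look like, here is a hypothetical double-tap trigger built on the same idea: the trigger is the only device-aware part, and it raises a device-agnostic command. The DoubleTapTrigger class and the dispatch callback are assumptions of mine; only the registration calls above are part of the actual service, so the callback stands in for whatever its raise/execute entry point is.

#include <chrono>
#include <functional>
#include <string>
#include <utility>

// Watches raw key-down timestamps and fires only when two of them land
// inside the configured window, i.e. on a double tap.
class DoubleTapTrigger
{
public:
    DoubleTapTrigger( std::string command,
                      std::chrono::milliseconds window,
                      std::function<void( const std::string& )> dispatch )
        : m_Command( std::move( command ) )
        , m_Window( window )
        , m_Dispatch( std::move( dispatch ) )
    {
    }

    // Feed raw key-down timestamps in; two hits inside the window raise the
    // device-agnostic command through the dispatch callback.
    void OnKeyDown( std::chrono::steady_clock::time_point now )
    {
        if( m_Armed && ( now - m_Previous ) <= m_Window )
        {
            m_Dispatch( m_Command );
            m_Armed = false; // consume the pair
        }
        else
        {
            m_Previous = now;
            m_Armed = true;
        }
    }

private:
    std::string m_Command;
    std::chrono::milliseconds m_Window;
    std::function<void( const std::string& )> m_Dispatch;
    std::chrono::steady_clock::time_point m_Previous{};
    bool m_Armed = false;
};

The handler bound to the command never learns whether it was a double tap, a gesture, or something else entirely that raised it, which is exactly the decoupling described above.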