Programming interface: Cockpit HUD
Suggestion:
Allow programmable blocks to draw onto a seated player's HUD, just like we can currently draw on LCDs.
Implementation:
We can currently call block.GetSurface( <surface number> ) to get a graphics surface onto which we can draw almost anything. I suggest implementing this in a similar way for cockpits, in the form of
cockpit.GetHudSurface()
which would return a surface allowing drawing directly onto the HUD of a player seated in the referenced cockpit.
Currently, for LCDs, you create a custom texture and render it onto the LCD. Do the same here, but overlay the texture onto the player's HUD before any interface components are drawn, to prevent covering important elements. When the player exits the seat/cockpit, clear/delete this texture.
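For illustration, here is a minimal sketch of how this could look from a programmable block script, assuming the proposed GetHudSurface() returns the same IMyTextSurface interface that cockpit LCD surfaces expose today. GetHudSurface() is the hypothetical method suggested above; everything else mirrors the existing LCD sprite-drawing API, and the cockpit name is just an example:

// Inside the programmable block's Main()
IMyCockpit cockpit = GridTerminalSystem.GetBlockWithName("Cockpit") as IMyCockpit; // example block name
if (cockpit == null) return;

// Today: draw onto one of the cockpit's built-in LCD surfaces.
IMyTextSurface lcd = cockpit.GetSurface(0);
lcd.ContentType = ContentType.SCRIPT; // enable sprite drawing
MySpriteDrawFrame frame = lcd.DrawFrame();
MySprite text = MySprite.CreateText("Speed: 97 m/s", "Debug", Color.White, 1f, TextAlignment.CENTER);
text.Position = lcd.TextureSize * 0.5f; // centre of the texture
frame.Add(text);
frame.Dispose();

// Proposed: the exact same pattern, but the surface is the seated player's HUD.
// (GetHudSurface() does not exist yet -- it is the suggested addition.)
// IMyTextSurface hud = cockpit.GetHudSurface();
// MySpriteDrawFrame hudFrame = hud.DrawFrame(); ... hudFrame.Dispose();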
Benefits:
This would allow things such as predictive targeting reticles, navigational helpers, cargo indicators, speed warnings (like displaying a calculated deceleration time based on mass and thruster force), comms, presence indicators using the raycast system, and so, so much more.
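To make the deceleration example concrete: the stopping time is just current speed divided by the braking acceleration the thrusters can provide, t = v / (F / m). A rough sketch of that calculation, assuming a list named brakingThrusters already holds the thrusters firing against the current velocity (the names here are purely illustrative):

double speed = cockpit.GetShipSpeed();                  // m/s
double mass  = cockpit.CalculateShipMass().TotalMass;   // kg, including cargo
double brakingForce = 0;                                // N
foreach (IMyThrust t in brakingThrusters)
    brakingForce += t.MaxEffectiveThrust;               // respects atmosphere/power limits

// t = v / a, where a = F / m
double stopTime = brakingForce > 0 ? speed / (brakingForce / mass) : double.PositiveInfinity;
// e.g. draw $"Stop in {stopTime:F1} s" as a text sprite on the HUD surface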
It would be completely optional: without a script running in a programmable block, or with in-game scripts turned off, it wouldn't impact the player at all.
It cannot be used maliciously, as the player can simply exit the cockpit as normal, and it should be fairly simple to implement unless additional features are wanted.