
Inspire Python API - user interaction with mouse to get coordinates for sketching and modeling?

User: "briantrease"
Altair Community Member

Hello! I'd like to create some custom sketching and geometry tools in Inspire. Does the Inspire API have any features that would allow me to create scripts/extensions that incorporate user input via mouse clicks on a sketch/plane? I've been unable to find a method that will collect the coordinates of a clicked point on the plane. An example test case I'm working on is to simply replicate the Sketch > Line tool. Thanks!

(I did play with onMousePress(), but it seems to only collect data in screen coordinates, not sketch coordinates.)


    Hi Brian,

    Yes, it is possible to create custom sketching tools in Altair Inspire that respond to user input via mouse clicks on a sketch plane. While the onMousePress( ) method by default captures screen coordinates, you can use Inspire's API—specifically the getPointOnPlane( ) or getPointOnXyPlane( ) methods from the hwx.inspire.gui.Manipulator class—to convert those clicks into model-space coordinates on a defined plane.

    To replicate something like the Sketch > Line tool, you would create a custom manipulator class that tracks mouse presses, translates them into 3D points on the sketch plane, and then uses Inspire's sketching API to draw geometry such as line segments between two clicked points. This allows for interactive geometry creation directly from user input, leveraging the full functionality of Inspire's scripting interface.

    Specifically, methods like:

    • getPointOnPlane(event, origin, normal, snapping=True)
    • getPointOnXyPlane(event, xform, snapping=True)

    The key is calling self.getPointOnPlane(event, origin, normal) inside onMousePress( ) — that will give you click coordinates in model space, not just screen-space pixels.

    I am adding a rough outline of the Python code showing how to use it for a “Sketch → Line” tool.

    You can create a custom manipulator class and override onMousePress( ) to capture the clicked point, convert it, and build your tool logic.
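    A minimal sketch of that idea follows. It assumes the hwx.inspire.gui.Manipulator base class and the getPointOnPlane(event, origin, normal, snapping=True) signature quoted above; the _create_line helper is a hypothetical stand-in for whatever geometry-creation call your Inspire version provides, so check the API reference before using it.

    ```python
    # Hedged sketch: Manipulator and getPointOnPlane are taken from the
    # description above; everything else is illustrative and may differ
    # in your Inspire version.
    from hwx.inspire.gui import Manipulator


    class LineSketchTool(Manipulator):
        """Collects two clicks on a plane and draws a line between them."""

        def __init__(self, origin=(0, 0, 0), normal=(0, 0, 1)):
            super().__init__()
            self.origin = origin   # a point on the sketch plane
            self.normal = normal   # the plane normal
            self.points = []       # clicked points, in model space

        def onMousePress(self, event):
            # Convert the screen-space click into a model-space point
            # lying on the sketch plane (with snapping enabled).
            point = self.getPointOnPlane(
                event, self.origin, self.normal, snapping=True)
            if point is None:
                return  # click missed the plane
            self.points.append(point)
            if len(self.points) == 2:
                start, end = self.points
                self._create_line(start, end)
                self.points.clear()  # ready for the next line

        def _create_line(self, start, end):
            # Hypothetical placeholder: replace with the sketch/geometry
            # creation call from your Inspire version's scripting API.
            pass
    ```

    The design mirrors the built-in tool: each press accumulates one model-space point, and once two points are collected a segment is created and the buffer resets, so repeated clicks chain into successive lines.
    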

    Regards,

    Sourav
