Automator PostProcess Script to launch another process
I've got a project in Automator that extracts data from a flat file into a SQL table for record keeping. The process runs on a file monitor, since multiple files get dropped throughout the day that need to be appended to the table. I have another project that exports data from the table to a spreadsheet, which is then sent to other teams. The problem I'm having is that I cannot set up file monitoring for both projects in the same standard process: because the report project is an export from a SQL table, there is no input distribution option. So what I would like to do is set up a standard process for each project. The first runs on a file monitor and picks up files as they're dropped; when it finishes, the second is launched by a PostProcess script. I don't really know VB.NET, so I'm not sure how to write a script that does this, nor how to set up the references and imports, or the global declarations, to get the script to work.
I was given this old script by a coworker but we've been unable to get it to work:
Dim PumpAPI As DwchServer.PumpAPI
Dim strTrackingID As String
Dim strProcess(0) As String
Dim strCurrent As String
PumpAPI = New DwchServer.PumpAPI("default")
strProcess(0) = "PROCESS NAME"
For Each strCurrent In strProcess
    Try
        strTrackingID = PumpAPI.StartProcess(strCurrent)
    Catch ex As Exception
        Log.AddEvent("Process: " + strCurrent + " Status: Failed to start - " + ex.Message)
    End Try
Next strCurrent
I tried searching around the community here, and every post I found related to my issue had links to other threads that have since been removed. I've already reached out to Altair for assistance with this issue, and they offered help only at a price, so they might remove these types of posts since they're trying to profit off these kinds of answers, but hopefully I can get the help I need! Thanks in advance for any assistance.
-Max
------------------------------
Max Strahan
SQL Developer
------------------------------
Best Answer
-
Max,
Max_20905 said:
Mo,
The script almost works. The first process runs fine, but doesn't kick off the second process and I get this error:
Error in RunSP method. Could not load file or assembly 'Datawatch.Core.Logging, Version=14.0.0.87, Culture=neutral, PublicKeyToken=47f651eeea82774c' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)
------------------------------
Max Strahan
SQL Developer
------------------------------
-------------------------------------------
Original Message:
Sent: 11-22-2019 11:52 AM
From: Mahmoud Abdolrahim
Subject: Automator PostProcess Script to launch another process
Max,
I compiled the script and there was no error.
Add the following script to the first project's post-export script. Replace the InstanceName, ConfigServer, and ProcessName variables with valid values.
Dim msg As String = ""
Dim ok As Boolean = True
Dim InstanceName As String = "DefaultInstance"
Dim ConfigServer As String = "net.tcp://WIN201264DC3:808"
Dim ProcessName As String = "YourSecondProcessName"
Try
    If ExportCompleted Then
        ok = RunSP(ProcessName, msg, InstanceName, ConfigServer)
        If ok = False Then Log.AddEvent("Error in RunSP method. " & msg)
    Else
        Log.AddEvent("Error...Export to SQL database table failed")
    End If
Catch ex As Exception
    Log.AddEvent("Error in Post-Export script. " & ex.Message)
End Try
Add the following script to the first project's post-export script in the global declarations tab. If you intend to use it in other Standard Processes, you may add it to the global script's global declarations tab instead.
Function RunSP(ByVal ProcessName As String, ByRef msg As String, Optional ByVal InstanceName As String = "DefaultInstance", Optional ByVal ConfigServer As String = "net.tcp://WIN201264DC3:808") As Boolean
    Dim oPump As PumpAPI = Nothing
    Dim ok As Boolean = True
    Try
        oPump = New PumpAPI(InstanceName, ConfigServer)
        oPump.StartProcess(ProcessName)
    Catch ex As Exception
        ok = False
        msg = ex.Message
    End Try
    If Not IsNothing(oPump) Then oPump = Nothing
    Return ok
End Function
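If you ever need the post-export script to kick off more than one follow-up process, the same helper can be reused in a loop. This is just a sketch; the process names below are placeholders, not real processes in your environment:
Dim followUps() As String = {"SecondProcessName", "ThirdProcessName"}
Dim loopMsg As String = ""
For Each procName As String In followUps
    ' RunSP returns False and fills loopMsg if the process could not be started
    If Not RunSP(procName, loopMsg, InstanceName, ConfigServer) Then
        Log.AddEvent("Error starting " & procName & ": " & loopMsg)
    End If
Next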
Locate the Monarch Server Automator installation package. In the \Installer Advanced 64-bit\Tools folder, un-compress all files from the Datawatch.DataPump.PumpAPI.7z file into a folder on the Automator server, and then reference the PumpAPI in the SP global script references/imports tab (per screenshot below). I also assumed you installed the x64 version of the Automator. If you installed the x86 version, you need to get the PumpAPI files from the \Installer Advanced 86-bit\Tools folder instead.
Note that in my environment, I un-compressed all files into the C:\Datawatch\Datawatch.DataPump.PumpAPI_14.3 folder. It's a good idea to add System.IO to the imports too.
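As a rough sketch of what the imports tab ends up containing (the DwchServer namespace here is only taken from the older script in the question, not confirmed, so check it against the PumpAPI assembly you extracted):
Imports DwchServer   ' assumed namespace of the PumpAPI type; confirm against your DLL
Imports System.IO    ' per the note above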
Regards
Mo
------------------------------
Mahmoud Abdolrahim
Senior Implementation & Integration Engineer
Datawatch Corporation
MA
(978) 935-3840
------------------------------
Original Message:
Sent: 11-22-2019 11:15 AM
From: Max Strahan
Subject: Automator PostProcess Script to launch another process
I'm going to give the visual process a shot because I haven't used it yet and would like to learn about it. Also, for the sake of learning, I'm curious how the scripting would work out too; if you don't mind showing that, it would be great.
------------------------------
Max Strahan
SQL Developer
Original Message:
Sent: 11-22-2019 11:00 AM
From: Mahmoud Abdolrahim
Subject: Automator PostProcess Script to launch another process
Hi Max,
I did not realize that with a SQL database table as input, the input distribution tab will not be displayed. Well, that can be a problem.
You may migrate the first Standard Process (SP), which uploads data to a SQL database table, to a Visual Process (VP).
Make sure you disable the monitoring in the SP, because the migrated VP will also be monitoring for the flat file.
In the VP, add a Run Standard Process object to the canvas and connect the export-to-SQL-table object to the Run SP object (per screenshot below).
Or you can migrate both SPs to VPs. In the first VP, which should be a monitoring process, add a Run VP object to the canvas and connect the export-to-SQL-table object to the Run VP object (per screenshot below).
Of course, you can add a script to the first SP's post-export script that would launch the second SP, but that requires more work and writing a script.
Let me know if you still want to go with a script and I will provide one.
Regards
Mo
------------------------------
Mahmoud Abdolrahim
Senior Implementation & Integration Engineer
Datawatch Corporation
MA
(978) 935-3840
Original Message:
Sent: 11-22-2019 10:10 AM
From: Max Strahan
Subject: Automator PostProcess Script to launch another process
Hi Baba, thanks again! I'll take a look at this option as well and see how it works out for what I'm trying to accomplish. I appreciate the material so I can learn more about this system.
------------------------------
Max Strahan
SQL Developer
Original Message:
Sent: 11-22-2019 05:45 AM
From: Baba Majekodunmi
Subject: Automator PostProcess Script to launch another process
Hi @Max Strahan,
You'll find below a link to the video here in the community. I did this assuming it's a straightforward model where you're just extracting the data. What I forgot to ask is what 'data transformations' are being done in SQL.
For example, if you're exporting the data into SQL and then joining that data with other tables in SQL before you finally export the spreadsheet, then I would highly recommend using Data Prep Studio, because we can do it all in one process.
The process is by and large the same. If the video below is sufficient, then great. If you'd like to see how we can incorporate the transformations you're doing in SQL into a single visual process, we can do that too. It wouldn't take long to create that video; it would just be an additional view of this one below, like a part 2 :-)
https://community.datawatch.com/viewdocument/monitored-visual-prcoess-example#ItemCommentPanel
------------------------------
Baba Majekodunmi
Solutions Consultant
Altair Engineering Inc.
Manassas VA
978-275-9325
-------------------------------------------
One reason that you are getting this error is a version mismatch.
For example, if the Automator version is 15.3 but you un-compressed PumpAPI version 14, you will get this error.
Make sure that you use the same PumpAPI package that is shipped with your Automator software package.
The DwchServer.PumpAPI.dll version in the folder that you un-compressed should match the DLLs in the C:\Program Files\Datawatch\Datawatch\Agent folder (please see screenshots below).
Also ensure that all references are correct (per the previous screenshot that I sent you earlier).
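If it helps with the comparison, a small stand-alone VB.NET sketch like the one below will list the assembly versions of the DLLs in both folders so they can be compared side by side. The two folder paths are only the examples used in this thread, so adjust them to your installation:
Imports System
Imports System.IO
Imports System.Reflection
Module CheckPumpApiVersions
    Sub Main()
        ' Example folders from this thread; change to your extracted PumpAPI folder and the Agent folder.
        Dim folders() As String = {"C:\Datawatch\Datawatch.DataPump.PumpAPI_14.3", "C:\Program Files\Datawatch\Datawatch\Agent"}
        For Each folder As String In folders
            Console.WriteLine(folder)
            For Each dll As String In Directory.GetFiles(folder, "*.dll")
                Try
                    ' Reads the manifest version without loading the assembly into the process.
                    Dim asmName As AssemblyName = AssemblyName.GetAssemblyName(dll)
                    Console.WriteLine("  {0}  {1}", Path.GetFileName(dll), asmName.Version)
                Catch ex As BadImageFormatException
                    Console.WriteLine("  {0}  (not a managed assembly)", Path.GetFileName(dll))
                End Try
            Next
        Next
    End Sub
End Module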
Regards
Mo
------------------------------
Mahmoud Abdolrahim
Senior Implementation & Integration Engineer
Datawatch Corporation
MA
(978) 935-3840
------------------------------
Answers
-
Hi Max,
This is a good question. To recap, I believe you're doing the following:
1. The first Standard Process monitors a folder and loads data into a SQL table.
2. The second Standard Process connects to the SQL table and exports a spreadsheet to be sent to multiple end users.
So far, is this correct? If so, I have a question about the spreadsheet: is it the same spreadsheet sent to all users, or are there multiple spreadsheets sent to distinct users?
One option is to use a single Visual Process that combines the two processes. It can be set to kick off when the files arrive, update the SQL table, and then email the spreadsheet. However, another question I have is whether you want the users emailed every single time there's an update, or only once the SQL table has had all the appropriate updates for the day.
Lastly, if the end users have access to the SQL database, they could just connect to it in Data Prep Studio, SQL, or Excel and see it updated frequently.
@Mahmoud Abdolrahim Mo, what do you think?
------------------------------
Baba Majekodunmi
Solutions Consultant
Altair Engineering Inc.
Manassas VA
978-275-9325
------------------------------
-
Hi Baba,
Thanks for the response. You are correct on everything for the most part so far. The only difference is that the spreadsheet is actually getting dropped multiple times throughout the day to a share drive. In other words, each time the file monitor picks up the flat file and stores it in a SQL table for record keeping and some data transformation, a new spreadsheet will be created by pulling from the SQL table and then dropped on the share drive.
In this case, these end users will only have access to the share drive folder where the spreadsheets will be dropped. That is how they requested to receive the report.
Thanks again,
Max
------------------------------
Max Strahan
SQL Developer
------------------------------
-
Hi Max,
Okay, so they are ultimately being emailed a spreadsheet each day once the source file comes in? So multiple spreadsheets each time?
If that's the case, then this would be much easier as a Visual Process. Here are a couple of options:
1. We can hop on a quick WebEx (actually GoToMeeting) and I can walk you through it.
2. I can upload a video demonstration here in the community and tag you in it.
3. Or you can give it a shot and post any further questions here in the community.
Either way, I think I will do No. 2.
------------------------------
Baba Majekodunmi
Solutions Consultant
Altair Engineering Inc.
Manassas VA
978-275-9325
------------------------------
-
No email is involved. I like the video demonstration idea so I can watch and then give it a shot myself, since that's how I learn best. I appreciate your assistance!
------------------------------
Max Strahan
SQL Developer
------------------------------
-
Awesome. I will try and get it done this evening or tomorrow morning.
I will tag you once it's completed.
------------------------------
Baba Majekodunmi
Solutions Consultant
Altair Engineering Inc.
Manassas VA
978-275-9325
------------------------------
As Baba suggested, you can migrate your standard processes into one Visual Process.
That being said, you can still use your existing monitoring process.
Add both projects to the monitoring process.
The first project exports data from the flat files to a SQL table.
The second project exports data from the SQL table (populated by the first project) to an Excel file and places it on the shared drive.
The trick is to configure the first project to move, rename, or delete the input file in its input distribution.
In the second project's input tab, select the "not required" option (under file grouping) and add a Log input distribution. The Log input distribution writes a message to the server event log. This lets you configure the process as a monitoring process that only watches for the arrival of the flat files.
No need to add a post-export script.
Regards
Mo
------------------------------
Mahmoud Abdolrahim
Senior Implementation & Integration Engineer
Datawatch Corporation
MA
(978) 935-3840
------------------------------
Hi @Max Strahan,
You'll find below a link to the video here in the community. I did this assuming it's a straightforward model where you're just extracting the data. What I forgot to ask is what 'data transformations' are being done in SQL.
For example, if you're exporting the data into SQL and then joining it with other tables in SQL before you finally export the spreadsheet, then I would highly recommend using Data Prep Studio, because we can do it all in one process.
The process is by and large the same. If the video below is sufficient then great. If you'd like to see how we can incorporate the transformations you're doing in SQL into a single Visual Process, then we can do that too. It wouldn't take long to create that video; it would just be an additional view of this one below, like a part 2 :-)
https://community.datawatch.com/viewdocument/monitored-visual-prcoess-example#ItemCommentPanel
------------------------------
Baba Majekodunmi
Solutions Consultant
Altair Engineering Inc.
Manassas VA
978-275-9325
------------------------------
Hi @Mahmoud Abdolrahim, thanks for the response. I've already tried setting it up as you mentioned, and that's where the original problem I've been having came up. I set it up as a standard process with two projects on it: the first being the export from file to SQL table, and the second being SQL table to spreadsheet. When I turn file monitoring on, it says I need an input distribution set up, so I set one up for the first project, but there is no option to set an input distribution for the second one since its input is from a SQL table; the tab just isn't there. That's when I got the idea of splitting it into two standard processes: the first runs on file monitor to pick up the files, and when it finishes, a post-process script launches the second process to create the spreadsheet and drop it on the share drive.
------------------------------
Max Strahan
SQL Developer
------------------------------
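For reference, here is a cleaned-up sketch of the post-process approach described above, based on the PumpAPI snippet quoted earlier in this thread. The instance name and process name are placeholders, and it assumes the DwchServer assembly is added to the script's references so that it compiles:
' Start a second Automator process from a post-process script.
' "default" and the process name below are placeholders - replace them with
' your Automator instance name and the exact name of the second standard process.
Dim pumpApi As DwchServer.PumpAPI = New DwchServer.PumpAPI("default")
Dim processNames() As String = New String() {"SECOND PROCESS NAME"}
Dim trackingId As String

For Each processName As String In processNames
    Try
        ' StartProcess launches the named process and returns a tracking ID
        trackingId = pumpApi.StartProcess(processName)
        Log.AddEvent("Process: " & processName & " Status: Started - Tracking ID " & trackingId)
    Catch ex As Exception
        ' Record the failure in the Automator event log instead of failing silently
        Log.AddEvent("Process: " & processName & " Status: Failed to start - " & ex.Message)
    End Try
Next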
Hi Baba, thanks again! I'll take a look at this option as well and see how it works out for what I'm trying to accomplish. I appreciate the material so I can learn more about this system.
------------------------------
Max Strahan
SQL Developer
------------------------------
Hi Max,
I was given this old script by a coworker but we've been unable to get it to work:Dim PumpAPI As DwchServer.PumpAPI
Dim strTrackingID As String
Dim strProcess(0) as string
Dim strCurrent as string
PumpAPI = New DwchServer.PumpAPI("default")
strProcess(0) = "PROCESS NAME"
for each strCurrent in strProcess
Try
strTrackingID = PumpAPI.StartProcess(strCurrent)
Catch ex As Exception
Log.AddEvent("Process: " + strCurrent + " Status: Failed to start - " + ex.Message)
end try
next strcurrent
I tried searching around the community here, and every post I found related to my issue had links to other threads that have since been removed. I've already reached out to Altair for assistance with this issue and they offered assistance at a price so they might remove these types of posts since they're trying to profit off these types of answers but hopefully I can get the help I need!
Thanks in advance for any assistance.
-Max
------------------------------
Max Strahan
SQL Developer
------------------------------"
I did not realize that when the input is a SQL database table, the Input Distribution tab is not displayed. That can indeed be a problem.
You may migrate the first Standard Process (SP), the one that uploads data to the SQL database table, to a Visual Process (VP).
Make sure you disable monitoring in the SP, because the migrated VP will also be monitoring for the flat file.
In the VP, add a Run Standard Process object to the canvas and connect the Export to SQL Table object to the Run SP object (per screenshot below).
Or you can migrate both SPs to VPs. In the first VP, which should be the monitoring process, add a Run Visual Process object to the canvas and connect the Export to SQL Table object to the Run VP object (per screenshot below).
Of course, you can also add a script to the first SP's post-export script that launches the second SP, but that requires more work and writing code.
Let me know if you still want to go the scripting route and I will provide a script.
Regards
Mo
------------------------------
Mahmoud Abdolrahim
Senior Implementation & Integration Engineer
Datawatch Corporation
MA
(978) 935-3840
------------------------------
I'm going to give the Visual Process a shot because I haven't used it yet and would like to learn about it. Also, for the sake of learning, I'm curious how the scripting would work out too; if you don't mind showing it, that would be great.
------------------------------
Max Strahan
SQL Developer
------------------------------
Max,
I compiled the script and there were no errors.
Add the following script to the first project's post-export script. Replace the InstanceName, ConfigServer, and ProcessName variables with values valid for your environment.
' Post-export script: launches the second Standard Process once the export to the SQL table has finished.
' Replace InstanceName, ConfigServer, and ProcessName with values valid for your environment.
Dim msg As String = ""
Dim ok As Boolean = True
Dim InstanceName As String = "DefaultInstance"
Dim ConfigServer As String = "net.tcp://WIN201264DC3:808"
Dim ProcessName As String = "YourSecondProcessName"
Try
    If ExportCompleted Then
        ' RunSP is defined in the Global Declarations (see below).
        ok = RunSP(ProcessName, msg, InstanceName, ConfigServer)
        If ok = False Then Log.AddEvent("Error in RunSP method. " & msg)
    Else
        Log.AddEvent("Error...Export to SQL database table failed")
    End If
Catch ex As Exception
    Log.AddEvent("Error in Post-Export script. " & ex.Message)
End Try
Add the following function to the first project's post-export script, in the Global Declarations tab. If you intend to use it in other Standard Processes, you may add it to the global script's Global Declarations tab instead.
' RunSP: starts another Standard Process through the PumpAPI.
' Returns True on success; on failure returns False and passes the error text back through msg.
Function RunSP(ByVal ProcessName As String, ByRef msg As String, _
               Optional ByVal InstanceName As String = "DefaultInstance", _
               Optional ByVal ConfigServer As String = "net.tcp://WIN201264DC3:808") As Boolean
    Dim oPump As PumpAPI = Nothing
    Dim ok As Boolean = True
    Try
        oPump = New PumpAPI(InstanceName, ConfigServer)
        oPump.StartProcess(ProcessName)
    Catch ex As Exception
        ok = False
        msg = ex.Message
    End Try
    If Not IsNothing(oPump) Then oPump = Nothing
    Return ok
End Function
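If the post-export script ever needs to start more than one downstream process, the way the original script from your coworker attempted, the same RunSP helper can simply be called in a loop. Here is a minimal sketch of that idea, assuming the RunSP function above plus the InstanceName, ConfigServer, and Log items already available in the post-export script; the process names are placeholders only:
' Hypothetical example: start several Standard Processes in sequence using the RunSP helper.
' The process names below are placeholders; replace them with your own process names.
Dim followOnProcesses() As String = {"YourSecondProcessName", "YourThirdProcessName"}
Dim errMsg As String = ""
For Each procName As String In followOnProcesses
    If Not RunSP(procName, errMsg, InstanceName, ConfigServer) Then
        Log.AddEvent("Process: " & procName & " Status: Failed to start - " & errMsg)
    End If
Next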
Locate the Monarch Server Automator installation package. In the \Installer Advanced 64-bit\Tools folder, un-compress all files from the Datawatch.DataPump.PumpAPI.7z archive into a folder on the Automator server, and then add a reference to the PumpAPI in the SP global script's References, Imports tab (per screenshot below). I have assumed you installed the x64 version of Automator; if you installed the x86 version, take the PumpAPI files from the \Installer Advanced 86-bit\Tools folder instead.
Note that in my environment I un-compressed all files into the C:\Datawatch\Datawatch.DataPump.PumpAPI_14.3 folder. It's a good idea to add System.IO to the namespaces too.
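For reference, here is a rough sketch of what the References, Imports tab might then contain. The DLL file name is only an assumption based on the archive name; reference whichever assemblies the 7z archive actually extracts on your server:
' Assembly reference (assumed file name; adjust to the folder where you un-compressed the PumpAPI files):
'   C:\Datawatch\Datawatch.DataPump.PumpAPI_14.3\Datawatch.DataPump.PumpAPI.dll
' Namespace imports:
Imports System.IO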
Regards
Mo
------------------------------
Mahmoud Abdolrahim
Senior Implementation & Integration Engineer
Datawatch Corporation
MA
(978) 935-3840
------------------------------
Mo,
-------------------------------------------
Original Message:
Sent: 11-22-2019 11:15 AM
From: Max Strahan
Subject: Automator PostProcess Script to launch another process
I'm going to give the visual process a shot because I haven't used it yet and would like to learn about it. Also, for the sake of learning, I am curious how the scripting would work out too if you don't mind showing, that would be great.
------------------------------
Max Strahan
SQL Developer
------------------------------
Original Message:
Sent: 11-22-2019 11:00 AM
From: Mahmoud Abdolrahim
Subject: Automator PostProcess Script to launch another process
Hi Max,
I did not realize that having a SQL database table as input, the input distribution tab will not be displayed. Well that can be a problem.
You may migrate the first Standard Process (SP), which uploads data to a SQL database table to a Visual Process (VP).
make sure you disable the monitoring in the SP, because the migrated VP is also monitoring for the flat file.
In the VP add run standard process object to the canvas and connect the export to SQL table object to the run SP object (per screenshot below).
Or you can migrate both SPs to VPs. In the first VP, which should be a monitoring process, add run VP object to the canvas and connect the export to SQL table object to the run VP object (per screenshot below).
Of course you can add a script in the first SP post-export script that would launch the second SP. But that requires more work and writing script.
Let me know if you still want to go with script and I will provide you a script.
Regards
Mo
------------------------------
Mahmoud Abdolrahim
Senior Implementation & Integration Engineer
Datawatch Corporation
MA
(978) 935-3840
Max,
Max_20905 said:
Mo,
The script almost works. The first process runs fine, but doesn't kick off the second process and I get this error:
Error in RunSP method. Could not load file or assembly 'Datawatch.Core.Logging, Version=14.0.0.87, Culture=neutral, PublicKeyToken=47f651eeea82774c' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)
------------------------------
Max Strahan
SQL Developer
------------------------------
One reason you are getting this error is a version mismatch.
For example, if the Automator version is 15.3 but you un-compressed PumpAPI version 14, you will get this error.
Make sure you use the PumpAPI archive that is shipped with your Automator software package.
The version of DwchServer.PumpAPI.dll in the folder you un-compressed should match the DLLs in the C:\Program Files\Datawatch\Datawatch\Agent folder (please see screenshots below).
Also ensure that all references are correct (per the previous screenshot I sent you earlier).
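For a quick check of whether the two folders actually line up, a minimal standalone VB.NET sketch along these lines can compare the file version of every DLL in the un-compressed PumpAPI folder against the copy in the agent folder. Both paths below are only examples, so adjust them to your environment:
Imports System
Imports System.IO
Imports System.Diagnostics
Module CheckPumpApiVersions
    Sub Main()
        ' Folder where Datawatch.DataPump.PumpAPI.7z was un-compressed (example path - adjust)
        Dim pumpApiFolder As String = "C:\Datawatch\Datawatch.DataPump.PumpAPI"
        ' Automator agent folder mentioned above (adjust if your install differs)
        Dim agentFolder As String = "C:\Program Files\Datawatch\Datawatch\Agent"
        ' Compare the file version of each PumpAPI DLL against the version the agent uses
        For Each dllPath As String In Directory.GetFiles(pumpApiFolder, "*.dll")
            Dim fileName As String = Path.GetFileName(dllPath)
            Dim agentPath As String = Path.Combine(agentFolder, fileName)
            Dim pumpVersion As String = FileVersionInfo.GetVersionInfo(dllPath).FileVersion
            If File.Exists(agentPath) Then
                Dim agentVersion As String = FileVersionInfo.GetVersionInfo(agentPath).FileVersion
                Dim status As String = If(pumpVersion = agentVersion, "OK", "MISMATCH")
                Console.WriteLine("{0}: {1} vs {2} -> {3}", fileName, pumpVersion, agentVersion, status)
            Else
                Console.WriteLine("{0}: {1} (no matching DLL in the agent folder)", fileName, pumpVersion)
            End If
        Next
    End Sub
End Module
Any DLL reported as MISMATCH is a candidate for the error above; re-extract the PumpAPI archive that ships with your Automator version and update the script references before trying again.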
Regards
Mo
------------------------------
Mahmoud Abdolrahim
Senior Implementation & Integration Engineer
Datawatch Corporation
MA
(978) 935-3840
------------------------------
I'll re-unzip the files and give it another shot. Will let you know how it turns out, thanks!
------------------------------
Max Strahan
SQL Developer
------------------------------
-------------------------------------------
Original Message:
Sent: 11-22-2019 04:18 PM
From: Mahmoud Abdolrahim
Subject: Automator PostProcess Script to launch another process
Max,
One reason that you are getting this error is, its version mismatch.
For example if the Automator version is 15.3 but you un-compress Pumpapi version 14, then you will get this error.
Make sure that you use the same Pumpapi zip file that is shipped with the Automator software package.
If you check the DwchServer.PumpAPI.dll version in the folder that you un-compressed should match any DLLs in the c:\program files\Datawatch\Datawatch\agent folder (please see screenshots below)
Also ensure that all references are correct (per my pervious screenshot that I sent you earlier).
Regards
Mo
------------------------------
Mahmoud Abdolrahim
Senior Implementation & Integration Engineer
Datawatch Corporation
MA
(978) 935-3840
------------------------------
Original Message:
Sent: 11-22-2019 03:36 PM
From: Max Strahan
Subject: Automator PostProcess Script to launch another process
Mo,
The script almost works. The first process runs fine, but doesn't kick off the second process and I get this error:
Error in RunSP method. Could not load file or assembly 'Datawatch.Core.Logging, Version=14.0.0.87, Culture=neutral, PublicKeyToken=47f651eeea82774c' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)
------------------------------
Max Strahan
SQL Developer
Original Message:
Sent: 11-22-2019 11:52 AM
From: Mahmoud Abdolrahim
Subject: Automator PostProcess Script to launch another process
Max,
I compiled the script and there was no error.
Add the following script to the first project post-export script. Replace the instancename, configserver, and processname variables with valid values.
Dim msg As String = ""
Dim ok As Boolean = true
Dim InstanceName As String ="DefaultInstance"
Dim ConfigServer As string = "net.tcp://WIN201264DC3:808"
Dim Processname As string = "YourSecondProcessName"
Try
If Exportcompleted then
ok = RunSP(ProcessName, msg, InstanceName, ConfigServer)
If ok=false then log.addevent("Error in RunSP method. " & msg)
else
log.addevent("Error...Export to sql database table failed")
End If
catch ex As Exception
log.addevent("Error in Post-Export script. " & ex.message)
End Try
Add the following script in the first project post-export script in the global declarations tab. If you intend you use the following script in other Std Processes then you may add it in the global script's global declarations ta.
Function RunSP(byVal ProcessName As String, byRef msg As String, Optional byVal InstanceName As String = "DefaultInstance", Optional ByVal ConfigServer As String = "net.tcp://WIN201264DC3:808") As Boolean
Dim oPump As PumpAPI = Nothing
Dim ok As Boolean = True
Try
oPump = New PumpAPI(InstanceName, ConfigServer)
oPump.StartProcess(ProcessName)
catch ex as exception
ok = False
msg = ex.message
End Try
If Not IsNothing(oPump) Then oPump = Nothing
Return ok
End Function
Locate the Monarch Server Automator installation package. in the \Installer Advanced 64-bit\Tools folder un-compress all files from Datawatch.DataPump.PumpAPI.7z file in to a folder on the Automator server, and then make reference to the PumpAPI in the SP global script references, imports tab (per screenshot below). I also assumed you installed x64 version of the Automator. If you installed x86 version of the Automator then you need to get the pumpapi files from \Installer Advanced 86-bit\Tools folder.
Note that in my environment, I un-compressed all files in C:\Datawatch\Datawatch.DataPump.PumpAPI_14.3 folder. Its a good idea to add system.io to the namespace too.
Regards
Mo
------------------------------
Mahmoud Abdolrahim
Senior Implementation & Integration Engineer
Datawatch Corporation
MA
(978) 935-3840
Original Message:
Sent: 11-22-2019 11:15 AM
From: Max Strahan
Subject: Automator PostProcess Script to launch another process
I'm going to give the visual process a shot because I haven't used it yet and would like to learn about it. Also, for the sake of learning, I am curious how the scripting would work out too if you don't mind showing, that would be great.
------------------------------
Max Strahan
SQL Developer
Original Message:
Sent: 11-22-2019 11:00 AM
From: Mahmoud Abdolrahim
Subject: Automator PostProcess Script to launch another process
Hi Max,
I did not realize that, with a SQL database table as input, the Input Distribution tab is not displayed. That can be a problem.
You may migrate the first Standard Process (SP), which uploads data to a SQL database table, to a Visual Process (VP).
Make sure you disable the monitoring in the SP, because the migrated VP will also be monitoring for the flat file.
In the VP, add a Run Standard Process object to the canvas and connect the Export to SQL Table object to the Run SP object (per screenshot below).
Or you can migrate both SPs to VPs. In the first VP, which should be a monitoring process, add a Run Visual Process object to the canvas and connect the Export to SQL Table object to the Run VP object (per screenshot below).
Of course, you can add a script to the first SP's post-export script that launches the second SP, but that requires more work and writing a script.
Let me know if you still want to go with a script and I will provide one.
Regards
Mo
------------------------------
Mahmoud Abdolrahim
Senior Implementation & Integration Engineer
Datawatch Corporation
MA
(978) 935-3840
Original Message:
Sent: 11-22-2019 10:10 AM
From: Max Strahan
Subject: Automator PostProcess Script to launch another process
Hi Baba, thanks again! I'll take a look at this option as well and see how it works out for what I'm trying to accomplish. I appreciate the material so I can learn more about this system.
------------------------------
Max Strahan
SQL Developer
Original Message:
Sent: 11-22-2019 05:45 AM
From: Baba Majekodunmi
Subject: Automator PostProcess Script to launch another process
Hi @Max Strahan,
You'll find below a link to the video here in the community. I did this assuming it's a straightforward model where you're just extracting the data. What I forgot to ask is what 'data transformations' are being done in SQL.
For example, if you're exporting the data into SQL and then joining it with other tables in SQL before you finally export the spreadsheet, then I would highly recommend using Data Prep Studio, because we can do it all in one process.
The process is by and large the same. If the video below is sufficient, then great. If you'd like to see how we can incorporate the transformations you're doing in SQL into a single visual process in one fell swoop, we can do that too. It wouldn't take long to create that video; it would just be an additional view of the one below, like a part 2 :-)
https://community.datawatch.com/viewdocument/monitored-visual-prcoess-example#ItemCommentPanel
------------------------------
Baba Majekodunmi
Solutions Consultant
Altair Engineering Inc.
Manassas VA
978-275-9325
Original Message:
Sent: 11-21-2019 03:40 PM
From: Max Strahan
Subject: Automator PostProcess Script to launch another process
No email is involved. I like the video demonstration idea so I can watch and then give it a shot myself since that's how I learn best. I appreciate your assistance!
------------------------------
Max Strahan
SQL Developer
Original Message:
Sent: 11-21-2019 03:36 PM
From: Baba Majekodunmi
Subject: Automator PostProcess Script to launch another process
Hi Max,
Okay, so they are ultimately being emailed a spreadsheet each day once the source file comes in? So multiple spreadsheets each time?
If that's the case, then this would be much easier as a Visual Process. Here are a couple of options:
1. We can hop on a quick WebEx (actually a GoToMeeting) and I can walk you through it.
2. I can upload a video demonstration here in the community and tag you in it.
3. Or you can give it a shot and post any further questions here in the community.
Either way, I think I will do number 2.
------------------------------
Baba Majekodunmi
Solutions Consultant
Altair Engineering Inc.
Manassas VA
978-275-9325
Original Message:
Sent: 11-21-2019 02:53 PM
From: Max Strahan
Subject: Automator PostProcess Script to launch another process
Hi Baba,
Thanks for the response. You are correct on everything for the most part so far. The only difference being that the spreadsheet is actually getting dropped multiple times throughout the day to a share drive. In other words, each time the file monitor picks up the flat file and stores it in a SQL table for records and some data transformation, a new spreadsheet will be created by pulling from the SQL table then dropped on the share drive.
In this case, these end users will only have access to the share drive folder where the spreadsheets will be dropped. That is how they requested to receive the report.
Thanks again,
Max
------------------------------
Max Strahan
SQL Developer
Original Message:
Sent: 11-21-2019 02:43 PM
From: Baba Majekodunmi
Subject: Automator PostProcess Script to launch another process
Hi Max,
This is a good question you have here. So to recap I believe you're doing the following;
1. First Standard Process Monitors a folder and loads data into a SQL Table
2. Second Standard Process connects to the SQL Table and exports a spreadsheet to be sent to multiple end users
So far, is this correct? If so, I have a question about the spreadsheet: is it the same spreadsheet sent to all users, or are there multiple spreadsheets sent to distinct users?
One option is to use a single Visual Process that combines the two processes. It can be set to kick off when the files arrive, update the SQL table, and then email the spreadsheet. However, another question I have is: do you want the users emailed every single time there's an update, or do you want them emailed once the SQL table has had all the appropriate updates for the day?
Lastly, if the end users have access to the SQL database, they could just connect to it in Data Prep Studio, SQL, or Excel and see it updated frequently.
@Mahmoud Abdolrahim Mo, what do you think?
------------------------------
Baba Majekodunmi
Solutions Consultant
Altair Engineering Inc.
Manassas VA
978-275-9325
Original Message:
Sent: 11-21-2019 09:46 AM
From: Max Strahan
Subject: Automator PostProcess Script to launch another process
Hi there, I've been having a bit of trouble trying to figure this issue out so figured I'd ask around here to see if anyone can help.
I've got a project in Automator that extracts data from a flat file into a SQL table for record keeping. The process will run on file monitor since multiple files get dropped throughout the day that need to be appended to the table. I have another project that exports data from the table to a spreadsheet that is then sent to other teams. The problem I'm having is, I cannot setup file monitoring with both projects on the same standard process, since the report file project is an export from a SQL table, there is no input distribution option. So, what I would like to do is setup a standard process for each project. The first runs on file monitor and picks up any files when they're dropped and when it finishes, the second is launched by a PostProcess script. I don't really know VB.NET so not even sure how to formulate a script that does this nor do I know how to setup the references and imports, or global declarations to get the script to work.
I was given this old script by a coworker but we've been unable to get it to work:
Dim PumpAPI As DwchServer.PumpAPI
Dim strTrackingID As String
Dim strProcess(0) as string
Dim strCurrent as string
PumpAPI = New DwchServer.PumpAPI("default")
strProcess(0) = "PROCESS NAME"
for each strCurrent in strProcess
Try
strTrackingID = PumpAPI.StartProcess(strCurrent)
Catch ex As Exception
Log.AddEvent("Process: " + strCurrent + " Status: Failed to start - " + ex.Message)
end try
next strcurrent
I tried searching around the community here, and every post I found related to my issue had links to other threads that have since been removed. I've already reached out to Altair for assistance with this issue and they offered assistance at a price so they might remove these types of posts since they're trying to profit off these types of answers but hopefully I can get the help I need!
Thanks in advance for any assistance.
-Max
------------------------------
Max Strahan
SQL Developer
------------------------------
Max,
Max_20905 said: I'll re-unzip the files and give it another shot. Will let you know how it turns out, thanks!
Also, right-click the zip file and select Properties; on the General tab, if the file is marked as blocked, use the Unblock option before un-compressing the files.
I tested the PumpAPI on my local Automator v15.3 and it worked fine. So if it does not work for you, I suggest you try the visual process.
Another option would be to engage Altair professional services to assist you.
Regards
Mo
------------------------------
Mahmoud Abdolrahim
Senior Implementation & Integration Engineer
Datawatch Corporation
MA
(978) 935-3840
------------------------------
Got it working! Thanks again for your help, much appreciated.
------------------------------
Max Strahan
SQL Developer
------------------------------
-------------------------------------------
Original Message:
Sent: 11-25-2019 08:51 AM
From: Mahmoud Abdolrahim
Subject: Automator PostProcess Script to launch another process
Max,
Also right click on the zip file select properties, on the general tab if the unblock is checked, uncheck it before uncompressing files.
I tested the PumpApi on my local Automator v15.3 and worked fine. So if it does now work for you, I suggest you try the visual process.
Another option would be to engage the Altair prof services to assist you.
Regards
Mo
------------------------------
Mahmoud Abdolrahim
Senior Implementation & Integration Engineer
Datawatch Corporation
MA
(978) 935-3840
------------------------------
Original Message:
Sent: 11-25-2019 08:43 AM
From: Max Strahan
Subject: Automator PostProcess Script to launch another process
I'll re-unzip the files and give it another shot. Will let you know how it turns out, thanks!
------------------------------
Max Strahan
SQL Developer
Original Message:
Sent: 11-22-2019 04:18 PM
From: Mahmoud Abdolrahim
Subject: Automator PostProcess Script to launch another process
Max,
One reason that you are getting this error is, its version mismatch.
For example if the Automator version is 15.3 but you un-compress Pumpapi version 14, then you will get this error.
Make sure that you use the same Pumpapi zip file that is shipped with the Automator software package.
If you check the DwchServer.PumpAPI.dll version in the folder that you un-compressed should match any DLLs in the c:\program files\Datawatch\Datawatch\agent folder (please see screenshots below)
Also ensure that all references are correct (per my pervious screenshot that I sent you earlier).
Regards
Mo
------------------------------
Mahmoud Abdolrahim
Senior Implementation & Integration Engineer
Datawatch Corporation
MA
(978) 935-3840
Original Message:
Sent: 11-22-2019 03:36 PM
From: Max Strahan
Subject: Automator PostProcess Script to launch another process
Mo,
The script almost works. The first process runs fine, but doesn't kick off the second process and I get this error:
Error in RunSP method. Could not load file or assembly 'Datawatch.Core.Logging, Version=14.0.0.87, Culture=neutral, PublicKeyToken=47f651eeea82774c' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)
------------------------------
Max Strahan
SQL Developer
Original Message:
Sent: 11-22-2019 11:52 AM
From: Mahmoud Abdolrahim
Subject: Automator PostProcess Script to launch another process
Max,
I compiled the script and there was no error.
Add the following script to the first project post-export script. Replace the instancename, configserver, and processname variables with valid values.
Dim msg As String = ""
Dim ok As Boolean = true
Dim InstanceName As String ="DefaultInstance"
Dim ConfigServer As string = "net.tcp://WIN201264DC3:808"
Dim Processname As string = "YourSecondProcessName"
Try
If Exportcompleted then
ok = RunSP(ProcessName, msg, InstanceName, ConfigServer)
If ok=false then log.addevent("Error in RunSP method. " & msg)
else
log.addevent("Error...Export to sql database table failed")
End If
catch ex As Exception
log.addevent("Error in Post-Export script. " & ex.message)
End Try
Add the following script in the first project post-export script in the global declarations tab. If you intend you use the following script in other Std Processes then you may add it in the global script's global declarations ta.
Function RunSP(byVal ProcessName As String, byRef msg As String, Optional byVal InstanceName As String = "DefaultInstance", Optional ByVal ConfigServer As String = "net.tcp://WIN201264DC3:808") As Boolean
Dim oPump As PumpAPI = Nothing
Dim ok As Boolean = True
Try
oPump = New PumpAPI(InstanceName, ConfigServer)
oPump.StartProcess(ProcessName)
catch ex as exception
ok = False
msg = ex.message
End Try
If Not IsNothing(oPump) Then oPump = Nothing
Return ok
End Function
Locate the Monarch Server Automator installation package. in the \Installer Advanced 64-bit\Tools folder un-compress all files from Datawatch.DataPump.PumpAPI.7z file in to a folder on the Automator server, and then make reference to the PumpAPI in the SP global script references, imports tab (per screenshot below). I also assumed you installed x64 version of the Automator. If you installed x86 version of the Automator then you need to get the pumpapi files from \Installer Advanced 86-bit\Tools folder.
Note that in my environment, I un-compressed all files in C:\Datawatch\Datawatch.DataPump.PumpAPI_14.3 folder. Its a good idea to add system.io to the namespace too.
Regards
Mo
------------------------------
Mahmoud Abdolrahim
Senior Implementation & Integration Engineer
Datawatch Corporation
MA
(978) 935-3840
Original Message:
Sent: 11-22-2019 11:15 AM
From: Max Strahan
Subject: Automator PostProcess Script to launch another process
I'm going to give the visual process a shot because I haven't used it yet and would like to learn about it. Also, for the sake of learning, I am curious how the scripting would work out too if you don't mind showing, that would be great.
------------------------------
Max Strahan
SQL Developer
Original Message:
Sent: 11-22-2019 11:00 AM
From: Mahmoud Abdolrahim
Subject: Automator PostProcess Script to launch another process
Hi Max,
I did not realize that having a SQL database table as input, the input distribution tab will not be displayed. Well that can be a problem.
You may migrate the first Standard Process (SP), which uploads data to a SQL database table to a Visual Process (VP).
make sure you disable the monitoring in the SP, because the migrated VP is also monitoring for the flat file.
In the VP add run standard process object to the canvas and connect the export to SQL table object to the run SP object (per screenshot below).
Or you can migrate both SPs to VPs. In the first VP, which should be a monitoring process, add run VP object to the canvas and connect the export to SQL table object to the run VP object (per screenshot below).
Of course you can add a script in the first SP post-export script that would launch the second SP. But that requires more work and writing script.
Let me know if you still want to go with script and I will provide you a script.
Regards
Mo
------------------------------
Mahmoud Abdolrahim
Senior Implementation & Integration Engineer
Datawatch Corporation
MA
(978) 935-3840
Original Message:
Sent: 11-22-2019 10:10 AM
From: Max Strahan
Subject: Automator PostProcess Script to launch another process
Hi Baba, thanks again! I'll take a look at this option as well and see how it works out for what I'm trying to accomplish. I appreciate the material so I can learn more about this system.
------------------------------
Max Strahan
SQL Developer
Original Message:
Sent: 11-22-2019 05:45 AM
From: Baba Majekodunmi
Subject: Automator PostProcess Script to launch another process
Hi @Max Strahan,
You'll find below a link to the video here in the community. I did this assuming it's a straight forward model where you're just extracting the data. What I forgot to ask is what 'data transformations' are being done in SQL.
For example if you're exporting the data into SQL to then joining that data with other tables in SQL before you finally export the spreadsheet, then I would highly recommend using the Data Prep Studio because we can do it all in one process.
The process is by and large the same. If the video below is sufficient then great. If you'd like to see how we can incorporate the transformations you're doing in SQL in one-fell-swoop visual process then we can do that too. Wouldn't take long to create that video. It would just be an additional view of this one below, like a part 2 :-)
https://community.datawatch.com/viewdocument/monitored-visual-prcoess-example#ItemCommentPanel
------------------------------
Baba Majekodunmi
Solutions Consultant
Altair Engineering Inc.
Manassas VA
978-275-9325
Original Message:
Sent: 11-21-2019 03:40 PM
From: Max Strahan
Subject: Automator PostProcess Script to launch another process
No email is involved. I like the video demonstration idea so I can watch and then give it a shot myself since that's how I learn best. I appreciate your assistance!
------------------------------
Max Strahan
SQL Developer
Original Message:
Sent: 11-21-2019 03:36 PM
From: Baba Majekodunmi
Subject: Automator PostProcess Script to launch another process
Hi Max,
Okay, so they are ultimately being emailed a spreadsheet each day once the source file comes in? So multiple spreadsheets each time?
If that's the case, then this would be much easier as a Visual Process. Here are a couple of options:
1. We can hop on a quick WebEx (actually GoToMeeting) and I can walk you through it.
2. I can upload a video demonstration here in the community and tag you in it.
3. Or you can give it a shot and post any further questions here in the community.
Either way, I think I will do No. 2.
------------------------------
Baba Majekodunmi
Solutions Consultant
Altair Engineering Inc.
Manassas VA
978-275-9325
Original Message:
Sent: 11-21-2019 02:53 PM
From: Max Strahan
Subject: Automator PostProcess Script to launch another process
Hi Baba,
Thanks for the response. You are correct on everything for the most part so far. The only difference is that the spreadsheet actually gets dropped to a share drive multiple times throughout the day. In other words, each time the file monitor picks up the flat file and stores it in a SQL table for record keeping and some data transformation, a new spreadsheet is created by pulling from the SQL table and then dropped on the share drive.
In this case, these end users will only have access to the share drive folder where the spreadsheets will be dropped. That is how they requested to receive the report.
Thanks again,
Max
------------------------------
Max Strahan
SQL Developer
Original Message:
Sent: 11-21-2019 02:43 PM
From: Baba Majekodunmi
Subject: Automator PostProcess Script to launch another process
Hi Max,
This is a good question. So, to recap, I believe you're doing the following:
1. The first Standard Process monitors a folder and loads data into a SQL table
2. The second Standard Process connects to the SQL table and exports a spreadsheet to be sent to multiple end users
Is this correct so far? If so, I have a question about the spreadsheet: is the same spreadsheet sent to all users, or are there multiple spreadsheets sent to distinct users?
One option is to use a single Visual Process that combines the two processes. It can be set to kick off when the files arrive, update the SQL table, and then email the spreadsheet. However, another question I have is: do you want the users emailed every single time there's an update, or do you want them emailed once the SQL table has had all the appropriate updates for the day?
Lastly, if the end users have access to the SQL database, they could just connect to it in Data Prep Studio, SQL, or Excel and see it updated frequently.
@Mahmoud Abdolrahim Mo, what do you think?
------------------------------
Baba Majekodunmi
Solutions Consultant
Altair Engineering Inc.
Manassas VA
978-275-9325
Original Message:
Sent: 11-21-2019 09:46 AM
From: Max Strahan
Subject: Automator PostProcess Script to launch another process
Hi there, I've been having a bit of trouble trying to figure this issue out so figured I'd ask around here to see if anyone can help.
I've got a project in Automator that extracts data from a flat file into a SQL table for record keeping. The process will run on file monitor since multiple files get dropped throughout the day that need to be appended to the table. I have another project that exports data from the table to a spreadsheet that is then sent to other teams. The problem I'm having is that I cannot set up file monitoring with both projects in the same standard process: since the report file project is an export from a SQL table, there is no input distribution option. So, what I would like to do is set up a standard process for each project. The first runs on file monitor and picks up any files when they're dropped, and when it finishes, the second is launched by a PostProcess script. I don't really know VB.NET, so I'm not even sure how to formulate a script that does this, nor do I know how to set up the references and imports, or global declarations, to get the script to work.
I was given this old script by a coworker but we've been unable to get it to work:

Dim PumpAPI As DwchServer.PumpAPI
Dim strTrackingID As String
Dim strProcess(0) As String
Dim strCurrent As String

' Connect to the Automator instance named "default"
PumpAPI = New DwchServer.PumpAPI("default")

' Name of the process to start
strProcess(0) = "PROCESS NAME"

For Each strCurrent In strProcess
    Try
        ' Start the named process and capture its tracking ID
        strTrackingID = PumpAPI.StartProcess(strCurrent)
    Catch ex As Exception
        Log.AddEvent("Process: " & strCurrent & " Status: Failed to start - " & ex.Message)
    End Try
Next strCurrent
I tried searching around the community here, and every post I found related to my issue linked to other threads that have since been removed. I've already reached out to Altair for assistance with this issue, and they offered help at a price, so they might remove these types of posts since they're trying to profit off these answers, but hopefully I can get the help I need!
Thanks in advance for any assistance.
-Max
------------------------------
Max Strahan
SQL Developer
------------------------------"0