Streamline the Day-to-Day with IT Automation

IT teams have a long list of responsibilities, from desktop tasks and batch jobs to systems infrastructure and everything in between. Tedious tasks like data entry, file transfers, and event log monitoring can get in the way of more critical work. But with IT automation solutions, you can empower your team to complete more work in less time and centralize your IT team’s entire schedule.

In this on-demand webinar, the automation experts from Fortra and CM First explore ways to leverage IT automation tools in your most critical business projects, including those on IBM i and those generated with CA 2E (Synon) and CA Plex. Our automation experts walk you through the types of projects your IT colleagues are running, how they’re doing it, and what tools they’re using.

In this on-demand webinar, we’ll cover:

  • Ideas for IT processes to automate, including report generation, application integration, overnight processes, automatic incident resolution, manual operation checks, and many more!
  • The differences between and uses for automation technologies like robotic process automation, job scheduling, and managed file transfer.
  • Use cases of real projects from IT teams around the world.
  • Tips and best practices to keep in mind for a successful automation project.

Watch now to learn how automation can transform your tedious IT processes to optimize resources, reduce errors, and save valuable time.

Video Transcript

Joan Blanch:                       Okay, let's start the webinar. Good morning, good afternoon. Thanks for joining this webinar with us today. The topic for this webinar today is Streamline the Day-to-Day with IT Automation. It's a webinar that I personally love. You will see it. It has a lot, a lot, a lot of content. Also, we have tried to include a few short demos because it's an IT webinar. In IT, we all agree that we prefer demos, seeing the software and the technologies, rather than too much PowerPoint. So we have tried to do that.

                                                 Basically, before starting... We are leaving one minute for the people that are connecting right now. Let me start with a real story. I'd say it like this. It's a guy that said, "I have automated most of my job. Now, I only need one hour per week to really do what I'm supposed to do, to cover my job, to meet my goals. I just need to work one hour per week." He's asking himself, "Should I tell my boss?"

                                                 The funny thing is that this is a story I got from a magazine, but I found at least three cases very similar to this in real companies while working with them. Basically, this kind of situation will be happening more and more today with several of the new technologies that we have for automating.

                                                 Today, we are going to see one of these technologies, the one that was being used by this guy. Something funny; maybe it happens to some of you someday. Let's start with the introductions. Myself, I'm Joan Blanch. My title is Pre-Sales Program Manager.

                                                 I'm working basically with customers, with companies, helping them define automation strategies: defining processes and projects, analyzing processes, and deciding which automation solution is the best for them. This is my day-to-day work. Today, we have Roger Hammer. Roger, how are you doing?

Roger Hammer:                Yes, thank you. I'm Roger Hammer, I'm the Director of Software Development & Services for CM First Group. I manage both our product development and also our services and I lead many of those services projects as a project manager and also do quite a bit with the Automate product we'll talk about today. Thank you.

Joan Blanch:                       Fantastic. Thank you, Roger. I'm happy to have you here. A very quick introduction to our companies. Myself, I'm working at HelpSystems. As you will tell from my accent, I'm located in Barcelona, in Spain, working in one of the HelpSystems offices in Europe. Basically, what we do at HelpSystems is develop and create software solutions in two big areas: automation and cybersecurity.

                                                 We are growing a lot in both. We have a very, very interesting and powerful portfolio today. This webinar will be very focused on the automation part, and we will be showing the benefits, the use cases and the technologies. The different solutions here are robotic process automation, workload automation and managed file transfer. We will be talking a little bit about these today.

                                                 Also, for you to know, we are very, very strong and growing in cybersecurity, with different solutions for data classification, data loss prevention, identity management and more that are of course outside the scope of this webinar today. Roger, if you want to introduce a little bit of what you do...

Roger Hammer:                Sure, CM First Group is headquartered in Austin, Texas. We have a number of partners including Broadcom/CA. We'll talk a little bit about CA Plex and CA 2E today, which are key products many of our customers use. We're also partnered with IBM and of course HelpSystems. We have a number of customers over the last 10, 11 years that we've been working with to do both modernization and also services to help with day-to-day development, and of course automation using the Automate RPA product.

Joan Blanch:                       Fantastic. This is, more or less, the agenda for today. There is a lot of content that we are going to cover today. The objective for Roger and myself is to give you a high-level overview of the different technologies, but focusing on benefits, on use cases, on real scenarios.

                                                 Also, during this webinar we will do four demos: three will be live and another will be a quick video, so we can also see the technology and it's not only a PowerPoint that we see today. Before starting, the agenda will cover workload automation, managed file transfer, and RPA, robotic process automation. Then Roger will explain some use cases of RPA for CA 2E and CA Plex automation.

                                                 Before starting, give me one second. I will send a poll. The idea is to share the results on whether your company today uses any of the technologies that we are going to see. Thanks for participating in this poll. The idea is to share a little bit the degree of adoption of these technologies in companies today. Let's see if the result is what we expect.

                                                I see the partial results. It's more or less what I expect. Thank you very much for participating. I will close the poll and share the results. Thank you very much. I'm sharing, so you should see. I see that screen very small. Give me one second.

                                                 Number one, workload automation, is not a surprise. It's not a surprise. Workload automation is the technology of the three that we are going to see today that has been the most years on the market. It's been on the market since the '80s, if I'm not wrong. Also, the first product that we started developing at HelpSystems was a job scheduler, so we know this very well.

                                                 Then RPA is maybe the latest one to arrive on the market, so it's still in a process of adoption. Very interesting. Thank you very much for participating in this. Now, I will hide the poll. Thank you very much. Let's start with workload automation. For those who do not know what workload automation is, remember that today this is a very, very high-level overview of the technology and what it is. We are not going to enter into loads of detail today.

                                                 But I want to remind you, we will leave some minutes at the end of this webinar so you can ask your questions. We will try to respond to as many as we can during five or 10 minutes at the end. Feel free to write your questions during the webinar and we will cover those at the end.

                                                 So, workload automation, also known as enterprise job scheduling. Basically, it is a technology, an application, that controls the unattended execution of other programs. It has several names in the market. It's better known as a scheduler. "Do you have a scheduler?" is maybe its best-known name. But it has different names: workload automation, enterprise job scheduling, scheduler and so on.

                                                 Basically, this is the idea. If you see the background of this slide, it basically represents the chaos. Sometimes when I share this slide with a customer, with a company, they typically say, "No, no, our processes are more complex than what you have in this slide." This is a little bit the idea.

                                                 IT has changed a lot during the last 10, 15, 20 years. Where we had monolithic systems, very big mainframes, IBM i systems, today everything is distributed. We have lots of complexity, but at the end, we have lots of processes that are multi-platform and multi-application, with information that we have to move.

                                                 A job scheduler like JAMS is software that aims to control all of this complexity. Basically, what are the benefits of implementing a job scheduler? What companies are looking for is to simplify this job volume management instead of having cron tasks, scheduled tasks in Windows, and millions of scripts across multiple systems. Normally a job scheduler makes a lot of sense when a company has hundreds or thousands of different jobs in different systems and different applications that need to be executed.

                                                 If you don't have a central piece of software that is controlling and orchestrating all these executions, it's very, very complex to maintain, to detect an error, to react, to fix a problem soon. Basically, our customers are looking to simplify this management, decrease the number of errors, detect problems as soon as possible and even automate the solutions to those problems.

                                                 If we try to explain that with a simple example, let's take this process: a very typical sales consolidation process. Imagine we have several branches selling things that are generating several files daily, and we have to run a process that starts with generating these files. These files have to be sent, received, consolidated and integrated with the ERP; then we run a process to make a stock calculation, send this information to the warehouse, run an ETL and send information to the data warehouse, generate reporting and so on.

                                                 A very typical process, very simplified. It's a very simplified process, but of course what we typically have in processes like this is deadlines. Look at this 7:00 right here. Normally, what we expect, or someone from the business will expect, is that this process finishes before our deadline because we need this information to do something, for instance, the next day.

                                                 If we take a closer look at this process and try to understand how these processes are managed today, we will find different pieces, different problems. One is that most of these small things that we have to run are scripts, small programs, custom developments: Python, PowerShell, shell scripts, whatever. With scripting, we have the problem that it is hard to maintain. If something fails, it's very hard to know that a specific script failed and what the problem was.

                                                 At the end, maybe we realize that we don't have the information in the warehouse application, but to diagnose where something failed, we have to take a look at everything. That's one thing: scripting, non-centralized scripting. On the other side, we have dependencies and time constraints. Probably we have different pieces, different programs, different executions that have to happen in a certain order.

                                                 There are dependencies because one job will produce files or information that will be picked up by the next one. Typically, for a customer that is not using a job scheduler, the way those dependencies are managed is time based, creating rules and saying, "Okay, I assume these files will be ready at 2:00 AM."

                                                 I will run the next process at 3:00 AM just in case it takes a little bit more, and this one I will run at 4:00 AM just in case, but this is absolutely not efficient. Basically, with a job scheduler, companies look to also optimize the process. We optimize the process because it is monitored. We have everything centralized. Any problem that happens during the execution, we see it, we look at it, we cover it.
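To make that idea concrete, here is a minimal Python sketch, not tied to JAMS or any specific scheduler, of triggering the next step when its input file actually arrives instead of padding the schedule with fixed times; the file path and the downstream command are hypothetical.

    import subprocess
    import time
    from pathlib import Path

    INPUT_FILE = Path("/data/incoming/sales_branches.csv")  # hypothetical upstream output

    def wait_for_file(path, poll_seconds=30, timeout_seconds=4 * 3600):
        """Poll until the upstream file exists, instead of assuming it is ready at a fixed hour."""
        deadline = time.monotonic() + timeout_seconds
        while time.monotonic() < deadline:
            if path.exists():
                return True
            time.sleep(poll_seconds)
        return False

    if wait_for_file(INPUT_FILE):
        # Start the dependent step immediately rather than waiting for a padded 3:00 AM slot.
        subprocess.run(["python", "consolidate_sales.py", str(INPUT_FILE)], check=True)
    else:
        raise SystemExit("Upstream file never arrived; alert the operations team.")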

                                                 We retry automatically, but it also gives a lot of visibility into the processes. It helps a lot to analyze what's happening. If we put every single execution of every process that is happening in our company, our core processes, into a scheduler, at the end what we have is visibility on everything that is happening in our company. We can optimize those processes, for instance, understand which parts can be executed in parallel and which parts we are waiting on.

                                                 There is no need to wait. The idea is that after putting in a scheduler, it's very easy to understand what's happening and make the process more efficient, quicker to execute. We have lots of customers where this is very, very sensitive, very critical, especially the ones that are running heavy nightly batch processes with deadlines.

                                                 I will explain a real example of that. This is a success story. It's a company selling consumer loans. I cannot give the name of course, but it's a consumer loans company that has more than 180 stores in malls and shopping centers from where they give instant loans to consumers like me, if I want to buy a big new TV for my home.

                                                 Basically, they had a very, very critical nightly batch process that consisted of updating all the loan prices for all the stores and distributing all those prices to all the stores. It's a super critical batch process because it's updating these prices, and the business cannot open the stores or give loans without updated prices.

                                                 Basically, a failure of the process results in stores not being able to open. This is very typical. What was the project trigger in this case? Basically, they had an incident in that process that delayed its execution by just three hours. It might not seem a lot, but the process was delayed three hours. That meant all 180 stores were not able to open and sell for three hours because a batch process failed during the night.

                                                 The reason was a manual problem. It was a batch process that was not fully automated in IT operations. The IT operator had to do several manual tasks during the night and he made a mistake. We are human. We make mistakes. That manual mistake made the process fail. Then it's very typical that when a critical problem like this happens, it's the business requiring IT, in this case, to automate and control the process.

                                                 Normally, and this is sad, but it's still true in IT today, sometimes to get investment in a technology we need a big problem to happen, and then the company realizes how important it is to have this under control. That was the case in this company. Then it was the business saying, "Hey, we have to automate and make this process very strong. This cannot fail. It's critical to our business."

                                                 The return on investment of this project is avoiding another problem. You don't want this process to fail. For the company this was very, very important. The cost of one day's problem in this batch is bigger than the whole cost of implementing a scheduler for them. In this case, we don't have the ROI calculated as a number because it's very difficult. What they are looking for is to avoid another problem. Of course, they are also saving a lot of time on manual tasks that were done by operators.

                                                 Time for the first demo. This is not a full demo. I will open JAMS. JAMS is one of our job schedulers. I will do a very, very light, high-level demo just to show you a little bit, so you can get a feel for it and understand. I think it's better to see something working, even if it's very basic, than just a PowerPoint. This is the idea, but don't take it as a full demo. I will just scratch the surface, less than 1% of the features of what we can do in here.

                                                 This is how it looks. This is the central console where the IT operations team will be monitoring, looking at the executions, this list. It's what we call the monitor. It's basically every execution that is happening in real time. Everything that was executed is here. We have centralized visibility on everything that is being executed on multiple systems, on hundreds of systems.

                                                 The way this works, the first thing we have to do, normally, is to deploy an agent. We install an agent on each operating system. Of course, we provide agents for most of the platforms that we might need. We have lots of different types of agents for the different platforms. In here, we have agents for Windows, Linux, AS/400, z/OS, AIX, UNIX and so on.

                                                 We deploy an agent, we install a local agent in that remote system, or we can do remote executions also, depending on the technology. That would be the number one thing that we do. The next thing we will do is define which jobs we want to run. Today, I will show you something very, very basic. I have a billing batch process example where we have defined several jobs.

                                                 Let me show you very quickly. If I create a job... Again, here we will have hundreds of types of jobs that we can run. This is what we call execution methods. These are some examples. All these can be expanded. We can add more execution methods, but just for you to see very quickly the typical execution methods that we will execute: a command, an operating system command, a batch file, a shell script, file transfers, AS/400, ODBC, SQL Agent and so on.

                                                 These are all easy to script. We can run PowerShell, UNIX scripts. There are a number of technologies that we can interact with when we create a job, but in this case, I will just show you a job that is already created, which is very basic. It's just doing a ping. It's doing nothing, a very simple example.

                                                 What this job will be doing is just executing a script, a command. A job has a number of properties, starting with scheduling. If I want to run this job on a schedule or after another job, we can define any kind of complex logic to map the different dependencies and executions. Then we can create, for instance, what we call a sequence, which is a graphical representation of the dependencies between the different jobs that we want to run.

                                                 This is a very simple example of a sequence of jobs. Some of the executions will be in parallel. Some of them will be sequential. What I will do right now is launch this. I will submit this sequence. It's going to be executed in real time. If I go to the monitor, you will see the first line is running. If I double click this, we can see lots of details. We have the log file. We can refresh in real time.

                                                 I can see the sequence, which will show us real time progress. You will see green boxes, which means a job has executed or is being executed. This is updated in real time. Now it's executing, still finishing the calculate data job. Now it's on the calculating job. Then it will finish. Also, in real time, we can update the logs and have everything centralized.

                                                 That was a very, very light demo, so at least you can see the look and feel of what we do in the scheduler. We define these jobs, orchestrate these jobs, launch them and have everything controlled from a central console.
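The sequence shown in the demo mixes parallel and sequential steps. Outside of any scheduler, the same ordering idea can be sketched in a few lines of Python; the job names and scripts below are hypothetical stand-ins, and a real scheduler adds the calendars, retries and monitoring around this.

    from concurrent.futures import ThreadPoolExecutor
    import subprocess

    def run(job_name, command):
        """Run one job and fail loudly, the way a scheduler would flag it in the monitor."""
        print(f"starting {job_name}")
        subprocess.run(command, check=True)
        print(f"finished {job_name}")

    # Step 1: a job that must finish before anything else starts.
    run("extract_branch_files", ["python", "extract_branch_files.py"])

    # Step 2: two independent jobs that can run in parallel.
    with ThreadPoolExecutor(max_workers=2) as pool:
        futures = [
            pool.submit(run, "calculate_stock", ["python", "calculate_stock.py"]),
            pool.submit(run, "load_data_warehouse", ["python", "load_dw.py"]),
        ]
        for future in futures:
            future.result()  # re-raise any failure so the sequence stops here

    # Step 3: reporting runs only after both parallel jobs succeed.
    run("generate_reports", ["python", "generate_reports.py"])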

                                                 Okay, another quick question. Let me launch the poll. I will share the results and see whether you think your company is very automated or not very automated. We'll wait a few seconds for your responses. I see a lot of people replying. Very good. One of the options is at zero percent at the moment.

                                                 I will close the poll. Thanks for participating. Let me share the results so you can see. Most of you consider that your company is in a normal position: you are automating processes right now. At the end, we have someone that said it's very low. Most of you think your company is normal, so you have started automating. You have probably started implementing several technologies or improving processes, so you are more and more automated.

                                                 Just 18% of the participants in the poll consider that their company is very highly automated. There is still a lot of room to improve. Thanks again for participating. For the next one, we are going to see a little bit the same idea, a very quick overview of what managed file transfer is. Again, I'll do a very, very quick demo of it.

                                                 What is managed file transfer? It's another automation technology. The name itself gives us a clue about what it will do. If we take the file transfer approach historically, file transfer means sharing files with others. It can be inside our company or with external companies of course. The typical approach to file transfers has been batch: basically scripts, programs, FTPs and so on. Let's say unattended batch, machine-to-machine exchanges of information.

                                                 Then we have users. As people, we normally use email or cloud platforms to exchange files with others. This is the standard, typical file transfer approach. What is managed file transfer providing on top of that? Managed file transfer is basically this file transfer with added layers of automation, encryption, security, auditing and compliance.

                                                 Basically, a solution has to provide all five of these things to be considered a managed file transfer solution. Those would be the critical things. So, which are the main use cases? What are companies looking to solve with a managed file transfer solution?

                                                 This, in my opinion, is the number one use case. It's very similar to what we have seen with scheduling: it's orchestrating the processes, but normally processes that are very, very focused on files and information exchange. For most of our customers on managed file transfer, the starting point was that they had custom, legacy systems that they developed in house for years.

                                                 That is what is being used to exchange information, files. Basically, there are some challenges with those systems today. One of them is how to guarantee, for instance, file delivery. We want to be sure that a file has been delivered. Sometimes I explain a case from a bank where the trigger of the project was that they lost one file. It was a file to execute a SWIFT transaction to move four million euros from one account to another account. It was just one file, but this file was lost because the systems were so complex.

                                                 This file had to be moved across 15 different systems to execute the transaction, and it was lost. It was just one file, but that meant four million euros not moving. When companies cannot afford to lose one single file, they normally need a managed file transfer solution.

                                                 Basically, orchestrate processes, let's say replace that custom development with something standardized and reduce support costs. But it's not only moving files. One may think of managed file transfer solutions as a solution to do FTPs or copy files. Yes, of course, managed file transfer solutions do this, but it's not only this. With a managed file transfer solution, we can automate any kind of information workflow.

                                                 That can be of course file transfers, FTP, SFTP, FTPS, but also EDI, AS2, AS3; transforming and manipulating XML and JSON; connecting with web services and executing REST and SOAP web service calls; databases; and any cloud platform: OneDrive, SharePoint, Azure Blob, AWS S3 and so on.

                                                 With the different solutions, and mostly any managed file transfer solution on the market today, we have multiple connectors. It's not only moving one file from one place to another or doing FTPs, which was kind of the typical approach to file transfer, but connecting with any hybrid platform that we have, on prem, in the cloud and so on.

                                                 That would be the second most common use case for implementing managed file transfer. With managed file transfer, we also find something that I like. Sometimes the ones that want to implement managed file transfer are IT. It's normally the IT operations department, the ones that spend lots of hours fixing, diagnosing and improving processes and making sure that the files are delivered.

                                                 But 50% of the time, it's the security department that needs to improve and needs to be compliant with some regulation. We have lots and lots of cases where a managed file transfer solution has been implemented by security and IT operations together, because it was solving problems for both departments. This is a very important use case for a managed file transfer solution: to be able to encrypt the data at any point, and to have super powerful auditing information and reporting on everything. This is very, very important.

                                                 More use cases for MFT? Another one would be if we need to create a way for our partners to exchange information with us: kind of an SFTP server, an HTTPS endpoint, a web service, a portal, so a partner or a customer can send files to us or we can send files to them. A very important use case also.

                                                 End-to-end security is also very important in a managed file transfer solution. It's not only encryption, of course, but also antivirus, data loss prevention, analyzing the data, assuring data integrity. We want to be sure that a file that is being sent from here to another continent is, when it's received, the same file, and that no one could access that file.
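As a tiny illustration of the data-integrity part, here is a sketch, not tied to any specific MFT product, of verifying that a received file matches a SHA-256 checksum published by the sender; the file names are hypothetical and the checksum file is assumed to contain just the hex digest.

    import hashlib
    from pathlib import Path

    def sha256_of(path):
        """Hash the file in chunks so large transfers don't need to fit in memory."""
        digest = hashlib.sha256()
        with path.open("rb") as handle:
            for chunk in iter(lambda: handle.read(1024 * 1024), b""):
                digest.update(chunk)
        return digest.hexdigest()

    received = Path("incoming/settlement_20240501.csv")          # hypothetical received file
    expected = Path("incoming/settlement_20240501.csv.sha256")   # checksum published by the sender

    if sha256_of(received) == expected.read_text().strip():
        print("Integrity check passed: the file arrived unmodified.")
    else:
        raise SystemExit("Checksum mismatch: quarantine the file and alert operations.")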

                                                 Also very, very important: a compliant architecture. With MFT solutions we can deploy an architecture that complies with and meets any security requirement. For sure we have super powerful architectures to comply with any security requirement, because it's one of the critical things in an MFT solution.

                                                 Another use case is different from the rest that we have seen, but a very important one in some sectors like media, for instance: accelerating the exchange of huge volumes of data. If we need to exchange terabytes of information between continents, we probably need something that allows us to reduce the time needed to exchange these files.

                                                 A file that with FTP maybe needs two hours, we have protocols with which maybe we can send the same file within one minute: two hours versus one minute. A very specific use case, but very important in some verticals.

                                                 The last use case would be user-to-user exchange of information. As people, we need to send files and share files with other people. The way we normally do it, unless we have another way, is email or cloud platforms, where we can upload and share information, but lots of companies are very worried, very concerned about that. They need to provide a very secure way for users to exchange information between them. This is something that we can also provide within these solutions. Roger?

Roger Hammer:                Yes, thanks. We wanted to share an example from a customer that we worked with here on managed file transfer. Let's say that you have a process to send transactions to a trading partner. Maybe it's a report of daily transactions. This might originate on the IBM i, from application data, or perhaps from a reporting package like Crystal Reports, as you see there.

                                                 It could be of course a combination of the above. Rather than a human sending a report manually, the managed file transfer can initiate that, encrypt it and securely send those files to the trading partner. You can also automate the exception process to a large degree, which is what we've done here in this customer case.

                                                 We monitor and parse the process from the partner. When there's an error, a message is sent via Slack or Teams so it can be immediately attended to. Then of course it can also create a help desk ticket. This particular scenario is very much paying off for our customer and really saving them significant amounts of time. The humans in the process are really only dealing now with the exceptions rather than the entire process.

Joan Blanch:                       Fantastic. Thank you, Roger. To finish this part, very quickly, a real story from a customer. This is a very big bank, a very, very big bank. It's one of the top five banks worldwide that is using an MFT solution from us. I want to explain this case because it's maybe the most typical case we have on MFT.

                                                 The initial situation: the bank had a home-made solution to exchange files, based on scripts and custom developments that they built. What they were doing was transferring 400,000 files daily worldwide. That's a lot of files. Remember what I was explaining to you before: you don't want to lose one single file. If you are moving this number of files daily, it's very easy to lose one.

                                                 Basically, they had no central visibility. They had errors happening often, because that's normal. Moving that amount of information, you will have errors for sure. The main problem they were having was diagnostic and resolution time: every time a file was lost, a lot of time to investigate where that file was.

                                                 Also, of course, they are moving very sensitive data, and multiple security regulations were applying to these file transfers. Basically, the most important outcome of this is how SLAs improved, how the transfers, the exchanges of information, became very solid after implementing the software, and how the incident resolution time was reduced by 80%.

                                                 And then, of course, if we look at the number of hours that have been saved by the IT operations teams, it's hundreds and hundreds of hours monthly of monitoring and fixing problems with the file transfers. Now, a very, very quick demo. This slide shows you three of the managed file transfer solutions that we have at HelpSystems in our portfolio. We are very, very strong here. We have a lot of knowledge and we have tested millions of different situations.

                                                 Today, I'm going to very quickly show you GoAnywhere, but again, my apologies, it's not a real full demo. It's just a very quick review so you can at least have a high-level view of this. I'm logging into my GoAnywhere web interface. This is the administrative portal from where we define anything that we need to configure. I will basically show you two things, again less than 1% of the features.

                                                 That's not the idea today, just to show you a little bit of an overview. The first thing I'm going to show you today is inbound services. We have a platform where we can enable different listeners, different services, that will accept incoming connections like SFTP, HTTPS and others.

                                                 When we define these services, you have these here. I have SFTP started. We can define users that will be able to connect to these services. In this case, I will be showing you a demo user for myself. We have millions of options that of course I will not explain today. Every user can be authenticated locally or integrated with LDAP, Active Directory and other sources.

                                                 We can define lots of options for how the user will be authenticated: two-factor authentication if they connect through the web interface; if connecting through SFTP, it will use a key, a password, or both. Multiple options that I'm not showing today, but very quickly, I just have an SFTP service started with my user and I will just connect to show you.

                                                 This is WinSCP. It's an SFTP client I will use to connect to my SFTP service. I'm connected to the GoAnywhere SFTP with my user. The same information, let me very quickly show you, I can access using a web interface that can be customized. I'm connecting using exactly the same username, just to show you very quickly that the same files and information that I can access using an SFTP client can be accessed through a website with a fully secure architecture that can be published to the internet securely. So, exactly the same information.

                                                 Closing this. Inbound services was the number one thing I wanted to show you. Number two, automations, the workflows. Do you remember the case of the big bank that was moving 400,000 files daily? This is how they did it. What we do here to define the automation is basically create what we call workflows. It's like a script that we build by dragging and dropping the several actions that we have in here.

                                                 There's no time today to show you everything. We can do almost anything: database queries, translating information, reading XML and JSON elements, file transfers. We support multiple protocols, and everything is very easy to build because it's drag and drop to build something like this. It's like programming, but without the problems of programming.

                                                 This example that I have in here, I will run it. Let me show you a little bit of what it will do. We have this folder in here, we have this SFTP here, and we have this workflow in here. What the workflow will do is take the files in this folder, zip them and send them using SFTP to that system, which is a very, very typical workflow for automating files. Let me just run it.

                                                 It's done... Look at this. The files have disappeared from here; they are not here anymore. They are in the sent folder. If I refresh the SFTP, you will see a compressed file with the current timestamp, a zip file that contains the files that we sent. This is an example of a very, very typical and simple workflow: taking files from a folder, zipping them, sending them by SFTP, and also sending a confirmation email if needed saying, "Hey, the files have been delivered."
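For readers who want to picture what such a workflow does under the hood, here is a rough plain-Python sketch of the same zip-and-send pattern, not GoAnywhere itself, using the standard zipfile module and the third-party paramiko SSH library; the host, credentials and folder names are placeholders.

    import zipfile
    from datetime import datetime
    from pathlib import Path

    import paramiko  # third-party SSH/SFTP library

    OUTBOX = Path("outbox")   # hypothetical folder being watched
    SENT = Path("sent")

    # 1. Zip everything currently sitting in the outbox.
    archive = Path(f"batch_{datetime.now():%Y%m%d_%H%M%S}.zip")
    files = [p for p in OUTBOX.iterdir() if p.is_file()]
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in files:
            zf.write(f, arcname=f.name)

    # 2. Upload the archive over SFTP (host and credentials are placeholders).
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect("sftp.partner.example.com", username="demo", password="change-me")
    sftp = ssh.open_sftp()
    sftp.put(str(archive), f"/inbound/{archive.name}")
    sftp.close()
    ssh.close()

    # 3. Move the originals to a sent folder, mirroring the workflow in the demo.
    SENT.mkdir(exist_ok=True)
    for f in files:
        f.rename(SENT / f.name)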

                                                 Hope it makes sense. It was a very quick demo again, but at least you have seen a little bit of that. Okay. The last piece of technology today is RPA, robotic process automation. What is robotic process automation? Basically, it's the newest of the three technologies we are showing today. It's more recent. Let's say five, six, seven years ago nobody was talking about RPA. It didn't exist the way it does today.

                                                 Now, we are at a moment where most companies, probably 70, 80% of companies, are implementing RPA or already have it implemented today. Basically, what RPA does is automate manual tasks that are normally tedious, repetitive tasks, using software robots. This is basically what Automate can do.

                                                 These bots can be either attended or unattended. Attended means they can work together with a human; maybe the process has things that require human intelligence in the middle, so you can have a bot working together with you. Or it can be unattended, which means that it runs in the background on a virtual machine, not interacting with the user, and does the background processing of whatever we need.

                                                 Normally, we will be looking for a high return on investment in RPA processes. This is a very, very critical point that I insist on a lot when I'm working with companies that are implementing RPA. An RPA project should have a high return on investment from the beginning, so probably we are talking about a project that will deliver value in days or a few weeks, not months, at least to start.

                                                 If it requires months to start delivering value, then it probably means that the approach is not the right approach for RPA, because RPA is a very, very pragmatic technology that consists of automating the processes as is, let's say, the way they are today.

                                                 Typical things that RPA bots will be able to do are opening websites, navigating, downloading files, extracting data from anywhere, from applications, from databases, from a website, interacting with multiple applications at the same time, desktop applications, moving files, copying files, creating reports, and an infinite list of other things.

                                                 Hopefully you will see that in a demo. But let's focus on IT, because RPA is a technology that is widely implemented in business, or by business departments. Most RPA projects are led by a business department that has problems to solve at the moment. We see lots and lots of cases where IT does not take advantage of having an RPA solution and says, "No, this is a business project. It's not for IT."

                                                 But there are lots and lots of tasks that we see can be automated in IT. These are some of the most relevant examples. Again, we will find hundreds and hundreds of different things that can be automated, but very typical ones could be, for instance, reporting. In every IT department in the world, probably, there is someone building a report, building an Excel, an SLA report, an incident report. So this is a very typical task that can probably be automated.

                                                 What else? For instance, application testing. We have customers using RPA to test applications. Monitoring, this one is very typical also. You can use bots, RPA robots, to open an online banking login, make four clicks and measure that everything is working fine and the response time is good. So you can use it as a monitoring tool also.
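As a rough picture of that kind of synthetic monitoring, here is a short Python sketch using the Selenium browser-automation library rather than Automate; the URL, element IDs, credentials and threshold are all placeholders.

    import time

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    THRESHOLD_SECONDS = 5.0                    # acceptable end-to-end response time
    URL = "https://bank.example.com/login"     # placeholder login page

    driver = webdriver.Chrome()
    try:
        start = time.monotonic()
        driver.get(URL)
        # Simulate the few clicks a user would make (element IDs are hypothetical).
        driver.find_element(By.ID, "username").send_keys("monitoring_user")
        driver.find_element(By.ID, "password").send_keys("********")
        driver.find_element(By.ID, "login-button").click()
        driver.find_element(By.ID, "accounts-overview")  # raises if the page never appears
        elapsed = time.monotonic() - start

        if elapsed > THRESHOLD_SECONDS:
            print(f"WARNING: login flow took {elapsed:.1f}s, above the {THRESHOLD_SECONDS}s threshold")
        else:
            print(f"OK: login flow completed in {elapsed:.1f}s")
    finally:
        driver.quit()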

                                                 Help desk and service desk, this is also very, very typical. For instance, solving some types of tickets. There will be some types of requests where the resolution will always be the same, so why not put in a robot that does this? For instance, resetting a password could be a use case where you always want to do exactly the same thing for a specific type of ticket.
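A minimal sketch of that auto-resolution pattern might look like the following; the ticketing REST endpoint, token and reset helper are entirely hypothetical, standing in for whatever help desk and directory tools are actually in place.

    import requests

    TICKET_API = "https://helpdesk.example.com/api/tickets"   # hypothetical ticketing endpoint
    HEADERS = {"Authorization": "Bearer <api-token>"}         # placeholder credential

    def reset_password(username):
        """Placeholder for whatever directory call actually resets the account."""
        print(f"resetting password for {username}")

    # Pull open tickets and auto-resolve the ones a bot can safely handle.
    tickets = requests.get(TICKET_API, params={"status": "open"}, headers=HEADERS, timeout=30).json()
    for ticket in tickets:
        if ticket.get("category") == "password_reset":
            reset_password(ticket["requester"])
            requests.post(
                f"{TICKET_API}/{ticket['id']}/close",
                json={"comment": "Password reset automatically by bot."},
                headers=HEADERS,
                timeout=30,
            )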

                                                 The use cases could be infinite. Let me very quickly, because we're a little bit short on time and there's a lot of content today, explain a customer case to you. It's Carter Bank & Trust. We have more information on our website if you are interested in knowing more about this. I will not go into lots of detail, but I want to highlight two things about it.

                                                 One is: look at the return on investment, it's huge. It's a huge return on investment. Basically, this comes from the fact that there are typically lots of processes that can be automated. That's a good thing. In most IT departments, we find lots and lots of things that can be automated. When you start automating, this happens. Look at this case: they have 75 bots in production.

                                                 What happens with a project like Carter Bank & Trust is that when you start an RPA project, on day one or in the first week, you don't expect that you will have 70, 100, 150 bots, but this is what happens with most of our customers in the end: you start with one, two, three critical processes, and in one year, when we talk again with the company, they say, "We have 100 bots today. There are lots of things we discovered could be automated that we didn't know about."

                                                 That would be, for me, one of the most important, interesting things, because this is a common pattern. We have lots and lots of customers where this situation is common. These processes include anything that you can imagine. This is one of the examples: one of the processes was migrating information. This is a very good use case for RPA. Sometimes when we want to migrate information from one application to another, of course we can develop something or we can do it manually, but lots of times an RPA bot can do it very quickly, so it's a very, very good use case also.

                                                 Now, a very quick demo. It will be one minute of demo. I will show you our RPA solution, Automate, but again a very light demo. Let me start in here. What I have defined in here is an example. This is what we call a workflow. A workflow is basically a high-level representation of a process that can have multiple steps. This example will open Amazon, get iPhone prices and build a report with that information. Then it will upload this to SharePoint and, if anything fails, open a ticket.

                                                 What I'm going to show you is, I will double click this. I already have this open. This is the interface where we develop the bots. You can see it's the source code, if you will, of the bot, but again it's very easy to use. It's very powerful, so we have lots and lots of possibilities in here. The way it works, we have actions on the left. If we want to do something like, for instance, move the mouse to a position on the screen, what I do is drag and drop, release where I want, and configure the position where I want to move the mouse.

                                                 You see this is moving. If I press insert, I will capture those coordinates. This is the way we configure it. It's pretty simple. We have 700 actions here to build scripts that do almost anything: interact with Excel, copy files, work with PDFs, run programs, open a terminal so we can connect to a mainframe or an IBM i. We can open a website in Chrome or Firefox and interact with it.

                                                 Very, very powerful. With this, we can automate almost anything you can imagine, but in this example, I will just play it. So you can see, I'm not doing anything. It's the bot executing right now. What it's supposed to do is open Amazon, search for iPhones and generate a report with the price information.

                                                 I will drink some water while the bot works for me. You see, now it's clicking next page and everything. It will extract, I think, three or four pages of Amazon, which is something that of course you can define. By the way, this is a very typical example that we have in several customers that want to get prices from competitors or from partners or distributors and have a kind of daily report automatically generated in a folder, for instance.
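As a rough picture of the last step of such a bot, the daily price report written to a spreadsheet, here is a small Python sketch using the openpyxl library; the scraped rows are hard-coded stand-ins for whatever the bot actually extracted.

    from openpyxl import Workbook

    # Stand-in for data the bot would have extracted from the product listing pages.
    scraped_prices = [
        ("iPhone model A", 799.00),
        ("iPhone model B", 999.00),
        ("iPhone model C", 1199.00),
    ]

    wb = Workbook()
    ws = wb.active
    ws.title = "Price report"
    ws.append(["Product", "Price (USD)"])          # header row
    for product, price in scraped_prices:
        ws.append([product, price])

    wb.save("price_report.xlsx")
    print("Wrote price_report.xlsx with", len(scraped_prices), "rows")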

                                                 Pretty simple. I think it's clear what it does, so I will stop it. In this example, we interacted with the browser, clicking buttons and extracting content, and also with an Excel file to write the data. Basically, the bot was writing this information, the different prices from Amazon in this case. Roger, all yours.

Roger Hammer:                Thank you. We wanted to spend a little bit of time talking about CA Plex, CA 2E and how RPA is used in those areas. It's much like everything else, so it's not really different. But first, just covering some of the different options for backend automation, and being clear that everything doesn't have to be backend automation; it doesn't have to run on its own. If you have small tasks that your users do, they can automate those also. They can just execute those tasks.

                                                 Maybe it's some kind of data transfer, or something that interacts with other steps where they need to make decisions that are not easily automatable. You can do that work also, and then, of course, there's integrating between applications. We think this is a really important area, where you can access data or interact between applications, also using APIs and other kinds of integration tools.

                                                 CA Plex, CA 2E: I'll just say here, it's not about the technology, it's about the business. Using the applications that you have in these model-driven environments and building bots that can drive different aspects of those applications can be great examples of where RPA can work.

                                                 File and email, where you may have a person receiving this information and adding it into an application: a great scenario for RPA. Integrating between enterprise apps, obviously Zendesk, Salesforce, Zoho. We use Zoho, so integrating those applications with other internal applications or processes to move data in and out of them is a great scenario.

                                                 As I mentioned, employee-assisted or fully automated are both options. Then, as I said, supporting your users with macro-like bots to provide higher accuracy and a faster process is a big value. As an integration tool, I think RPA is a great capability: basically reading and writing data either directly from a database or from the application UI, from one application to another, integrating through APIs, connecting these different systems as your users accomplish their work.

                                                 The key thing is it's really not like coding. It's drag and drop, so integrating these different projects can be very quick to do. Then of course you can create the automation for the integration and it really becomes a painless process. Fully automated, or human and bot working together, are all great scenarios, depending on your specific needs and the ROIs that you can achieve individually.

                                                 We're going to do a quick video here. We've got three scenarios of RPA running in CA 2E and CA Plex. There's a series of steps that are outlined here. This first one is a Plex C++ application. It's going to work through and read a file from Excel, then it's going to delete information from this small application. You can see the file dropped in place. It's going to recognize that file, read it in and begin the process of deleting the information from this application.

                                                When this application completes, it's also got an integration with Trello. Trello is going to get a card there stating that this step or this process is complete, so it's stepping through the deletion process, bringing up each record and deleting it in order here. Once that's complete, we'll see that Trello card pop up here in just a moment.

                                                 Also, we're seeing that... There it is. There's the Trello card up there: C++ application and the date it was created. We're going to move on now to a CA 2E application. The process is going to be similar, except in this case it's going to read the data and create new records in that application. Again, it's going to start with an Excel file that's going to be dropped into place. It could be coming from an email, it could be an FTP file; there are a number of ways that file could get in place. But now it's going to kick off the application, running in a 5250 window here, and interact with this application just as a user would.

                                                You see those entries are going in from the RPA tool very quickly and providing those inputs for the application, entering the users from that, that Excel file, getting them input into the application. Again, it's going to finish by creating a Trello card. There we go. We've got a new CA2E card there.

                                                 Our last step here is going to use CA Plex. This is going to be a web application. It's again going to be similar, just showing all the different scenarios that you may have in your environment with model-driven development. Again, we're going to read that file and, in a web application this time, run the same series of steps, creating new users in that particular application. It's going to bring up the web, start up the browser interface.

                                                It's going to log on of course and get us into the application. It's going to step through a series of menus, getting us into the location of the application that we want to work in. You will see me actually pick this window up and move it because it's hiding that Trello area. Key point here is that it's not getting lost just because that window moved. It knows where it's working within that window and keeping focus there to make sure the application is executing correctly.

                                                 Weaving our data in again... taking a moment to enter the data. This was running in debug mode when I recorded it, so it's a little slower than you would see otherwise. That way you can actually understand what's happening, and we'll run through that scenario a couple more times, entering users. Then we'll finish again the same way. It's going to create a Trello card through the API integration that's provided in Automate. Really nice functionality to interact with other applications that have APIs.

                                                 The last one. It's going to enter the data for that person, get it added in, and then we'll see that Trello card pop up, and then we'll wrap up our demonstration. There we go, and there's our third new person's information in the Trello card. Hopefully that gives you a good idea of how working with CA 2E and CA Plex is really no different than any other application. We just want to show that RPA works great there too.
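For anyone curious about that last integration step, here is a rough Python sketch of creating a card through Trello's public REST API with the requests library; the API key, token and list ID are placeholders you would generate for your own board, and this is only an illustration, not the Automate configuration used in the video.

    from datetime import date

    import requests

    # Placeholders: generate a key/token in your Trello account and use the ID of your target list.
    TRELLO_KEY = "<your-api-key>"
    TRELLO_TOKEN = "<your-api-token>"
    LIST_ID = "<target-list-id>"

    response = requests.post(
        "https://api.trello.com/1/cards",
        params={
            "key": TRELLO_KEY,
            "token": TRELLO_TOKEN,
            "idList": LIST_ID,
            "name": f"CA 2E load complete {date.today():%Y-%m-%d}",
            "desc": "All records from the inbound Excel file were processed by the bot.",
        },
        timeout=30,
    )
    response.raise_for_status()
    print("Created Trello card with id:", response.json()["id"])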

Joan Blanch:                       Brilliant. Thanks. Last poll. We are finishing already. We will launch the last poll for you all. After you have seen these technologies, maybe some of them you already knew, maybe some were new for you. Now, in your opinion, which do you think would be better for your company? Which of these technologies do you think your company would get the most benefit from implementing?

                                                 Thank you. I see you are voting. I will close the poll. Thank you. Now, I'll share the poll results again. Thank you very much for your responses. Let's look at them a little bit; most of you should be seeing them now. Number one, robotic process automation, is what I expected, let's say. I was right on this one because it's the newest. Workload automation has probably been more years on the market and most medium-sized and big companies already have one, maybe, but RPA is something still expanding and has millions of possible use cases where it can help.

                                                 Thanks again for participating in the poll. We are finishing with a very quick recap of what we have seen today. We have seen three technologies. Workload automation: remember, the main problem we want to address with a scheduler is managing large volumes of multi-platform jobs. Then we have seen RPA, robotic process automation. There are lots of use cases for RPA, lots of use cases, but the main one, in the case of IT, could be to free the IT staff from doing manual tasks that can be automated.

                                                 The thing is that now we can automate lots of things that we were not able to automate 10 years ago. This is something quite new. Then we have seen MFT, managed file transfer. Remember again, the main use case for this is when we need to secure and automate critical information exchange and we cannot lose one file. Those would be the main requirements for managed file transfer.

                                                 Of course, today's webinar has been very general, high level, just to give you an overview. I think we shared lots of information with you today. Hope it's not too much. Probably you need some minutes now to relax a little bit. This slide shows the typical overlap and unique features between these technologies, because there are lots of processes that can be automated with one, with another, or with both.

                                                 Maybe this detail is outside of today's scope, of course, but maybe the message of this slide is also that we are here, Roger and ourselves, to help you if you think having a conversation with automation experts can be of any use for you. We are here. Feel free to reach out to us. We will be happy to engage with you. We will have one or two minutes to read some questions very quickly. We want to finish on time. Let's see how many questions we can respond to.

                                                 Look, there is a good one: there is a very thin line between RPA and workload automation; can RPA replace a workload automation tool? This is a very good question, but I would need to go back to this slide. Yes, in some cases. Not always, I would say, in my personal experience.

                                                 I think RPA is very good for some use cases, which are basically automating front-end applications and also automating back-end processes. But when we are talking about automating thousands of jobs with dependencies, with multiple calendars, multiple countries, multiple logics, multiple platforms...

                                                 ...definitely a workload automation solution is a benefit to manage those specific requirements, because of features like forecasting and performance analysis of executions, things like that which are very, very useful to manage this job volume and its dependencies. But it's a very, very good question.

                                                 Another question: can we run different processes on the same server at the same time? I assume this question is about RPA, but in any case, the response is yes for all three of them, including RPA and MFT. In RPA, the answer is yes. Let's say the limitation would be that on one single machine from which we are launching executions, we can only run one execution at a time that requires interactivity.

                                                 Let's say if we need to move the mouse and make a click, in that case we will need sequential execution on that computer, but we can add more bots, more agents, to run executions in parallel with front-end tools at the same time. Not sure if I answered your question, but yes, we can manage this and we can have multiple concurrent executions at the same time, but maybe we need multiple desktops to do that.

                                                 I think we are on time. There are still some questions that I don't think we have time to reply to today. I just wanted to thank you for joining today. Hope it was useful. Hope you found at least something useful in all this content that we shared with you today. Thank you very much for joining us. We are here if we can help you in any way; feel free to reach out. We will be happy to. Have a nice day. I'll see you soon.

Roger Hammer:                Thank you. Have a great day everyone.

Joan Blanch:                       Bye Roger. Thanks for joining.

Roger Hammer:                All righty, bye-bye.

Joan Blanch:                       Bye-bye.